url: string
repository_url: string
labels_url: string
comments_url: string
events_url: string
html_url: string
id: int64
node_id: string
number: int64
title: string
user: dict
labels: list
state: string
locked: bool
assignee: dict
assignees: list
milestone: null
comments: list
created_at: timestamp[ms]
updated_at: timestamp[ms]
closed_at: timestamp[ms]
author_association: string
type: dict
active_lock_reason: null
draft: bool
pull_request: dict
body: string
closed_by: dict
reactions: dict
timeline_url: string
performed_via_github_app: null
state_reason: string
sub_issues_summary: dict
issue_dependencies_summary: dict
is_pull_request: bool
is_closed: bool
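A minimal sketch of the column schema above rendered as a plain Python mapping (type names kept as the Arrow-style strings from the listing; only representative columns are included):

```python
# Hedged sketch: part of the issue schema above as a Python dict.  Type names
# are the Arrow-style strings from the listing ("timestamp[ms]", "int64", ...).
ISSUE_SCHEMA = {
    "url": "string",
    "id": "int64",
    "number": "int64",
    "locked": "bool",
    "created_at": "timestamp[ms]",
    "updated_at": "timestamp[ms]",
    "closed_at": "timestamp[ms]",
    "labels": "list",
    "user": "dict",
    "is_pull_request": "bool",
}

def temporal_columns(schema):
    # timestamp[...] type strings mark the date columns
    return [name for name, t in schema.items() if t.startswith("timestamp")]
```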
https://api.github.com/repos/huggingface/transformers/issues/37212
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37212/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37212/comments
https://api.github.com/repos/huggingface/transformers/issues/37212/events
https://github.com/huggingface/transformers/pull/37212
2,966,869,594
PR_kwDOCUB6oc6RHZ1O
37,212
Fix small bug in PaliGemma demo docs
{ "login": "EricCousineau-TRI", "id": 26719449, "node_id": "MDQ6VXNlcjI2NzE5NDQ5", "avatar_url": "https://avatars.githubusercontent.com/u/26719449?v=4", "gravatar_id": "", "url": "https://api.github.com/users/EricCousineau-TRI", "html_url": "https://github.com/EricCousineau-TRI", "followers_url": "https://api.github.com/users/EricCousineau-TRI/followers", "following_url": "https://api.github.com/users/EricCousineau-TRI/following{/other_user}", "gists_url": "https://api.github.com/users/EricCousineau-TRI/gists{/gist_id}", "starred_url": "https://api.github.com/users/EricCousineau-TRI/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/EricCousineau-TRI/subscriptions", "organizations_url": "https://api.github.com/users/EricCousineau-TRI/orgs", "repos_url": "https://api.github.com/users/EricCousineau-TRI/repos", "events_url": "https://api.github.com/users/EricCousineau-TRI/events{/privacy}", "received_events_url": "https://api.github.com/users/EricCousineau-TRI/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T15:52:02
2025-04-02T16:24:44
2025-04-02T16:24:44
NONE
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37212", "html_url": "https://github.com/huggingface/transformers/pull/37212", "diff_url": "https://github.com/huggingface/transformers/pull/37212.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37212.patch", "merged_at": null }
# What does this PR do? Fixes bug in Paligemma usage example Fixes #37181 ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. \cc @zucchini-nlp
{ "login": "EricCousineau-TRI", "id": 26719449, "node_id": "MDQ6VXNlcjI2NzE5NDQ5", "avatar_url": "https://avatars.githubusercontent.com/u/26719449?v=4", "gravatar_id": "", "url": "https://api.github.com/users/EricCousineau-TRI", "html_url": "https://github.com/EricCousineau-TRI", "followers_url": "https://api.github.com/users/EricCousineau-TRI/followers", "following_url": "https://api.github.com/users/EricCousineau-TRI/following{/other_user}", "gists_url": "https://api.github.com/users/EricCousineau-TRI/gists{/gist_id}", "starred_url": "https://api.github.com/users/EricCousineau-TRI/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/EricCousineau-TRI/subscriptions", "organizations_url": "https://api.github.com/users/EricCousineau-TRI/orgs", "repos_url": "https://api.github.com/users/EricCousineau-TRI/repos", "events_url": "https://api.github.com/users/EricCousineau-TRI/events{/privacy}", "received_events_url": "https://api.github.com/users/EricCousineau-TRI/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37212/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37212/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37211
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37211/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37211/comments
https://api.github.com/repos/huggingface/transformers/issues/37211/events
https://github.com/huggingface/transformers/pull/37211
2,966,761,912
PR_kwDOCUB6oc6RHCBg
37,211
Allow flexible generation params arg when checking pipeline specs
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T15:21:43
2025-04-03T12:29:38
2025-04-03T12:29:36
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37211", "html_url": "https://github.com/huggingface/transformers/pull/37211", "diff_url": "https://github.com/huggingface/transformers/pull/37211.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37211.patch", "merged_at": "2025-04-03T12:29:36" }
We currently check the input and output signatures of our Pipeline classes against the specs in `huggingface_hub`, but this makes it hard for the Hub to change without breaking our CI. This PR allows them to change their spec from `generate_kwargs` to `generation_parameters` without causing failures at our end.
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37211/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37211/timeline
null
null
null
null
true
true
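The rename tolerance described in this PR body can be sketched as a simple alias check when validating a pipeline signature against a Hub spec (hypothetical helper and table, not the actual transformers/`huggingface_hub` code):

```python
# Hypothetical sketch: treat the old and new names for the
# generation-parameters field as interchangeable, so a Hub-side rename from
# `generate_kwargs` to `generation_parameters` does not fail the CI check.
SPEC_ALIASES = {
    "generate_kwargs": {"generate_kwargs", "generation_parameters"},
    "generation_parameters": {"generate_kwargs", "generation_parameters"},
}

def spec_param_satisfied(spec_name, pipeline_params):
    # A spec entry is satisfied if the pipeline accepts the name itself
    # or any registered alias of it.
    accepted = SPEC_ALIASES.get(spec_name, {spec_name})
    return any(p in accepted for p in pipeline_params)
```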
https://api.github.com/repos/huggingface/transformers/issues/37210
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37210/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37210/comments
https://api.github.com/repos/huggingface/transformers/issues/37210/events
https://github.com/huggingface/transformers/pull/37210
2,966,702,303
PR_kwDOCUB6oc6RG0hg
37,210
add fast image processor for pix2struct
{ "login": "zhouksh", "id": 3754366, "node_id": "MDQ6VXNlcjM3NTQzNjY=", "avatar_url": "https://avatars.githubusercontent.com/u/3754366?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zhouksh", "html_url": "https://github.com/zhouksh", "followers_url": "https://api.github.com/users/zhouksh/followers", "following_url": "https://api.github.com/users/zhouksh/following{/other_user}", "gists_url": "https://api.github.com/users/zhouksh/gists{/gist_id}", "starred_url": "https://api.github.com/users/zhouksh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zhouksh/subscriptions", "organizations_url": "https://api.github.com/users/zhouksh/orgs", "repos_url": "https://api.github.com/users/zhouksh/repos", "events_url": "https://api.github.com/users/zhouksh/events{/privacy}", "received_events_url": "https://api.github.com/users/zhouksh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-04-02T15:06:23
2025-04-22T14:36:17
null
NONE
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37210", "html_url": "https://github.com/huggingface/transformers/pull/37210", "diff_url": "https://github.com/huggingface/transformers/pull/37210.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37210.patch", "merged_at": null }
# What does this PR do? add fast image processor for pix2struct Fixes #36978 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37210/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37210/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/37209
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37209/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37209/comments
https://api.github.com/repos/huggingface/transformers/issues/37209/events
https://github.com/huggingface/transformers/pull/37209
2,966,616,732
PR_kwDOCUB6oc6RGhHH
37,209
Stop DOSing the Hub in the CI
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T14:44:11
2025-04-02T16:19:35
2025-04-02T16:19:34
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37209", "html_url": "https://github.com/huggingface/transformers/pull/37209", "diff_url": "https://github.com/huggingface/transformers/pull/37209.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37209.patch", "merged_at": "2025-04-02T16:19:34" }
Several of our test suites use `setUp()` methods, but these are called **once per test**. This repeatedly loads the same files from the Hub when the test suite has lots of small tests, which increases the risk of a single connection failure giving us a red CI and also seems very likely to trigger DOS protection from the Hub. I'm experimenting with refactoring some tests so that Hub operations are only called once for the whole test class.
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37209/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37209/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37208
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37208/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37208/comments
https://api.github.com/repos/huggingface/transformers/issues/37208/events
https://github.com/huggingface/transformers/pull/37208
2,966,436,024
PR_kwDOCUB6oc6RF54J
37,208
fix gemma3 grad acc
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T13:41:27
2025-06-25T14:28:47
2025-06-25T14:28:45
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37208", "html_url": "https://github.com/huggingface/transformers/pull/37208", "diff_url": "https://github.com/huggingface/transformers/pull/37208.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37208.patch", "merged_at": "2025-06-25T14:28:45" }
# What does this PR do? This PR fixes the grad acc issue with the gemma3 model. The issue was that we passed **kwargs in the model forward, so we were assuming it was passing `**loss_kwargs` -> `num_items_in_batch` to calculate the loss. Not sure what the best way to fix this is @ArthurZucker, as this will probably happen again. Maybe set `accepts_loss_kwargs` to `False` in general and set it to `True` for models that we fixed? I'm also fine with just setting it to `False` for models that don't use the `kwargs` for the loss. As for why I didn't use the loss function: in the code, they are filtering the logits/labels, so I decided to simply not use `num_items_in_batch` to calculate the loss. Otherwise, the loss wouldn't be calculated correctly in one of the cases. I also fixed a peft-related issue: we couldn't access that attribute because the model was a peft model. #### To reproduce winglian's script https://gist.github.com/winglian/569924fe154824c8ce148f6e185cd4cd ### After fix grad acc 2 bs 1 and grad acc 1 bs 2 <img width="548" alt="Screenshot 2025-04-02 at 4 38 54 PM" src="https://github.com/user-attachments/assets/8d8dc273-4851-4c37-a33c-933ec54c47d1" /> Fixes #37197 cc @winglian
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37208/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 2, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37208/timeline
null
null
null
null
true
true
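The `num_items_in_batch` point in this PR body can be illustrated with a small numeric sketch (made-up numbers): averaging each micro-batch by its local token count and then averaging across accumulation steps is not the same as dividing the total loss by the global token count when micro-batches are uneven.

```python
# Hedged numeric sketch of the gradient-accumulation loss issue.
token_loss_sums = [6.0, 4.0]  # summed token losses per micro-batch
token_counts = [3, 1]         # tokens per micro-batch (uneven on purpose)

# Per-micro-batch mean, then averaged over accumulation steps:
per_step_mean = sum(s / c for s, c in zip(token_loss_sums, token_counts)) / len(token_loss_sums)

# Global mean using num_items_in_batch (total tokens across all micro-batches):
num_items_in_batch = sum(token_counts)
global_mean = sum(token_loss_sums) / num_items_in_batch

# per_step_mean is 3.0 but global_mean is 2.5: the two disagree, which is why
# the model forward needs the **loss_kwargs carrying num_items_in_batch.
```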
https://api.github.com/repos/huggingface/transformers/issues/37207
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37207/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37207/comments
https://api.github.com/repos/huggingface/transformers/issues/37207/events
https://github.com/huggingface/transformers/pull/37207
2,966,337,596
PR_kwDOCUB6oc6RFkfJ
37,207
update error msg
{ "login": "itazap", "id": 31893021, "node_id": "MDQ6VXNlcjMxODkzMDIx", "avatar_url": "https://avatars.githubusercontent.com/u/31893021?v=4", "gravatar_id": "", "url": "https://api.github.com/users/itazap", "html_url": "https://github.com/itazap", "followers_url": "https://api.github.com/users/itazap/followers", "following_url": "https://api.github.com/users/itazap/following{/other_user}", "gists_url": "https://api.github.com/users/itazap/gists{/gist_id}", "starred_url": "https://api.github.com/users/itazap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/itazap/subscriptions", "organizations_url": "https://api.github.com/users/itazap/orgs", "repos_url": "https://api.github.com/users/itazap/repos", "events_url": "https://api.github.com/users/itazap/events{/privacy}", "received_events_url": "https://api.github.com/users/itazap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T13:05:37
2025-04-04T08:21:32
2025-04-04T08:21:30
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37207", "html_url": "https://github.com/huggingface/transformers/pull/37207", "diff_url": "https://github.com/huggingface/transformers/pull/37207.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37207.patch", "merged_at": "2025-04-04T08:21:30" }
change error msg #36291
{ "login": "itazap", "id": 31893021, "node_id": "MDQ6VXNlcjMxODkzMDIx", "avatar_url": "https://avatars.githubusercontent.com/u/31893021?v=4", "gravatar_id": "", "url": "https://api.github.com/users/itazap", "html_url": "https://github.com/itazap", "followers_url": "https://api.github.com/users/itazap/followers", "following_url": "https://api.github.com/users/itazap/following{/other_user}", "gists_url": "https://api.github.com/users/itazap/gists{/gist_id}", "starred_url": "https://api.github.com/users/itazap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/itazap/subscriptions", "organizations_url": "https://api.github.com/users/itazap/orgs", "repos_url": "https://api.github.com/users/itazap/repos", "events_url": "https://api.github.com/users/itazap/events{/privacy}", "received_events_url": "https://api.github.com/users/itazap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37207/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37207/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37206
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37206/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37206/comments
https://api.github.com/repos/huggingface/transformers/issues/37206/events
https://github.com/huggingface/transformers/pull/37206
2,966,291,494
PR_kwDOCUB6oc6RFasj
37,206
Fix `max_length_q` and `max_length_k` types to `flash_attn_varlen_func`
{ "login": "HollowMan6", "id": 43995067, "node_id": "MDQ6VXNlcjQzOTk1MDY3", "avatar_url": "https://avatars.githubusercontent.com/u/43995067?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HollowMan6", "html_url": "https://github.com/HollowMan6", "followers_url": "https://api.github.com/users/HollowMan6/followers", "following_url": "https://api.github.com/users/HollowMan6/following{/other_user}", "gists_url": "https://api.github.com/users/HollowMan6/gists{/gist_id}", "starred_url": "https://api.github.com/users/HollowMan6/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/HollowMan6/subscriptions", "organizations_url": "https://api.github.com/users/HollowMan6/orgs", "repos_url": "https://api.github.com/users/HollowMan6/repos", "events_url": "https://api.github.com/users/HollowMan6/events{/privacy}", "received_events_url": "https://api.github.com/users/HollowMan6/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T12:49:27
2025-07-09T21:12:39
2025-07-09T21:12:39
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37206", "html_url": "https://github.com/huggingface/transformers/pull/37206", "diff_url": "https://github.com/huggingface/transformers/pull/37206.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37206.patch", "merged_at": "2025-07-09T21:12:39" }
# What does this PR do? Fix `max_length_q` and `max_length_k` types to `flash_attn_varlen_func`. Also add notes asking users to set `TORCHDYNAMO_CAPTURE_SCALAR_OUTPUTS=1` or call `torch._dynamo.config.capture_scalar_outputs = True`, as currently this will cause a graph break. Fixes #35588 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [X] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. @ArthurZucker
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37206/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/huggingface/transformers/issues/37206/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37205
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37205/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37205/comments
https://api.github.com/repos/huggingface/transformers/issues/37205/events
https://github.com/huggingface/transformers/pull/37205
2,966,276,904
PR_kwDOCUB6oc6RFXfc
37,205
Skip flaky test counting calls to Hub
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T12:44:04
2025-04-02T13:22:30
2025-04-02T13:22:30
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37205", "html_url": "https://github.com/huggingface/transformers/pull/37205", "diff_url": "https://github.com/huggingface/transformers/pull/37205.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37205.patch", "merged_at": null }
This test works locally for me but seems very flaky on the CI. I'm going to disable it for now until we can properly investigate why! Also going to remove the TF version entirely.
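A minimal sketch of how a flaky test is typically disabled in a Python test suite, as described above (the class, test, and variable names here are hypothetical illustrations, not the actual transformers test):

```python
import unittest

class HubCallCountTest(unittest.TestCase):
    # Skipped at collection time: the body never runs, so the flaky
    # assertion below cannot fail on CI.
    @unittest.skip("Flaky on CI: calls to the Hub vary between runs")
    def test_counts_hub_calls(self):
        calls_to_hub = 1  # hypothetical stand-in for the real measurement
        self.assertEqual(calls_to_hub, 1)

# Running the suite reports the test as skipped rather than passed or failed.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(HubCallCountTest)
result = unittest.TestResult()
suite.run(result)
print(len(result.skipped))  # 1
```

The skip reason string shows up in the test report, which makes it easy to find and re-enable the test once the flakiness is investigated.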
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37205/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37205/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37204
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37204/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37204/comments
https://api.github.com/repos/huggingface/transformers/issues/37204/events
https://github.com/huggingface/transformers/pull/37204
2,966,124,102
PR_kwDOCUB6oc6RE1vN
37,204
Add Fast PVT Processor
{ "login": "keetrap", "id": 103131112, "node_id": "U_kgDOBiWn6A", "avatar_url": "https://avatars.githubusercontent.com/u/103131112?v=4", "gravatar_id": "", "url": "https://api.github.com/users/keetrap", "html_url": "https://github.com/keetrap", "followers_url": "https://api.github.com/users/keetrap/followers", "following_url": "https://api.github.com/users/keetrap/following{/other_user}", "gists_url": "https://api.github.com/users/keetrap/gists{/gist_id}", "starred_url": "https://api.github.com/users/keetrap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/keetrap/subscriptions", "organizations_url": "https://api.github.com/users/keetrap/orgs", "repos_url": "https://api.github.com/users/keetrap/repos", "events_url": "https://api.github.com/users/keetrap/events{/privacy}", "received_events_url": "https://api.github.com/users/keetrap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T11:47:05
2025-04-23T19:55:21
2025-04-23T19:55:20
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37204", "html_url": "https://github.com/huggingface/transformers/pull/37204", "diff_url": "https://github.com/huggingface/transformers/pull/37204.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37204.patch", "merged_at": "2025-04-23T19:55:20" }
Related #36978 cc @yonigozlan
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37204/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37204/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37203
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37203/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37203/comments
https://api.github.com/repos/huggingface/transformers/issues/37203/events
https://github.com/huggingface/transformers/pull/37203
2,966,010,999
PR_kwDOCUB6oc6REc3Y
37,203
Add Fast Image Processor for LayoutLMv2
{ "login": "rootonchair", "id": 23548268, "node_id": "MDQ6VXNlcjIzNTQ4MjY4", "avatar_url": "https://avatars.githubusercontent.com/u/23548268?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rootonchair", "html_url": "https://github.com/rootonchair", "followers_url": "https://api.github.com/users/rootonchair/followers", "following_url": "https://api.github.com/users/rootonchair/following{/other_user}", "gists_url": "https://api.github.com/users/rootonchair/gists{/gist_id}", "starred_url": "https://api.github.com/users/rootonchair/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rootonchair/subscriptions", "organizations_url": "https://api.github.com/users/rootonchair/orgs", "repos_url": "https://api.github.com/users/rootonchair/repos", "events_url": "https://api.github.com/users/rootonchair/events{/privacy}", "received_events_url": "https://api.github.com/users/rootonchair/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T10:57:07
2025-04-15T18:04:53
2025-04-14T13:06:41
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37203", "html_url": "https://github.com/huggingface/transformers/pull/37203", "diff_url": "https://github.com/huggingface/transformers/pull/37203.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37203.patch", "merged_at": "2025-04-14T13:06:41" }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Related #36978 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface and @SunMarc - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37203/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37203/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37202
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37202/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37202/comments
https://api.github.com/repos/huggingface/transformers/issues/37202/events
https://github.com/huggingface/transformers/pull/37202
2,965,865,979
PR_kwDOCUB6oc6RD9OQ
37,202
Try to avoid/reduce some remaining CI job failures
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T09:56:12
2025-04-02T12:39:59
2025-04-02T12:39:57
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37202", "html_url": "https://github.com/huggingface/transformers/pull/37202", "diff_url": "https://github.com/huggingface/transformers/pull/37202.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37202.patch", "merged_at": "2025-04-02T12:39:57" }
# What does this PR do? Similar to #37170, another try to avoid/reduce some remaining CI job failures
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37202/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37202/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37201
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37201/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37201/comments
https://api.github.com/repos/huggingface/transformers/issues/37201/events
https://github.com/huggingface/transformers/pull/37201
2,965,755,194
PR_kwDOCUB6oc6RDk_a
37,201
Add Fast Image Processor for LayoutLMv3
{ "login": "rootonchair", "id": 23548268, "node_id": "MDQ6VXNlcjIzNTQ4MjY4", "avatar_url": "https://avatars.githubusercontent.com/u/23548268?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rootonchair", "html_url": "https://github.com/rootonchair", "followers_url": "https://api.github.com/users/rootonchair/followers", "following_url": "https://api.github.com/users/rootonchair/following{/other_user}", "gists_url": "https://api.github.com/users/rootonchair/gists{/gist_id}", "starred_url": "https://api.github.com/users/rootonchair/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rootonchair/subscriptions", "organizations_url": "https://api.github.com/users/rootonchair/orgs", "repos_url": "https://api.github.com/users/rootonchair/repos", "events_url": "https://api.github.com/users/rootonchair/events{/privacy}", "received_events_url": "https://api.github.com/users/rootonchair/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T09:15:55
2025-04-15T18:04:19
2025-04-14T13:42:12
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37201", "html_url": "https://github.com/huggingface/transformers/pull/37201", "diff_url": "https://github.com/huggingface/transformers/pull/37201.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37201.patch", "merged_at": "2025-04-14T13:42:12" }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Related #36978 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface and @SunMarc - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37201/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37201/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37199
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37199/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37199/comments
https://api.github.com/repos/huggingface/transformers/issues/37199/events
https://github.com/huggingface/transformers/issues/37199
2,965,516,901
I_kwDOCUB6oc6wwjJl
37,199
torch.compile graph break when tuning llama with FA2
{ "login": "SilverSoldier", "id": 13083399, "node_id": "MDQ6VXNlcjEzMDgzMzk5", "avatar_url": "https://avatars.githubusercontent.com/u/13083399?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SilverSoldier", "html_url": "https://github.com/SilverSoldier", "followers_url": "https://api.github.com/users/SilverSoldier/followers", "following_url": "https://api.github.com/users/SilverSoldier/following{/other_user}", "gists_url": "https://api.github.com/users/SilverSoldier/gists{/gist_id}", "starred_url": "https://api.github.com/users/SilverSoldier/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SilverSoldier/subscriptions", "organizations_url": "https://api.github.com/users/SilverSoldier/orgs", "repos_url": "https://api.github.com/users/SilverSoldier/repos", "events_url": "https://api.github.com/users/SilverSoldier/events{/privacy}", "received_events_url": "https://api.github.com/users/SilverSoldier/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-04-02T08:01:37
2025-05-11T08:03:04
2025-05-11T08:03:04
CONTRIBUTOR
null
null
null
null
### System Info - `transformers` version: 4.50.3 - Platform: Linux-5.14.0-284.73.1.el9_2.x86_64-x86_64-with-glibc2.31 - Python version: 3.12.9 - Huggingface_hub version: 0.29.3 - Safetensors version: 0.5.3 - Accelerate version: 1.0.1 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (GPU?): 2.6.0+cu124 (True) - Tensorflow version (GPU?): not installed (NA) - Flax version (CPU?/GPU?/TPU?): not installed (NA) - Jax version: not installed - JaxLib version: not installed - Using distributed or parallel set-up in script?: No - Using GPU in script?: Yes - GPU type: NVIDIA A100-SXM4-80GB ### Who can help? I am getting 2 graph breaks when tuning the llama3-8b model with torch.compile. Both are failing due to Dynamic Control Flow issues with the following error message: ``` Dynamic control flow is not supported at the moment. Please use functorch.experimental.control_flow.cond to explicitly capture the control flow. ``` The thing is, both of these lines were actually last touched by PRs to remove graph breaks, so clearly something changed in torch.compile that is breaking them again. 1. `if attention_mask is not None and (attention_mask == 0.0).any()` in [modeling_llama](https://github.com/huggingface/transformers/blob/800510c67bfc5cedd0bb7635648a07f39719be43/src/transformers/models/llama/modeling_llama.py#L637) This was fixed just 4 months ago in [this PR](https://github.com/huggingface/transformers/pull/35187) 2. `max_length_q is not None or (query_length != 1 and not (torch.diff(position_ids, dim=-1) >= 0).all())` [in modeling_flash_attention_utils](https://github.com/huggingface/transformers/blame/800510c67bfc5cedd0bb7635648a07f39719be43/src/transformers/modeling_flash_attention_utils.py#L378) This was fixed as part of [this PR](https://github.com/huggingface/transformers/pull/33932) 6 months ago. The error message suggests using `torch.cond`, but it is not trivial to apply in either case.
I attempted to fix #1, but the return values of the if and else branches are None and attention_mask respectively, while `cond` expects the outputs of the two branches to be of the same type and shape. For #2 it is the `torch.diff` clause that is causing the problem. I haven't looked closely enough, but the multiple condition checks might make a clean separation into `cond` syntax slightly difficult. ### Information - [x] The official example scripts - [ ] My own modified scripts ### Tasks - [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction ``` accelerate launch --num_processes=1 -m tuning.sft_trainer --output_dir ./train_output --max_steps 5 --learning_rate 2e-5 --training_data_path=/data/data/ei_5.jsonl --save_steps=50 --torch_dtype bfloat16 --logging_strategy steps --logging_steps 1 --per_device_train_batch_size 8 --max_seq_length 1024 --include_tokens_per_second true --data_formatter_template "### Input: {input} \n\n### Response: {output}" --response_template "\n### Response:" --torch_compile True --model_name_or_path /data/models/llama3-8b --use_flash_attn True --packing ``` Checked with pytorch 2.5 and 2.6, with and without padding_free, with no change in graph breaks. ### Expected behavior Expected no graph breaks. Curious if anyone knows why this is popping up now and how to fix it: is it only through `cond`, or is there some other way? (I am willing to fix it, but I am not sure how.) Thanks!
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers", "following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}", "gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}", "starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions", "organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs", "repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos", "events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}", "received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events", "type": "Bot", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37199/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37199/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37198
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37198/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37198/comments
https://api.github.com/repos/huggingface/transformers/issues/37198/events
https://github.com/huggingface/transformers/pull/37198
2,965,500,122
PR_kwDOCUB6oc6RCtW_
37,198
enable 2 types of case on XPU
{ "login": "yao-matrix", "id": 7245027, "node_id": "MDQ6VXNlcjcyNDUwMjc=", "avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yao-matrix", "html_url": "https://github.com/yao-matrix", "followers_url": "https://api.github.com/users/yao-matrix/followers", "following_url": "https://api.github.com/users/yao-matrix/following{/other_user}", "gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}", "starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions", "organizations_url": "https://api.github.com/users/yao-matrix/orgs", "repos_url": "https://api.github.com/users/yao-matrix/repos", "events_url": "https://api.github.com/users/yao-matrix/events{/privacy}", "received_events_url": "https://api.github.com/users/yao-matrix/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T07:55:43
2025-04-07T03:44:48
2025-04-03T09:37:55
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37198", "html_url": "https://github.com/huggingface/transformers/pull/37198", "diff_url": "https://github.com/huggingface/transformers/pull/37198.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37198.patch", "merged_at": "2025-04-03T09:37:55" }
Enable two test cases on XPU: 1. `test_resize_tokens_embeddings_with_deepspeed_multi_gpu` 2. `test_resize_embeddings_untied_with_deepspeed_multi_gpu`
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37198/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37198/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37197
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37197/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37197/comments
https://api.github.com/repos/huggingface/transformers/issues/37197/events
https://github.com/huggingface/transformers/issues/37197
2,965,338,134
I_kwDOCUB6oc6wv3gW
37,197
Gemma3 Gradient Accumulation loss
{ "login": "winglian", "id": 381258, "node_id": "MDQ6VXNlcjM4MTI1OA==", "avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4", "gravatar_id": "", "url": "https://api.github.com/users/winglian", "html_url": "https://github.com/winglian", "followers_url": "https://api.github.com/users/winglian/followers", "following_url": "https://api.github.com/users/winglian/following{/other_user}", "gists_url": "https://api.github.com/users/winglian/gists{/gist_id}", "starred_url": "https://api.github.com/users/winglian/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/winglian/subscriptions", "organizations_url": "https://api.github.com/users/winglian/orgs", "repos_url": "https://api.github.com/users/winglian/repos", "events_url": "https://api.github.com/users/winglian/events{/privacy}", "received_events_url": "https://api.github.com/users/winglian/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-04-02T06:39:58
2025-05-11T08:03:06
2025-05-11T08:03:06
CONTRIBUTOR
null
null
null
null
### System Info transformers==4.50.3 ### Who can help? @osanseviero @ArthurZucker ### Information - [ ] The official example scripts - [x] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction repro script with SFTTrainer here: https://gist.github.com/winglian/569924fe154824c8ce148f6e185cd4cd When using `gradient_accumulation_steps=1`, the magnitude of the loss for the Gemma3 family of models is approximately correct (~1 or less, depending on the model size). When changing `gradient_accumulation_steps=4`, the loss is scaled up by a factor of the `gradient_accumulation_steps` and in this case is ~8-10. Digging into the [`Gemma3TextModel.forward` method](https://github.com/huggingface/transformers/blob/800510c67bfc5cedd0bb7635648a07f39719be43/src/transformers/models/gemma3/modular_gemma3.py#L599), we see that it doesn't accept the typical `LossKwargs` to handle gradient accumulation. Compare the signature to [llama](https://github.com/huggingface/transformers/blob/800510c67bfc5cedd0bb7635648a07f39719be43/src/transformers/models/llama/modeling_llama.py#L809) where it's defined as ```python class KwargsForCausalLM(FlashAttentionKwargs, LossKwargs): ... ``` ### Expected behavior Loss should be ~1.0 regardless of gradient accumulation steps.
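A minimal sketch, independent of transformers, of the scaling the issue describes: summing each microbatch's own mean loss inflates the reported loss by roughly the number of accumulation steps, while dividing every token loss by the global token count (the normalization that `LossKwargs` / `num_items_in_batch` enables) keeps it scale-invariant. The numbers here are illustrative, not from the gist.

```python
# Two microbatches of per-token losses (illustrative values).
token_losses = [[1.0, 1.2, 0.8, 1.0], [0.9, 1.1, 1.0, 1.0]]

# Without the kwargs: each microbatch is averaged on its own, then the
# per-step averages accumulate, so the total grows with the step count.
naive = sum(sum(mb) / len(mb) for mb in token_losses)

# With the kwargs: every token loss is divided by the global token count,
# so the result is the same no matter how the batch is split up.
total_tokens = sum(len(mb) for mb in token_losses)
normalized = sum(sum(mb) for mb in token_losses) / total_tokens

print(naive)       # ~2.0 with two accumulation steps
print(normalized)  # ~1.0 regardless of accumulation steps
```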
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers", "following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}", "gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}", "starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions", "organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs", "repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos", "events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}", "received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events", "type": "Bot", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37197/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37197/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37196
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37196/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37196/comments
https://api.github.com/repos/huggingface/transformers/issues/37196/events
https://github.com/huggingface/transformers/pull/37196
2,965,270,089
PR_kwDOCUB6oc6RB9Ms
37,196
Added separate limits for saving full checkpoints and model weights
{ "login": "GitGautamHub", "id": 127083244, "node_id": "U_kgDOB5Mi7A", "avatar_url": "https://avatars.githubusercontent.com/u/127083244?v=4", "gravatar_id": "", "url": "https://api.github.com/users/GitGautamHub", "html_url": "https://github.com/GitGautamHub", "followers_url": "https://api.github.com/users/GitGautamHub/followers", "following_url": "https://api.github.com/users/GitGautamHub/following{/other_user}", "gists_url": "https://api.github.com/users/GitGautamHub/gists{/gist_id}", "starred_url": "https://api.github.com/users/GitGautamHub/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/GitGautamHub/subscriptions", "organizations_url": "https://api.github.com/users/GitGautamHub/orgs", "repos_url": "https://api.github.com/users/GitGautamHub/repos", "events_url": "https://api.github.com/users/GitGautamHub/events{/privacy}", "received_events_url": "https://api.github.com/users/GitGautamHub/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-04-02T05:58:55
2025-10-25T00:04:02
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37196", "html_url": "https://github.com/huggingface/transformers/pull/37196", "diff_url": "https://github.com/huggingface/transformers/pull/37196.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37196.patch", "merged_at": null }
What does this PR do? This PR introduces separate limits for saving full checkpoints and model weights in the Trainer class. Currently, save_total_limit applies to both, which can lead to inefficient storage management. This change allows users to: Retain intermediate model weights for analysis without storing full checkpoints unnecessarily. Ensure that the latest full checkpoint is available for resuming training. Fixes #37195 Motivation The current implementation of save_total_limit does not differentiate between full checkpoints and model weights. This can be problematic when users require frequent weight saves but only need a limited number of full checkpoints. Introducing separate limits enhances flexibility and optimizes storage usage. Changes Made Updated _rotate_checkpoints() in trainer.py to support independent limits for full checkpoints and model weights. Introduced save_checkpoint_limit and save_model_weights_limit as configurable parameters. Updated relevant documentation and method docstrings. Tests Due to local environment issues, I was unable to run the tests locally. The CI/CD pipeline should validate the changes.
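A minimal sketch of the rotation behavior this PR proposes, assuming two independent limits. The helper and directory names are hypothetical; the real logic lives in `_rotate_checkpoints()` in `trainer.py`, and `save_checkpoint_limit` / `save_model_weights_limit` are the PR's proposed parameter names.

```python
def rotate(paths, limit):
    """Keep only the newest `limit` entries (paths sorted oldest -> newest).

    Hypothetical stand-in for a split _rotate_checkpoints: a limit of None
    or <= 0 means "keep everything", matching save_total_limit semantics.
    """
    if limit is None or limit <= 0:
        return list(paths)
    return list(paths)[-limit:]

# Saves partitioned by what they contain (illustrative names).
full_ckpts  = ["checkpoint-100", "checkpoint-200", "checkpoint-300"]
weight_only = ["weights-50", "weights-150", "weights-250", "weights-350"]

kept_full    = rotate(full_ckpts, limit=1)   # only the latest resumable checkpoint
kept_weights = rotate(weight_only, limit=3)  # more weight-only saves for analysis
```

The point of the split is visible above: a single `save_total_limit` would force both lists through the same cap, while separate limits let frequent weight saves coexist with a single full checkpoint for resuming.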
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37196/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37196/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/37195
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37195/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37195/comments
https://api.github.com/repos/huggingface/transformers/issues/37195/events
https://github.com/huggingface/transformers/issues/37195
2,965,115,827
I_kwDOCUB6oc6wvBOz
37,195
Different limits for saving only model weights and saving full checkpoints
{ "login": "Tim-Siu", "id": 61866948, "node_id": "MDQ6VXNlcjYxODY2OTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/61866948?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Tim-Siu", "html_url": "https://github.com/Tim-Siu", "followers_url": "https://api.github.com/users/Tim-Siu/followers", "following_url": "https://api.github.com/users/Tim-Siu/following{/other_user}", "gists_url": "https://api.github.com/users/Tim-Siu/gists{/gist_id}", "starred_url": "https://api.github.com/users/Tim-Siu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Tim-Siu/subscriptions", "organizations_url": "https://api.github.com/users/Tim-Siu/orgs", "repos_url": "https://api.github.com/users/Tim-Siu/repos", "events_url": "https://api.github.com/users/Tim-Siu/events{/privacy}", "received_events_url": "https://api.github.com/users/Tim-Siu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
[]
2025-04-02T04:02:24
2025-04-02T05:33:56
null
NONE
null
null
null
null
### Feature request I wonder if it is possible to have different limits for saving full checkpoints and model weights. Currently, only `save_total_limit` controls the limits. ### Motivation Sometimes people need intermediate weights for analysis, but saving full checkpoints takes too much space. `save_only_model` can be helpful, but we may still want the latest full checkpoint if we need to resume training. In this case, it would be great to have separate limits for saving full weights and checkpoints. ### Your contribution If this feature is welcome, I can submit a PR.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37195/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37195/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/37194
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37194/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37194/comments
https://api.github.com/repos/huggingface/transformers/issues/37194/events
https://github.com/huggingface/transformers/pull/37194
2,965,081,777
PR_kwDOCUB6oc6RBWIc
37,194
[Feat] Support npu in Qwen2Model._update_causal_mask
{ "login": "duanjunwen", "id": 54985467, "node_id": "MDQ6VXNlcjU0OTg1NDY3", "avatar_url": "https://avatars.githubusercontent.com/u/54985467?v=4", "gravatar_id": "", "url": "https://api.github.com/users/duanjunwen", "html_url": "https://github.com/duanjunwen", "followers_url": "https://api.github.com/users/duanjunwen/followers", "following_url": "https://api.github.com/users/duanjunwen/following{/other_user}", "gists_url": "https://api.github.com/users/duanjunwen/gists{/gist_id}", "starred_url": "https://api.github.com/users/duanjunwen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/duanjunwen/subscriptions", "organizations_url": "https://api.github.com/users/duanjunwen/orgs", "repos_url": "https://api.github.com/users/duanjunwen/repos", "events_url": "https://api.github.com/users/duanjunwen/events{/privacy}", "received_events_url": "https://api.github.com/users/duanjunwen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T03:30:34
2025-04-08T14:55:53
2025-04-08T14:39:33
CONTRIBUTOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37194", "html_url": "https://github.com/huggingface/transformers/pull/37194", "diff_url": "https://github.com/huggingface/transformers/pull/37194.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37194.patch", "merged_at": null }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface and @SunMarc - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "duanjunwen", "id": 54985467, "node_id": "MDQ6VXNlcjU0OTg1NDY3", "avatar_url": "https://avatars.githubusercontent.com/u/54985467?v=4", "gravatar_id": "", "url": "https://api.github.com/users/duanjunwen", "html_url": "https://github.com/duanjunwen", "followers_url": "https://api.github.com/users/duanjunwen/followers", "following_url": "https://api.github.com/users/duanjunwen/following{/other_user}", "gists_url": "https://api.github.com/users/duanjunwen/gists{/gist_id}", "starred_url": "https://api.github.com/users/duanjunwen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/duanjunwen/subscriptions", "organizations_url": "https://api.github.com/users/duanjunwen/orgs", "repos_url": "https://api.github.com/users/duanjunwen/repos", "events_url": "https://api.github.com/users/duanjunwen/events{/privacy}", "received_events_url": "https://api.github.com/users/duanjunwen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37194/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37194/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37193
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37193/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37193/comments
https://api.github.com/repos/huggingface/transformers/issues/37193/events
https://github.com/huggingface/transformers/pull/37193
2,965,060,082
PR_kwDOCUB6oc6RBRrF
37,193
[tests] fix mamba integration simple inference precision issue
{ "login": "faaany", "id": 24477841, "node_id": "MDQ6VXNlcjI0NDc3ODQx", "avatar_url": "https://avatars.githubusercontent.com/u/24477841?v=4", "gravatar_id": "", "url": "https://api.github.com/users/faaany", "html_url": "https://github.com/faaany", "followers_url": "https://api.github.com/users/faaany/followers", "following_url": "https://api.github.com/users/faaany/following{/other_user}", "gists_url": "https://api.github.com/users/faaany/gists{/gist_id}", "starred_url": "https://api.github.com/users/faaany/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/faaany/subscriptions", "organizations_url": "https://api.github.com/users/faaany/orgs", "repos_url": "https://api.github.com/users/faaany/repos", "events_url": "https://api.github.com/users/faaany/events{/privacy}", "received_events_url": "https://api.github.com/users/faaany/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T03:09:17
2025-04-03T08:38:04
2025-04-03T08:38:04
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37193", "html_url": "https://github.com/huggingface/transformers/pull/37193", "diff_url": "https://github.com/huggingface/transformers/pull/37193.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37193.patch", "merged_at": "2025-04-03T08:38:04" }
## What does this PR do? This test fails both on CUDA and XPU. Root Cause: The model's output sequence assertion passes, but the logits assertion fails. This indicates numerical precision issues during inference with torch.float16, since the model is loaded with `torch_dtype=torch.float16`. To verify this assumption, I changed torch.float16 to torch.float32, and the tensor outputs on XPU and CUDA are equal to each other. For tasks requiring high precision, such as validation, testing or debugging, `torch.float32` should be used instead of `torch.float16`. Fix: temporarily switching to `torch.float32` ensures that the test is not affected by these precision issues and focuses on validating the model's correctness. After the fix, it works both on CUDA and XPU: ``` PASSED tests/models/mamba/test_modeling_mamba.py::MambaIntegrationTests::test_simple_generate_1_cpu PASSED tests/models/mamba/test_modeling_mamba.py::MambaIntegrationTests::test_simple_generate_0_cuda PASSED tests/models/mamba/test_modeling_mamba.py::MambaIntegrationTests::test_simple_generate_0_xpu ```
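A small sketch of the precision gap behind this fix, unrelated to the Mamba model itself: float16 carries only a 10-bit mantissa (~3 decimal digits), so contributions that survive in float32/float64 can vanish entirely. Python's `struct` module supports the IEEE-754 half-precision format (`'e'`), which lets us simulate the rounding without torch.

```python
import struct

def to_fp16(x: float) -> float:
    """Round-trip a Python float through IEEE-754 half precision."""
    return struct.unpack('e', struct.pack('e', x))[0]

a = 1.0009765625   # exactly 1 + 2**-10: representable in fp16
b = 0.0001         # small contribution, well within fp32 precision

# In double precision the sum keeps b's contribution...
exact = a + b

# ...but in fp16 the spacing between values near 1.0 is 2**-10 ≈ 0.00098,
# so adding b rounds straight back to a and the contribution is lost.
half = to_fp16(to_fp16(a) + to_fp16(b))
print(half == to_fp16(a))  # True: b vanished below fp16 resolution
```

This is the same mechanism that makes a logits comparison flaky under `torch_dtype=torch.float16` even when the decoded sequence still matches.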
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37193/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37193/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37192
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37192/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37192/comments
https://api.github.com/repos/huggingface/transformers/issues/37192/events
https://github.com/huggingface/transformers/pull/37192
2,965,030,642
PR_kwDOCUB6oc6RBLar
37,192
Updated model card for Qwen2
{ "login": "Aravind-11", "id": 42345018, "node_id": "MDQ6VXNlcjQyMzQ1MDE4", "avatar_url": "https://avatars.githubusercontent.com/u/42345018?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Aravind-11", "html_url": "https://github.com/Aravind-11", "followers_url": "https://api.github.com/users/Aravind-11/followers", "following_url": "https://api.github.com/users/Aravind-11/following{/other_user}", "gists_url": "https://api.github.com/users/Aravind-11/gists{/gist_id}", "starred_url": "https://api.github.com/users/Aravind-11/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Aravind-11/subscriptions", "organizations_url": "https://api.github.com/users/Aravind-11/orgs", "repos_url": "https://api.github.com/users/Aravind-11/repos", "events_url": "https://api.github.com/users/Aravind-11/events{/privacy}", "received_events_url": "https://api.github.com/users/Aravind-11/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T02:46:57
2025-04-03T01:10:41
2025-04-03T01:10:41
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37192", "html_url": "https://github.com/huggingface/transformers/pull/37192", "diff_url": "https://github.com/huggingface/transformers/pull/37192.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37192.patch", "merged_at": "2025-04-03T01:10:41" }
# What does this PR do? - updated model card for Qwen2. #36979 ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? https://github.com/huggingface/transformers/issues/36979 ## Who can review? @stevhliu
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37192/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37192/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37191
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37191/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37191/comments
https://api.github.com/repos/huggingface/transformers/issues/37191/events
https://github.com/huggingface/transformers/pull/37191
2,964,981,712
PR_kwDOCUB6oc6RBAcs
37,191
Add Fast Image Processor for VideoMAE
{ "login": "miguelscarv", "id": 83602448, "node_id": "MDQ6VXNlcjgzNjAyNDQ4", "avatar_url": "https://avatars.githubusercontent.com/u/83602448?v=4", "gravatar_id": "", "url": "https://api.github.com/users/miguelscarv", "html_url": "https://github.com/miguelscarv", "followers_url": "https://api.github.com/users/miguelscarv/followers", "following_url": "https://api.github.com/users/miguelscarv/following{/other_user}", "gists_url": "https://api.github.com/users/miguelscarv/gists{/gist_id}", "starred_url": "https://api.github.com/users/miguelscarv/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/miguelscarv/subscriptions", "organizations_url": "https://api.github.com/users/miguelscarv/orgs", "repos_url": "https://api.github.com/users/miguelscarv/repos", "events_url": "https://api.github.com/users/miguelscarv/events{/privacy}", "received_events_url": "https://api.github.com/users/miguelscarv/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-04-02T02:21:36
2025-04-23T00:48:51
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37191", "html_url": "https://github.com/huggingface/transformers/pull/37191", "diff_url": "https://github.com/huggingface/transformers/pull/37191.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37191.patch", "merged_at": null }
# What does this PR do? This PR implements a FastImageProcessor for the VideoMAE model using BaseImageProcessorFast. Related #36978 Note: 1. I was not able to run `test_can_compile_fast_image_processor` as I'm using an M1 and don't have access to an NVIDIA GPU. 2. The test implemented in the method `test_slow_fast_equivalence_batched` is only able to pass when all images in the batch have a fixed width and height (by setting `equal_resolution=True` in the `self.image_processor_tester.prepare_image_inputs`). If the width and height vary across videos, then there is a chance of this test failing, i.e. it doesn't fail every time. Other than that, all tests passed. ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? @yonigozlan Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37191/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37191/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/37190
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37190/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37190/comments
https://api.github.com/repos/huggingface/transformers/issues/37190/events
https://github.com/huggingface/transformers/pull/37190
2,964,885,754
PR_kwDOCUB6oc6RAsGt
37,190
fix test device spec relative path importing issue
{ "login": "yao-matrix", "id": 7245027, "node_id": "MDQ6VXNlcjcyNDUwMjc=", "avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yao-matrix", "html_url": "https://github.com/yao-matrix", "followers_url": "https://api.github.com/users/yao-matrix/followers", "following_url": "https://api.github.com/users/yao-matrix/following{/other_user}", "gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}", "starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions", "organizations_url": "https://api.github.com/users/yao-matrix/orgs", "repos_url": "https://api.github.com/users/yao-matrix/repos", "events_url": "https://api.github.com/users/yao-matrix/events{/privacy}", "received_events_url": "https://api.github.com/users/yao-matrix/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T01:11:57
2025-04-07T03:42:57
2025-04-04T16:22:55
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37190", "html_url": "https://github.com/huggingface/transformers/pull/37190", "diff_url": "https://github.com/huggingface/transformers/pull/37190.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37190.patch", "merged_at": "2025-04-04T16:22:55" }
**Symptom** When running the unit test `pytest -rA tests/deepspeed/test_deepspeed.py::TestDeepSpeedWithLauncher::test_clm_from_config_zero3_fp16` on XPU with `spec_xpu.py`, following this [doc](https://huggingface.co/docs/transformers/en/testing), it errors with the log below > stderr: Traceback (most recent call last): > stderr: File "/.tests/transformers/examples/pytorch/language-modeling/run_clm.py", line 51, in <module> > stderr: from transformers.testing_utils import CaptureLogger > stderr: File "/.tests/transformers/src/transformers/testing_utils.py", line 2869, in <module> > stderr: device_spec_module = importlib.import_module(import_name) > stderr: File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module > stderr: return _bootstrap._gcd_import(name[level:], package, level) > stderr: ModuleNotFoundError: No module named 'spec_xpu' **Root Cause** Since `export TRANSFORMERS_TEST_DEVICE_SPEC="spec_xpu.py"` cannot include an absolute path by design of the testing setup, there is an implicit assumption that pytest cases run from the transformers root directory. In this deepspeed case, however, a process is launched by deepspeed in the `tests/deepspeed` directory, which breaks that assumption: the `spec_xpu` module cannot be found in the current folder (`tests/deepspeed`) or on `PYTHONPATH`, so the error arises. **My Fix** Append `device_spec_dir` to `sys.path`.
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37190/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37190/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37189
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37189/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37189/comments
https://api.github.com/repos/huggingface/transformers/issues/37189/events
https://github.com/huggingface/transformers/issues/37189
2,964,852,659
I_kwDOCUB6oc6wuA-z
37,189
Bug when using StaticCache in Qwen2.5 Inference with custom inputs_embeds and attention_masks
{ "login": "matthewdm0816", "id": 2143312, "node_id": "MDQ6VXNlcjIxNDMzMTI=", "avatar_url": "https://avatars.githubusercontent.com/u/2143312?v=4", "gravatar_id": "", "url": "https://api.github.com/users/matthewdm0816", "html_url": "https://github.com/matthewdm0816", "followers_url": "https://api.github.com/users/matthewdm0816/followers", "following_url": "https://api.github.com/users/matthewdm0816/following{/other_user}", "gists_url": "https://api.github.com/users/matthewdm0816/gists{/gist_id}", "starred_url": "https://api.github.com/users/matthewdm0816/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/matthewdm0816/subscriptions", "organizations_url": "https://api.github.com/users/matthewdm0816/orgs", "repos_url": "https://api.github.com/users/matthewdm0816/repos", "events_url": "https://api.github.com/users/matthewdm0816/events{/privacy}", "received_events_url": "https://api.github.com/users/matthewdm0816/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-04-02T00:41:08
2025-05-11T08:03:10
2025-05-11T08:03:10
NONE
null
null
null
null
### System Info - `transformers` version: 4.50.3 - Platform: Linux-6.8.0-52-generic-x86_64-with-glibc2.35 - Python version: 3.12.9 - Huggingface_hub version: 0.29.3 - Safetensors version: 0.5.3 - Accelerate version: 1.5.2 - Accelerate config: - compute_environment: LOCAL_MACHINE - distributed_type: MULTI_GPU - mixed_precision: bf16 - use_cpu: False - debug: False - num_processes: 8 - machine_rank: 0 - num_machines: 1 - gpu_ids: all - rdzv_backend: static - same_network: True - main_training_function: main - enable_cpu_affinity: False - downcast_bf16: no - tpu_use_cluster: False - tpu_use_sudo: False - tpu_env: [] - DeepSpeed version: not installed - PyTorch version (GPU?): 2.6.0+cu124 (True) - Tensorflow version (GPU?): not installed (NA) - Flax version (CPU?/GPU?/TPU?): not installed (NA) - Jax version: not installed - JaxLib version: not installed - Using distributed or parallel set-up in script?: <fill in> - Using GPU in script?: <fill in> - GPU type: NVIDIA GeForce RTX 3090 ### Who can help? _No response_ ### Information - [ ] The official example scripts - [x] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [x] My own task or dataset (give details below) ### Reproduction When using Qwen2.5 models with custom `inputs_embeds` and `attention_mask` together with static KV cache implementation and flash attention 2, the generation fails with error: ``` File "/hd1/xxx/miniconda3/envs/vllm/lib/python3.12/site-packages/transformers/models/qwen2/modeling_qwen2.py", line 623, in _update_causal_mask raise ValueError( ValueError: You are attempting to perform batched generation with padding_side='right' this may lead to unexpected behaviour for Flash Attention version of Qwen2. Make sure to call `tokenizer.padding_side = 'left'` before tokenizing the input. 
``` The issue can be reproduced with the following minimal example: ```python import torch from transformers import AutoModelForCausalLM, AutoTokenizer def reproduce_bug(): # Load Qwen2.5 model model_name = "Qwen/Qwen2.5-1.5B-Instruct" tokenizer = AutoTokenizer.from_pretrained(model_name) model = AutoModelForCausalLM.from_pretrained( model_name, torch_dtype=torch.bfloat16, attn_implementation="flash_attention_2", ) model = model.cuda() tokenizer.padding_side = "left" # Set padding to the left side model.generation_config.cache_implementation = "static" # Use static cache model.generation_config.use_cache = True # Enable KV cache text = "Hello, my name is" input_ids = tokenizer(text, return_tensors="pt").input_ids.to(model.device) # Get some inputs_embeds with torch.no_grad(): inputs_embeds = model.get_input_embeddings()(input_ids) # Create custom attention_mask batch_size = 2 seq_length = input_ids.shape[1] # Duplicate embeddings to create a batch batched_embeds = inputs_embeds.repeat(batch_size, 1, 1) # Create custom attention_mask attention_masks = torch.ones((batch_size, seq_length), dtype=torch.long, device=model.device) # Try to generate using custom inputs_embeds and attention_mask outputs = model.generate( inputs_embeds=batched_embeds, attention_mask=attention_masks, pad_token_id=tokenizer.pad_token_id, eos_token_id=tokenizer.eos_token_id, ) if __name__ == "__main__": reproduce_bug() ``` ### Expected behavior The model generates successfully.
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers", "following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}", "gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}", "starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions", "organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs", "repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos", "events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}", "received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events", "type": "Bot", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37189/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37189/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37188
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37188/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37188/comments
https://api.github.com/repos/huggingface/transformers/issues/37188/events
https://github.com/huggingface/transformers/pull/37188
2,964,845,546
PR_kwDOCUB6oc6RAjsm
37,188
allow custom head_dim for qwen2_moe
{ "login": "bzantium", "id": 19511788, "node_id": "MDQ6VXNlcjE5NTExNzg4", "avatar_url": "https://avatars.githubusercontent.com/u/19511788?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bzantium", "html_url": "https://github.com/bzantium", "followers_url": "https://api.github.com/users/bzantium/followers", "following_url": "https://api.github.com/users/bzantium/following{/other_user}", "gists_url": "https://api.github.com/users/bzantium/gists{/gist_id}", "starred_url": "https://api.github.com/users/bzantium/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bzantium/subscriptions", "organizations_url": "https://api.github.com/users/bzantium/orgs", "repos_url": "https://api.github.com/users/bzantium/repos", "events_url": "https://api.github.com/users/bzantium/events{/privacy}", "received_events_url": "https://api.github.com/users/bzantium/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-02T00:33:43
2025-06-05T00:24:55
2025-06-04T12:27:30
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37188", "html_url": "https://github.com/huggingface/transformers/pull/37188", "diff_url": "https://github.com/huggingface/transformers/pull/37188.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37188.patch", "merged_at": "2025-06-04T12:27:30" }
# What does this PR do? Allow a custom `head_dim` for qwen2_moe (qwen2, qwen3, and qwen3_moe already support this). Fixes #37187 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? @ArthurZucker
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37188/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37188/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37187
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37187/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37187/comments
https://api.github.com/repos/huggingface/transformers/issues/37187/events
https://github.com/huggingface/transformers/issues/37187
2,964,840,618
I_kwDOCUB6oc6wt-Cq
37,187
allow custom head_dim for qwen2_moe
{ "login": "bzantium", "id": 19511788, "node_id": "MDQ6VXNlcjE5NTExNzg4", "avatar_url": "https://avatars.githubusercontent.com/u/19511788?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bzantium", "html_url": "https://github.com/bzantium", "followers_url": "https://api.github.com/users/bzantium/followers", "following_url": "https://api.github.com/users/bzantium/following{/other_user}", "gists_url": "https://api.github.com/users/bzantium/gists{/gist_id}", "starred_url": "https://api.github.com/users/bzantium/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bzantium/subscriptions", "organizations_url": "https://api.github.com/users/bzantium/orgs", "repos_url": "https://api.github.com/users/bzantium/repos", "events_url": "https://api.github.com/users/bzantium/events{/privacy}", "received_events_url": "https://api.github.com/users/bzantium/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
closed
false
null
[]
null
[]
2025-04-02T00:29:26
2025-06-04T12:27:31
2025-06-04T12:27:31
CONTRIBUTOR
null
null
null
null
### Feature request Allow using a custom head_dim for qwen2_moe (qwen2, qwen3, and qwen3_moe already support this). ### Motivation I want to train a model with a custom head_dim: not `self.head_dim = config.hidden_size // config.num_attention_heads` but `self.head_dim = getattr(config, "head_dim", config.hidden_size // config.num_attention_heads)` ### Your contribution Slightly change the modeling code to allow a custom head_dim.
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37187/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37187/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37186
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37186/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37186/comments
https://api.github.com/repos/huggingface/transformers/issues/37186/events
https://github.com/huggingface/transformers/issues/37186
2,964,597,222
I_kwDOCUB6oc6wtCnm
37,186
Qwen FSDP model training hangs when some batches do not contain images
{ "login": "gbarello-uipath", "id": 48561156, "node_id": "MDQ6VXNlcjQ4NTYxMTU2", "avatar_url": "https://avatars.githubusercontent.com/u/48561156?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gbarello-uipath", "html_url": "https://github.com/gbarello-uipath", "followers_url": "https://api.github.com/users/gbarello-uipath/followers", "following_url": "https://api.github.com/users/gbarello-uipath/following{/other_user}", "gists_url": "https://api.github.com/users/gbarello-uipath/gists{/gist_id}", "starred_url": "https://api.github.com/users/gbarello-uipath/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gbarello-uipath/subscriptions", "organizations_url": "https://api.github.com/users/gbarello-uipath/orgs", "repos_url": "https://api.github.com/users/gbarello-uipath/repos", "events_url": "https://api.github.com/users/gbarello-uipath/events{/privacy}", "received_events_url": "https://api.github.com/users/gbarello-uipath/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-04-01T21:19:23
2025-05-29T00:25:20
2025-05-11T08:03:13
CONTRIBUTOR
null
null
null
null
### System Info - `transformers` version: 4.49.0 - Platform: Linux-6.8.0-1025-gcp-x86_64-with-glibc2.39 - Python version: 3.11.10 - Huggingface_hub version: 0.29.3 - Safetensors version: 0.5.3 - Accelerate version: 0.34.2 - Accelerate config: - compute_environment: LOCAL_MACHINE - distributed_type: MULTI_GPU - mixed_precision: no - use_cpu: False - debug: False - num_processes: 8 - machine_rank: 0 - num_machines: 1 - gpu_ids: all - rdzv_backend: static - same_network: True - main_training_function: main - enable_cpu_affinity: False - downcast_bf16: no - tpu_use_cluster: False - tpu_use_sudo: False - tpu_env: [] - DeepSpeed version: not installed - PyTorch version (GPU?): 2.6.0+cu124 (True) - Tensorflow version (GPU?): not installed (NA) - Flax version (CPU?/GPU?/TPU?): not installed (NA) - Jax version: not installed - JaxLib version: not installed - Using distributed or parallel set-up in script?: <fill in> - Using GPU in script?: <fill in> - GPU type: NVIDIA H100 80GB HBM3 ### Who can help? @amyeroberts @qubvel ### Information - [ ] The official example scripts - [x] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [x] My own task or dataset (give details below) ### Reproduction I suspect this is because the vision transformer is not called for the batch without images and thus the FSDP gather/scatter ops are not called in that process. Though this is a bit strange as when I ran the following script with a loop around the forward/backward calls it ran through to the end and only hung on the _final_ backward call. 
The script at the bottom of this comment reproduces this behavior when run with the following command: ``` CUDA_VISIBLE_DEVICES=0,1 accelerate launch qwen_multimodal_test.py --run_style mismatch ``` using the following accelerate config: ``` compute_environment: LOCAL_MACHINE debug: false distributed_type: FSDP downcast_bf16: 'no' enable_cpu_affinity: false fsdp_config: fsdp_activation_checkpointing: false fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP fsdp_backward_prefetch: BACKWARD_PRE fsdp_cpu_ram_efficient_loading: false fsdp_forward_prefetch: true fsdp_offload_params: false fsdp_sharding_strategy: FULL_SHARD fsdp_state_dict_type: FULL_STATE_DICT fsdp_sync_module_states: false fsdp_transformer_layer_cls_to_wrap: 'Qwen2VLDecoderLayer,Qwen2VLVisionBlock' fsdp_use_orig_params: false machine_rank: 0 main_training_function: main mixed_precision: 'no' num_machines: 1 num_processes: 2 rdzv_backend: static same_network: true tpu_env: [] tpu_use_cluster: false tpu_use_sudo: false use_cpu: false ``` ``` import os import torch import torch.distributed as dist from enum import StrEnum, auto from accelerate import Accelerator from transformers import AutoProcessor, Qwen2VLForConditionalGeneration from PIL import Image import io from simple_parsing import parse from dataclasses import dataclass from typing import Literal class RunStyle(StrEnum): image = auto() text = auto() mismatch = auto() @dataclass class Args: run_style: RunStyle """ If "image" all processes will get image inputs If "text" all processes will get text only If "mismatch" one process will get no image """ def setup_distributed(): os.environ["MASTER_ADDR"] = "localhost" os.environ["MASTER_PORT"] = "29500" dist.init_process_group(backend="nccl", init_method="env://") def cleanup_distributed(): dist.destroy_process_group() def print_pretty(message: str): rank = dist.get_rank() print(f"Rank {rank}: {message}") def test_qwen_multimodal_fsdp(run_style: RunStyle): # Setup distributed environment 
print_pretty("starting") # Get rank and world size rank = dist.get_rank() # Initialize accelerator accelerator = Accelerator() # Load model and tokenizer model_name = "Qwen/Qwen2-VL-7B" # Replace with your actual model path model = Qwen2VLForConditionalGeneration.from_pretrained( model_name, torch_dtype=torch.float16, device_map=None, ) processor = AutoProcessor.from_pretrained(model_name) # Prepare model with FSDP model = accelerator.prepare(model) # Create example image (only for rank 0) if run_style == RunStyle.image or (run_style == RunStyle.mismatch and rank != 0): # Create a dummy image (1x1 pixel) text = "test this image <|vision_start|><|image_pad|><|vision_end|>" image = [Image.new('RGB', (100, 100), color='red')] elif run_style == RunStyle.text or (run_style == RunStyle.mismatch and rank == 0): text = "test this image" image = None else: raise ValueError() # Prepare inputs inputs = processor( text = text, images = image, return_tensors="pt", ) # Move inputs to device inputs = {k: v.to(accelerator.device) for k, v in inputs.items()} inputs["labels"] = inputs["input_ids"].clone() # Forward pass print_pretty("Forward Pass") outputs = model(**inputs) # Backward pass (with dummy loss) loss = outputs.loss print_pretty("Backward Pass") accelerator.backward(loss) print_pretty("Backward Done!") # Cleanup cleanup_distributed() if __name__ == "__main__": args = parse(config_class=Args) setup_distributed() if dist.get_rank() == 0: print(f"Running with run style {args.run_style}") test_qwen_multimodal_fsdp(run_style = args.run_style) ``` ### Expected behavior I expect the model to not hang during the forward/backward passes
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers", "following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}", "gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}", "starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions", "organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs", "repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos", "events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}", "received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events", "type": "Bot", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37186/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37186/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37185
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37185/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37185/comments
https://api.github.com/repos/huggingface/transformers/issues/37185/events
https://github.com/huggingface/transformers/pull/37185
2,964,584,576
PR_kwDOCUB6oc6Q_rSB
37,185
Fix setting FLASH_ATTENTION_DETERMINISTIC after importing
{ "login": "HollowMan6", "id": 43995067, "node_id": "MDQ6VXNlcjQzOTk1MDY3", "avatar_url": "https://avatars.githubusercontent.com/u/43995067?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HollowMan6", "html_url": "https://github.com/HollowMan6", "followers_url": "https://api.github.com/users/HollowMan6/followers", "following_url": "https://api.github.com/users/HollowMan6/following{/other_user}", "gists_url": "https://api.github.com/users/HollowMan6/gists{/gist_id}", "starred_url": "https://api.github.com/users/HollowMan6/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/HollowMan6/subscriptions", "organizations_url": "https://api.github.com/users/HollowMan6/orgs", "repos_url": "https://api.github.com/users/HollowMan6/repos", "events_url": "https://api.github.com/users/HollowMan6/events{/privacy}", "received_events_url": "https://api.github.com/users/HollowMan6/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T21:11:28
2025-06-02T09:08:52
2025-06-02T09:08:21
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37185", "html_url": "https://github.com/huggingface/transformers/pull/37185", "diff_url": "https://github.com/huggingface/transformers/pull/37185.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37185.patch", "merged_at": "2025-06-02T09:08:21" }
# What does this PR do? `transformers.enable_full_determinism` enables deterministic flash attention via `FLASH_ATTENTION_DETERMINISTIC` https://github.com/huggingface/transformers/blob/800510c67bfc5cedd0bb7635648a07f39719be43/src/transformers/trainer_utils.py#L79 However, the current check uses a global variable `deterministic_g`, which reads the environment variable at import time. This causes issues because users may call `transformers.enable_full_determinism` after `transformers.modeling_flash_attention_utils` has already been imported. This behavior was introduced in https://github.com/huggingface/transformers/pull/33932/files#r1806668579 to fix a graph break. This PR fixes the issue by delaying the environment variable check until the first execution of `_flash_attention_forward`, so the problem is resolved without reintroducing a graph break. ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). 
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. @ani300 @ArthurZucker
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37185/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37185/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37184
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37184/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37184/comments
https://api.github.com/repos/huggingface/transformers/issues/37184/events
https://github.com/huggingface/transformers/pull/37184
2,964,354,020
PR_kwDOCUB6oc6Q-4vz
37,184
Update falcon model card
{ "login": "ricalanis", "id": 3820751, "node_id": "MDQ6VXNlcjM4MjA3NTE=", "avatar_url": "https://avatars.githubusercontent.com/u/3820751?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ricalanis", "html_url": "https://github.com/ricalanis", "followers_url": "https://api.github.com/users/ricalanis/followers", "following_url": "https://api.github.com/users/ricalanis/following{/other_user}", "gists_url": "https://api.github.com/users/ricalanis/gists{/gist_id}", "starred_url": "https://api.github.com/users/ricalanis/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ricalanis/subscriptions", "organizations_url": "https://api.github.com/users/ricalanis/orgs", "repos_url": "https://api.github.com/users/ricalanis/repos", "events_url": "https://api.github.com/users/ricalanis/events{/privacy}", "received_events_url": "https://api.github.com/users/ricalanis/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T19:11:31
2025-04-03T00:30:37
2025-04-03T00:30:37
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37184", "html_url": "https://github.com/huggingface/transformers/pull/37184", "diff_url": "https://github.com/huggingface/transformers/pull/37184.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37184.patch", "merged_at": "2025-04-03T00:30:37" }
Fixes #36979 * Updated the Falcon model card * Did not update Falcon3 as it is not explicitly listed, but can do it also * Did not include the attention visualizer as I was not able to implement it for Falcon models in a straightforward way. * First contrib, thank you for your patience <3 ## Before submitting - [X] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [X] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [X] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? NA
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37184/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37184/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37183
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37183/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37183/comments
https://api.github.com/repos/huggingface/transformers/issues/37183/events
https://github.com/huggingface/transformers/issues/37183
2,964,303,016
I_kwDOCUB6oc6wr6yo
37,183
TapasTokenizer Produces All Zero token_type_ids Even with Tutorial Data
{ "login": "optionsraghu", "id": 15772927, "node_id": "MDQ6VXNlcjE1NzcyOTI3", "avatar_url": "https://avatars.githubusercontent.com/u/15772927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/optionsraghu", "html_url": "https://github.com/optionsraghu", "followers_url": "https://api.github.com/users/optionsraghu/followers", "following_url": "https://api.github.com/users/optionsraghu/following{/other_user}", "gists_url": "https://api.github.com/users/optionsraghu/gists{/gist_id}", "starred_url": "https://api.github.com/users/optionsraghu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/optionsraghu/subscriptions", "organizations_url": "https://api.github.com/users/optionsraghu/orgs", "repos_url": "https://api.github.com/users/optionsraghu/repos", "events_url": "https://api.github.com/users/optionsraghu/events{/privacy}", "received_events_url": "https://api.github.com/users/optionsraghu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T18:49:36
2025-06-04T08:03:02
2025-06-04T08:03:02
NONE
null
null
null
null
Dear Hugging Face Transformers Team, I am encountering a persistent issue with TapasTokenizer (version 4.50.3) where it consistently produces token_type_ids filled with zeros, even when tokenizing a Pandas DataFrame directly recreated from the first table example in the official fine-tuning tutorial for TAPAS on SQA by Niels Rogge (https://github.com/NielsRogge/Transformers-Tutorials/blob/master/TAPAS/Fine_tuning_TapasForQuestionAnswering_on_SQA.ipynb). I have tried various steps, including: using both google/tapas-base and google/tapas-base-finetuned-wtq tokenizers; ensuring the Pandas DataFrame is entirely of string type; explicitly passing headers; loading table data from a list of dictionaries and a CSV file; and testing with a minimal, simple string-based DataFrame (which did produce non-zero token_type_ids). The issue persists specifically with DataFrames that have a structure and content similar to the tutorial example and my own data. I am using Python [Your Python Version] on [Your Operating System] with transformers version 4.50.3. I would appreciate any insights or assistance in resolving this issue. I can provide the code snippets I've been using for testing. Thank you for your time and effort. Sincerely, Raghu
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers", "following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}", "gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}", "starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions", "organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs", "repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos", "events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}", "received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events", "type": "Bot", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37183/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37183/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37182
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37182/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37182/comments
https://api.github.com/repos/huggingface/transformers/issues/37182/events
https://github.com/huggingface/transformers/pull/37182
2,964,290,713
PR_kwDOCUB6oc6Q-q4D
37,182
Add Fast Image Processor for PoolFormer
{ "login": "rootonchair", "id": 23548268, "node_id": "MDQ6VXNlcjIzNTQ4MjY4", "avatar_url": "https://avatars.githubusercontent.com/u/23548268?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rootonchair", "html_url": "https://github.com/rootonchair", "followers_url": "https://api.github.com/users/rootonchair/followers", "following_url": "https://api.github.com/users/rootonchair/following{/other_user}", "gists_url": "https://api.github.com/users/rootonchair/gists{/gist_id}", "starred_url": "https://api.github.com/users/rootonchair/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rootonchair/subscriptions", "organizations_url": "https://api.github.com/users/rootonchair/orgs", "repos_url": "https://api.github.com/users/rootonchair/repos", "events_url": "https://api.github.com/users/rootonchair/events{/privacy}", "received_events_url": "https://api.github.com/users/rootonchair/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T18:43:42
2025-04-23T19:55:33
2025-04-23T19:55:33
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37182", "html_url": "https://github.com/huggingface/transformers/pull/37182", "diff_url": "https://github.com/huggingface/transformers/pull/37182.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37182.patch", "merged_at": "2025-04-23T19:55:33" }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Related #36978 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface and @SunMarc - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37182/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37182/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37181
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37181/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37181/comments
https://api.github.com/repos/huggingface/transformers/issues/37181/events
https://github.com/huggingface/transformers/issues/37181
2,964,247,074
I_kwDOCUB6oc6wrtIi
37,181
Bug in Paligemma usage docs for v4.50.3
{ "login": "EricCousineau-TRI", "id": 26719449, "node_id": "MDQ6VXNlcjI2NzE5NDQ5", "avatar_url": "https://avatars.githubusercontent.com/u/26719449?v=4", "gravatar_id": "", "url": "https://api.github.com/users/EricCousineau-TRI", "html_url": "https://github.com/EricCousineau-TRI", "followers_url": "https://api.github.com/users/EricCousineau-TRI/followers", "following_url": "https://api.github.com/users/EricCousineau-TRI/following{/other_user}", "gists_url": "https://api.github.com/users/EricCousineau-TRI/gists{/gist_id}", "starred_url": "https://api.github.com/users/EricCousineau-TRI/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/EricCousineau-TRI/subscriptions", "organizations_url": "https://api.github.com/users/EricCousineau-TRI/orgs", "repos_url": "https://api.github.com/users/EricCousineau-TRI/repos", "events_url": "https://api.github.com/users/EricCousineau-TRI/events{/privacy}", "received_events_url": "https://api.github.com/users/EricCousineau-TRI/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T18:22:50
2025-04-02T16:24:12
2025-04-02T16:24:11
NONE
null
null
null
null
The following doc version has a minor bug in its usage: https://github.com/huggingface/transformers/blob/v4.50.3/docs/source/en/model_doc/paligemma.md#single-image-inference At the time of writing, this is the default version of the docs people come across via https://huggingface.co/docs/transformers/en/model_doc/paligemma This has a bug where it slices the decoded string by the tokenized input length: ```py print(processor.decode(output[0], skip_special_tokens=True)[inputs.input_ids.shape[1]: ]) ``` but it should actually slice by the text input length: ```py print(processor.decode(output[0], skip_special_tokens=True)[len(prompt): ]) ``` Found this out by cross-referencing the HF spaces example: https://huggingface.co/spaces/big-vision/paligemma-hf/blob/d914d44/app.py#L38 Not sure if it's worth patching the existing docs, having a new minor release, or just closing this out as a note for others.
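To make the bug concrete, here is a toy sketch (all values hypothetical, no model needed) of why slicing the *decoded string* by the *token count* misbehaves: `inputs.input_ids.shape[1]` counts tokens (including image tokens), while string slicing counts characters, and the two are unrelated.

```python
# Hypothetical PaliGemma-style values for illustration only.
prompt = "caption en"
# Suppose decoding the full generated sequence (prompt + answer) yields:
decoded = prompt + " a cow standing in a field"
# A realistic prompt token count: hundreds of image tokens plus text tokens.
num_prompt_tokens = 260  # stand-in for inputs.input_ids.shape[1]

wrong = decoded[num_prompt_tokens:]  # slices by token count, not characters
right = decoded[len(prompt):]        # slices by character length of the prompt

print(repr(wrong))  # '' -- 260 exceeds the string length, so everything is cut
print(repr(right))  # ' a cow standing in a field'
```

With a long multimodal prefix the token count usually exceeds the decoded string's length, so the buggy slice silently returns an empty (or truncated) string.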
{ "login": "EricCousineau-TRI", "id": 26719449, "node_id": "MDQ6VXNlcjI2NzE5NDQ5", "avatar_url": "https://avatars.githubusercontent.com/u/26719449?v=4", "gravatar_id": "", "url": "https://api.github.com/users/EricCousineau-TRI", "html_url": "https://github.com/EricCousineau-TRI", "followers_url": "https://api.github.com/users/EricCousineau-TRI/followers", "following_url": "https://api.github.com/users/EricCousineau-TRI/following{/other_user}", "gists_url": "https://api.github.com/users/EricCousineau-TRI/gists{/gist_id}", "starred_url": "https://api.github.com/users/EricCousineau-TRI/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/EricCousineau-TRI/subscriptions", "organizations_url": "https://api.github.com/users/EricCousineau-TRI/orgs", "repos_url": "https://api.github.com/users/EricCousineau-TRI/repos", "events_url": "https://api.github.com/users/EricCousineau-TRI/events{/privacy}", "received_events_url": "https://api.github.com/users/EricCousineau-TRI/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37181/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37181/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37180
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37180/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37180/comments
https://api.github.com/repos/huggingface/transformers/issues/37180/events
https://github.com/huggingface/transformers/pull/37180
2,964,194,336
PR_kwDOCUB6oc6Q-Vyy
37,180
Add ImageProcessorFast to BiT processor
{ "login": "Yann-CV", "id": 54800486, "node_id": "MDQ6VXNlcjU0ODAwNDg2", "avatar_url": "https://avatars.githubusercontent.com/u/54800486?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Yann-CV", "html_url": "https://github.com/Yann-CV", "followers_url": "https://api.github.com/users/Yann-CV/followers", "following_url": "https://api.github.com/users/Yann-CV/following{/other_user}", "gists_url": "https://api.github.com/users/Yann-CV/gists{/gist_id}", "starred_url": "https://api.github.com/users/Yann-CV/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Yann-CV/subscriptions", "organizations_url": "https://api.github.com/users/Yann-CV/orgs", "repos_url": "https://api.github.com/users/Yann-CV/repos", "events_url": "https://api.github.com/users/Yann-CV/events{/privacy}", "received_events_url": "https://api.github.com/users/Yann-CV/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T18:01:32
2025-04-14T15:07:48
2025-04-14T15:07:48
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37180", "html_url": "https://github.com/huggingface/transformers/pull/37180", "diff_url": "https://github.com/huggingface/transformers/pull/37180.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37180.patch", "merged_at": "2025-04-14T15:07:48" }
# What does this PR do? Following https://github.com/huggingface/transformers/issues/36978: This pull request introduces a new fast image processor class for the BiT model along with the necessary updates to integrate it into the existing codebase. The most important changes include the addition of the `BitImageProcessorFast` class, modifications to the initialization and import structures, and updates to the documentation and tests. ### New Feature Addition: * [`src/transformers/models/bit/image_processing_bit_fast.py`](diffhunk://#diff-351beedf180b467145940c5f6e809b0ff034eb93a564d49ec7da4bb4003bd0d7R1-R59): Added the new `BitImageProcessorFast` class, which includes methods for preprocessing images with various settings and parameters. ### Integration Updates: * [`src/transformers/__init__.py`](diffhunk://#diff-7723156f6b075b1bf1525f769d90a7cc0b5f233becdcbe28707aaa753960d897R1348): Updated the import structure to include `BitImageProcessorFast` in the `models.bit` module. [[1]](diffhunk://#diff-7723156f6b075b1bf1525f769d90a7cc0b5f233becdcbe28707aaa753960d897R1348) [[2]](diffhunk://#diff-7723156f6b075b1bf1525f769d90a7cc0b5f233becdcbe28707aaa753960d897R6641) * [`src/transformers/models/auto/image_processing_auto.py`](diffhunk://#diff-dc5c050927ed279b77bac41443778a1155c1cd7825aae50412d70b65bb96397fL61-R61): Added `BitImageProcessorFast` to the list of image processors for the `bit` model. [[1]](diffhunk://#diff-dc5c050927ed279b77bac41443778a1155c1cd7825aae50412d70b65bb96397fL61-R61) [[2]](diffhunk://#diff-dc5c050927ed279b77bac41443778a1155c1cd7825aae50412d70b65bb96397fL81-R95) * [`src/transformers/models/bit/__init__.py`](diffhunk://#diff-2cff440fc284a759da756f46aeb80c4d86f34690d2ae5e67878f2fe06e3b4382R23): Included `BitImageProcessorFast` in the type checking imports. ### Documentation Updates: * `docs/source/en/model_doc/bit.md` and `docs/source/ja/model_doc/bit.md`: Added documentation entries for the new `BitImageProcessorFast` class. 
[[1]](diffhunk://#diff-baa34b2511c05dceaf27e94a1b42a3cb65155958207eaf23997bc46aa6299e97R61-R65) [[2]](diffhunk://#diff-ca7bacda9902808adc6d6c009db15c3df79465cf380a77dc4ded7820b87dc09bR57-R61) ### Testing: * [`tests/models/bit/test_image_processing_bit.py`](diffhunk://#diff-43451340d4a1ee4fe3d69a2555b3a9d65dece62f135eb59e4bca16e16a9f8c69R1-R104): Added unit tests for `BitImageProcessorFast` to ensure proper functionality and integration. ### Dummy Objects: * [`src/transformers/utils/dummy_torchvision_objects.py`](diffhunk://#diff-628ebb72d86e41f003144eae7255f5f0e9e60bbf5cb742b5b5447a0360cf1968R12-R18): Added a dummy class for `BitImageProcessorFast` to handle cases where `torchvision` is not available. ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests?
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37180/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37180/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37179
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37179/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37179/comments
https://api.github.com/repos/huggingface/transformers/issues/37179/events
https://github.com/huggingface/transformers/pull/37179
2,963,973,832
PR_kwDOCUB6oc6Q9lCh
37,179
[doc] Fix link for Quark quantization page
{ "login": "BowenBao", "id": 9376104, "node_id": "MDQ6VXNlcjkzNzYxMDQ=", "avatar_url": "https://avatars.githubusercontent.com/u/9376104?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BowenBao", "html_url": "https://github.com/BowenBao", "followers_url": "https://api.github.com/users/BowenBao/followers", "following_url": "https://api.github.com/users/BowenBao/following{/other_user}", "gists_url": "https://api.github.com/users/BowenBao/gists{/gist_id}", "starred_url": "https://api.github.com/users/BowenBao/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BowenBao/subscriptions", "organizations_url": "https://api.github.com/users/BowenBao/orgs", "repos_url": "https://api.github.com/users/BowenBao/repos", "events_url": "https://api.github.com/users/BowenBao/events{/privacy}", "received_events_url": "https://api.github.com/users/BowenBao/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T16:37:20
2025-04-01T18:57:39
2025-04-01T18:57:39
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37179", "html_url": "https://github.com/huggingface/transformers/pull/37179", "diff_url": "https://github.com/huggingface/transformers/pull/37179.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37179.patch", "merged_at": "2025-04-01T18:57:38" }
# What does this PR do? Fixes the link to the Quark page in the quantization docs; the previous link returned a 404. ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? cc @SunMarc @MekkCyber for review
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37179/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37179/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37178
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37178/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37178/comments
https://api.github.com/repos/huggingface/transformers/issues/37178/events
https://github.com/huggingface/transformers/pull/37178
2,963,875,579
PR_kwDOCUB6oc6Q9Pdi
37,178
Revert #37031
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T15:53:35
2025-04-05T12:09:44
2025-04-01T17:48:16
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37178", "html_url": "https://github.com/huggingface/transformers/pull/37178", "diff_url": "https://github.com/huggingface/transformers/pull/37178.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37178.patch", "merged_at": "2025-04-01T17:48:16" }
# What does this PR do? https://github.com/huggingface/transformers/pull/37031 caused much longer loading times (see https://github.com/huggingface/transformers/issues/37160), going from ~3-4 min to 10+ hours for Deepseek R1. I'm fully reverting for now. I think (though not completely sure) the issue is actually accessing the state dict over and over again, not so much the cloning (copying). I will investigate later to see if I can re-add the fix without the time overhead.
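The suspected overhead pattern (rebuilding a state-dict-like mapping once per parameter instead of fetching it once) can be sketched in plain Python. This is purely illustrative, assuming the hypothesis above is right; the names below are made up and not the actual transformers internals.

```python
# Toy sketch: rebuilding the mapping once per lookup scales quadratically
# with the number of parameters; fetching it once keeps the work linear.

def make_state_dict(params, counter):
    # Stands in for something like model.state_dict(): each call walks
    # every parameter to build a fresh mapping.
    counter["calls"] += 1
    return {name: value for name, value in params}

def load_slow(params):
    # Anti-pattern: one full rebuild per parameter lookup.
    counter = {"calls": 0}
    for name, _ in params:
        sd = make_state_dict(params, counter)
        _ = sd[name]
    return counter["calls"]

def load_fast(params):
    # Fetch the mapping once up front, then do cheap lookups.
    counter = {"calls": 0}
    sd = make_state_dict(params, counter)
    for name, _ in params:
        _ = sd[name]
    return counter["calls"]

params = [(f"layer.{i}.weight", i) for i in range(100)]
print(load_slow(params))  # 100 rebuilds
print(load_fast(params))  # 1 rebuild
```

With hundreds of billions of parameters (Deepseek R1 scale), the difference between the two shapes is exactly the gap between minutes and hours.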
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37178/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37178/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37177
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37177/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37177/comments
https://api.github.com/repos/huggingface/transformers/issues/37177/events
https://github.com/huggingface/transformers/pull/37177
2,963,866,050
PR_kwDOCUB6oc6Q9NZB
37,177
trying custom tokenizer fix
{ "login": "itazap", "id": 31893021, "node_id": "MDQ6VXNlcjMxODkzMDIx", "avatar_url": "https://avatars.githubusercontent.com/u/31893021?v=4", "gravatar_id": "", "url": "https://api.github.com/users/itazap", "html_url": "https://github.com/itazap", "followers_url": "https://api.github.com/users/itazap/followers", "following_url": "https://api.github.com/users/itazap/following{/other_user}", "gists_url": "https://api.github.com/users/itazap/gists{/gist_id}", "starred_url": "https://api.github.com/users/itazap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/itazap/subscriptions", "organizations_url": "https://api.github.com/users/itazap/orgs", "repos_url": "https://api.github.com/users/itazap/repos", "events_url": "https://api.github.com/users/itazap/events{/privacy}", "received_events_url": "https://api.github.com/users/itazap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-04-01T15:49:43
2025-07-16T16:01:55
null
COLLABORATOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37177", "html_url": "https://github.com/huggingface/transformers/pull/37177", "diff_url": "https://github.com/huggingface/transformers/pull/37177.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37177.patch", "merged_at": null }
DRAFT fixes #35597
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37177/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37177/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/37176
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37176/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37176/comments
https://api.github.com/repos/huggingface/transformers/issues/37176/events
https://github.com/huggingface/transformers/pull/37176
2,963,857,321
PR_kwDOCUB6oc6Q9Lf3
37,176
Add Fast Image Processor for Perceiver
{ "login": "rootonchair", "id": 23548268, "node_id": "MDQ6VXNlcjIzNTQ4MjY4", "avatar_url": "https://avatars.githubusercontent.com/u/23548268?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rootonchair", "html_url": "https://github.com/rootonchair", "followers_url": "https://api.github.com/users/rootonchair/followers", "following_url": "https://api.github.com/users/rootonchair/following{/other_user}", "gists_url": "https://api.github.com/users/rootonchair/gists{/gist_id}", "starred_url": "https://api.github.com/users/rootonchair/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rootonchair/subscriptions", "organizations_url": "https://api.github.com/users/rootonchair/orgs", "repos_url": "https://api.github.com/users/rootonchair/repos", "events_url": "https://api.github.com/users/rootonchair/events{/privacy}", "received_events_url": "https://api.github.com/users/rootonchair/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T15:46:21
2025-04-15T18:03:32
2025-04-14T11:49:13
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37176", "html_url": "https://github.com/huggingface/transformers/pull/37176", "diff_url": "https://github.com/huggingface/transformers/pull/37176.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37176.patch", "merged_at": "2025-04-14T11:49:13" }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Related #36978 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface and @SunMarc - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37176/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37176/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37175
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37175/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37175/comments
https://api.github.com/repos/huggingface/transformers/issues/37175/events
https://github.com/huggingface/transformers/pull/37175
2,963,758,466
PR_kwDOCUB6oc6Q81s0
37,175
[Tests] add `min_new_tokens` to prevent flaky length checks
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/followers", "following_url": "https://api.github.com/users/gante/following{/other_user}", "gists_url": "https://api.github.com/users/gante/gists{/gist_id}", "starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gante/subscriptions", "organizations_url": "https://api.github.com/users/gante/orgs", "repos_url": "https://api.github.com/users/gante/repos", "events_url": "https://api.github.com/users/gante/events{/privacy}", "received_events_url": "https://api.github.com/users/gante/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T15:12:08
2025-04-02T14:24:03
2025-04-02T14:24:00
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37175", "html_url": "https://github.com/huggingface/transformers/pull/37175", "diff_url": "https://github.com/huggingface/transformers/pull/37175.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37175.patch", "merged_at": "2025-04-02T14:24:00" }
# What does this PR do? Sets `min_new_tokens` to the same value as `max_new_tokens` in a few checks to prevent failures in length-related checks. These checks expect that we generate exactly `max_new_tokens` tokens, but we might stop earlier due to stopping criteria that are often on by default (e.g. the eos token). E.g. <img width="1377" alt="Screenshot 2025-04-01 at 16 11 47" src="https://github.com/user-attachments/assets/7c1877b5-668f-41f6-bcb3-93101c58094d" /> this one is no longer flaky :) (`py.test tests/models/pix2struct/test_modeling_pix2struct.py::Pix2StructModelTest::test_greedy_generate_dict_outputs_use_cache --flake-finder --flake-runs 1000`, previously had a ~3% failure rate)
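The idea behind the fix can be shown with a toy decode loop (not the real `generate()`; real `min_new_tokens` works by suppressing the eos logit, but the length effect is the same):

```python
# With an eos-based stopping criterion, generation may end before
# max_new_tokens; forcing min_new_tokens == max_new_tokens makes the
# output length deterministic, which is what the length checks assume.
EOS = 0

def toy_generate(next_tokens, max_new_tokens, min_new_tokens=0):
    out = []
    for step, tok in enumerate(next_tokens[:max_new_tokens]):
        if tok == EOS and step >= min_new_tokens:
            break  # stopping criterion fires early
        out.append(tok)
    return out

stream = [5, 7, EOS, 9, 4]
# Without a minimum, eos ends generation after 2 tokens, not 5:
print(len(toy_generate(stream, max_new_tokens=5)))                    # 2
# Pinning min == max guarantees exactly max_new_tokens tokens:
print(len(toy_generate(stream, max_new_tokens=5, min_new_tokens=5)))  # 5
```

That nondeterminism is precisely the ~3% flake rate: a test fails whenever the model happens to emit eos before the expected length.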
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/followers", "following_url": "https://api.github.com/users/gante/following{/other_user}", "gists_url": "https://api.github.com/users/gante/gists{/gist_id}", "starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gante/subscriptions", "organizations_url": "https://api.github.com/users/gante/orgs", "repos_url": "https://api.github.com/users/gante/repos", "events_url": "https://api.github.com/users/gante/events{/privacy}", "received_events_url": "https://api.github.com/users/gante/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37175/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37175/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37174
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37174/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37174/comments
https://api.github.com/repos/huggingface/transformers/issues/37174/events
https://github.com/huggingface/transformers/pull/37174
2,963,673,100
PR_kwDOCUB6oc6Q8jCJ
37,174
add image processor for table transformer, and fix value inconsistency
{ "login": "zhouksh", "id": 3754366, "node_id": "MDQ6VXNlcjM3NTQzNjY=", "avatar_url": "https://avatars.githubusercontent.com/u/3754366?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zhouksh", "html_url": "https://github.com/zhouksh", "followers_url": "https://api.github.com/users/zhouksh/followers", "following_url": "https://api.github.com/users/zhouksh/following{/other_user}", "gists_url": "https://api.github.com/users/zhouksh/gists{/gist_id}", "starred_url": "https://api.github.com/users/zhouksh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zhouksh/subscriptions", "organizations_url": "https://api.github.com/users/zhouksh/orgs", "repos_url": "https://api.github.com/users/zhouksh/repos", "events_url": "https://api.github.com/users/zhouksh/events{/privacy}", "received_events_url": "https://api.github.com/users/zhouksh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T14:41:06
2025-04-01T15:57:34
2025-04-01T15:57:34
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37174", "html_url": "https://github.com/huggingface/transformers/pull/37174", "diff_url": "https://github.com/huggingface/transformers/pull/37174.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37174.patch", "merged_at": null }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes #30718 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
@NielsRogge
{ "login": "zhouksh", "id": 3754366, "node_id": "MDQ6VXNlcjM3NTQzNjY=", "avatar_url": "https://avatars.githubusercontent.com/u/3754366?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zhouksh", "html_url": "https://github.com/zhouksh", "followers_url": "https://api.github.com/users/zhouksh/followers", "following_url": "https://api.github.com/users/zhouksh/following{/other_user}", "gists_url": "https://api.github.com/users/zhouksh/gists{/gist_id}", "starred_url": "https://api.github.com/users/zhouksh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zhouksh/subscriptions", "organizations_url": "https://api.github.com/users/zhouksh/orgs", "repos_url": "https://api.github.com/users/zhouksh/repos", "events_url": "https://api.github.com/users/zhouksh/events{/privacy}", "received_events_url": "https://api.github.com/users/zhouksh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37174/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37174/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37173
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37173/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37173/comments
https://api.github.com/repos/huggingface/transformers/issues/37173/events
https://github.com/huggingface/transformers/pull/37173
2,963,670,298
PR_kwDOCUB6oc6Q8ibd
37,173
[core] remove `GenerationMixin` inheritance by default in `PreTrainedModel`
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/followers", "following_url": "https://api.github.com/users/gante/following{/other_user}", "gists_url": "https://api.github.com/users/gante/gists{/gist_id}", "starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gante/subscriptions", "organizations_url": "https://api.github.com/users/gante/orgs", "repos_url": "https://api.github.com/users/gante/repos", "events_url": "https://api.github.com/users/gante/events{/privacy}", "received_events_url": "https://api.github.com/users/gante/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T14:40:05
2025-04-08T15:42:09
2025-04-08T15:42:05
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37173", "html_url": "https://github.com/huggingface/transformers/pull/37173", "diff_url": "https://github.com/huggingface/transformers/pull/37173.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37173.patch", "merged_at": "2025-04-08T15:42:05" }
# What does this PR do? In v4.45, we set in motion the removal of `GenerationMixin` inheritance by default in `PreTrainedModel`, our base model class. You can recap the full list of reasons in the [original PR](https://github.com/huggingface/transformers/pull/33203), but TL;DR: it removes circular dependencies and makes non-generative models more lightweight. This PR is the final step: it removes the `GenerationMixin` inheritance. Note that this change is NOT breaking in most contexts: ✅ Loading `generate`-capable `transformers` models (https://github.com/huggingface/transformers/pull/33203 added direct inheritance, https://github.com/huggingface/transformers/issues/36180 added a meta test to ensure we always add `generate` tests to `generate`-capable models) ✅ Loading non `generate`-capable models ✅ Loading `generate`-capable Hub models with `AutoModelXXX` (https://github.com/huggingface/transformers/pull/33203 added the logic to ensure we add `GenerationMixin` even if the original model is missing it, as well as corresponding tests) ❌ Loading `generate`-capable Hub models directly, i.e. not using `AutoModelXXX`, if and only if the model doesn't inherit from `GenerationMixin` (= old implementation). In this case, an informative warning is thrown, suggesting to load the model with `AutoModelXXX`. This warning has been present since v4.45. Relevant tests: - `py.test tests/models -k test_generation_tester_mixin_inheritance` - `py.test tests/models/auto/ -k test_custom_model_patched_generation_inheritance` - `py.test tests/utils/test_modeling_utils.py::ModelUtilsTest::test_can_generate` __________________________ After merging, let's keep an eye on issues. Although I think I've got the only breaking case well documented, Hub code is always a wildcard.
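The inheritance rule described above can be sketched with toy classes (illustrative only, not the real transformers API or its actual `can_generate` implementation): after this change, a model exposes generation only if it explicitly inherits the mixin.

```python
# Toy sketch of the post-change contract: PreTrainedModel no longer
# carries generation code, so generate() exists only where the mixin
# was explicitly added to the class bases.

class GenerationMixin:
    def generate(self):
        return "generated"

class PreTrainedModel:
    def can_generate(self):
        # Roughly the question the library's check answers:
        # is the mixin anywhere in this class's MRO?
        return isinstance(self, GenerationMixin)

class EncoderOnlyModel(PreTrainedModel):
    pass  # no mixin: stays lightweight, no generate()

class CausalLMModel(PreTrainedModel, GenerationMixin):
    pass  # explicit inheritance keeps generate() working

print(EncoderOnlyModel().can_generate())  # False
print(CausalLMModel().can_generate())     # True
```

This is also why the one breaking case is Hub models loaded without `AutoModelXXX`: the auto classes can patch the mixin in at load time, but a directly instantiated old-style class has no one to add it for them.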
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/followers", "following_url": "https://api.github.com/users/gante/following{/other_user}", "gists_url": "https://api.github.com/users/gante/gists{/gist_id}", "starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gante/subscriptions", "organizations_url": "https://api.github.com/users/gante/orgs", "repos_url": "https://api.github.com/users/gante/repos", "events_url": "https://api.github.com/users/gante/events{/privacy}", "received_events_url": "https://api.github.com/users/gante/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37173/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37173/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37172
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37172/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37172/comments
https://api.github.com/repos/huggingface/transformers/issues/37172/events
https://github.com/huggingface/transformers/issues/37172
2,963,619,408
I_kwDOCUB6oc6wpT5Q
37,172
Since 4.50.0, saving and loading a Whisper model causes an error
{ "login": "bruno-hays", "id": 48770768, "node_id": "MDQ6VXNlcjQ4NzcwNzY4", "avatar_url": "https://avatars.githubusercontent.com/u/48770768?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bruno-hays", "html_url": "https://github.com/bruno-hays", "followers_url": "https://api.github.com/users/bruno-hays/followers", "following_url": "https://api.github.com/users/bruno-hays/following{/other_user}", "gists_url": "https://api.github.com/users/bruno-hays/gists{/gist_id}", "starred_url": "https://api.github.com/users/bruno-hays/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bruno-hays/subscriptions", "organizations_url": "https://api.github.com/users/bruno-hays/orgs", "repos_url": "https://api.github.com/users/bruno-hays/repos", "events_url": "https://api.github.com/users/bruno-hays/events{/privacy}", "received_events_url": "https://api.github.com/users/bruno-hays/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-04-01T14:21:56
2025-05-22T09:16:39
2025-05-22T09:16:39
CONTRIBUTOR
null
null
null
null
### System Info - `transformers` version: 4.50.3 - Platform: macOS-14.6.1-arm64-arm-64bit - Python version: 3.12.9 ### Who can help? @eustlb @gante ### Information - [x] The official example scripts - [ ] My own modified scripts ### Tasks - [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction ```python import soundfile as sf from transformers import WhisperForConditionalGeneration, WhisperProcessor model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny") # model.generation_config.input_ids = model.generation_config.forced_decoder_ids # model.generation_config.forced_decoder_ids = None processor = WhisperProcessor.from_pretrained("openai/whisper-tiny") model.save_pretrained("saved_tiny") model = WhisperForConditionalGeneration.from_pretrained("saved_tiny") audio, sr = sf.read("test.wav") input_features = processor(audio, sampling_rate=sr, return_tensors="pt").input_features model.generate(input_features=input_features) ``` Raises the following error: ``` ValueError: You have explicitly specified `forced_decoder_ids`. Please remove the `forced_decoder_ids` argument in favour of `input_ids` or `decoder_input_ids` respectively. ``` Un-commenting the two lines fixes the error, but I think this should be done by default. ### Expected behavior After saving and loading a model, we should end up with the same behaviour. When saving the model with transformers >4.50.0, the `forced_decoder_ids` field should be replaced by `input_ids`, following the deprecation of said parameter. I think I can provide an MR if needed
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/followers", "following_url": "https://api.github.com/users/gante/following{/other_user}", "gists_url": "https://api.github.com/users/gante/gists{/gist_id}", "starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gante/subscriptions", "organizations_url": "https://api.github.com/users/gante/orgs", "repos_url": "https://api.github.com/users/gante/repos", "events_url": "https://api.github.com/users/gante/events{/privacy}", "received_events_url": "https://api.github.com/users/gante/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37172/reactions", "total_count": 11, "+1": 11, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37172/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37171
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37171/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37171/comments
https://api.github.com/repos/huggingface/transformers/issues/37171/events
https://github.com/huggingface/transformers/issues/37171
2,963,455,732
I_kwDOCUB6oc6wor70
37,171
Add EoMT
{ "login": "NielsRogge", "id": 48327001, "node_id": "MDQ6VXNlcjQ4MzI3MDAx", "avatar_url": "https://avatars.githubusercontent.com/u/48327001?v=4", "gravatar_id": "", "url": "https://api.github.com/users/NielsRogge", "html_url": "https://github.com/NielsRogge", "followers_url": "https://api.github.com/users/NielsRogge/followers", "following_url": "https://api.github.com/users/NielsRogge/following{/other_user}", "gists_url": "https://api.github.com/users/NielsRogge/gists{/gist_id}", "starred_url": "https://api.github.com/users/NielsRogge/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/NielsRogge/subscriptions", "organizations_url": "https://api.github.com/users/NielsRogge/orgs", "repos_url": "https://api.github.com/users/NielsRogge/repos", "events_url": "https://api.github.com/users/NielsRogge/events{/privacy}", "received_events_url": "https://api.github.com/users/NielsRogge/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 1843244711, "node_id": "MDU6TGFiZWwxODQzMjQ0NzEx", "url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model", "name": "New model", "color": "fbca04", "default": false, "description": "" }, { "id": 2392046359, "node_id": "MDU6TGFiZWwyMzkyMDQ2MzU5", "url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20Second%20Issue", "name": "Good Second Issue", "color": "dd935a", "default": false, "description": "Issues that are more difficult to do than \"Good First\" issues - give it a try if you want!" } ]
closed
false
null
[]
null
[]
2025-04-01T13:27:16
2025-06-27T12:18:20
2025-06-27T12:18:20
CONTRIBUTOR
null
null
null
null
### Model description It'd be great to make the EoMT model available in the Transformers library, because it does image segmentation (instance, semantic and panoptic) in a much simpler way than [Mask2Former](https://huggingface.co/docs/transformers/main/en/model_doc/maskformer) and [OneFormer](https://huggingface.co/docs/transformers/en/model_doc/oneformer). The latter consist of multiple building blocks (vision backbone, pixel decoder, transformer decoder), making them overall pretty complex and hard to implement. Simplicity allows the optimizations already done for ViT/BERT to be applied to segmentation as well. Initial thread: https://github.com/tue-mps/eomt/issues/1 ### Open source status - [x] The model implementation is available - [x] The model weights are available ### Provide useful links for the implementation Paper: https://huggingface.co/papers/2503.19108 Code: https://github.com/tue-mps/eomt
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37171/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37171/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37170
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37170/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37170/comments
https://api.github.com/repos/huggingface/transformers/issues/37170/events
https://github.com/huggingface/transformers/pull/37170
2,963,357,757
PR_kwDOCUB6oc6Q7doJ
37,170
Avoid pipeline test failing related to Hub call
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T12:52:16
2025-04-01T16:22:46
2025-04-01T16:22:45
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37170", "html_url": "https://github.com/huggingface/transformers/pull/37170", "diff_url": "https://github.com/huggingface/transformers/pull/37170.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37170.patch", "merged_at": "2025-04-01T16:22:45" }
# What does this PR do? Try to avoid pipeline test failing by (potentially) reducing the number of calls to hub.
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37170/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37170/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37169
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37169/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37169/comments
https://api.github.com/repos/huggingface/transformers/issues/37169/events
https://github.com/huggingface/transformers/pull/37169
2,963,195,643
PR_kwDOCUB6oc6Q66Ia
37,169
Add Swin2SR ImageProcessorFast
{ "login": "thisisiron", "id": 23303033, "node_id": "MDQ6VXNlcjIzMzAzMDMz", "avatar_url": "https://avatars.githubusercontent.com/u/23303033?v=4", "gravatar_id": "", "url": "https://api.github.com/users/thisisiron", "html_url": "https://github.com/thisisiron", "followers_url": "https://api.github.com/users/thisisiron/followers", "following_url": "https://api.github.com/users/thisisiron/following{/other_user}", "gists_url": "https://api.github.com/users/thisisiron/gists{/gist_id}", "starred_url": "https://api.github.com/users/thisisiron/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/thisisiron/subscriptions", "organizations_url": "https://api.github.com/users/thisisiron/orgs", "repos_url": "https://api.github.com/users/thisisiron/repos", "events_url": "https://api.github.com/users/thisisiron/events{/privacy}", "received_events_url": "https://api.github.com/users/thisisiron/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T11:46:15
2025-07-23T17:40:19
2025-05-07T16:20:16
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37169", "html_url": "https://github.com/huggingface/transformers/pull/37169", "diff_url": "https://github.com/huggingface/transformers/pull/37169.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37169.patch", "merged_at": "2025-05-07T16:20:16" }
# What does this PR do? #36978 Add fast image processor for Swin2SR cc @yonigozlan
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37169/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37169/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37168
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37168/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37168/comments
https://api.github.com/repos/huggingface/transformers/issues/37168/events
https://github.com/huggingface/transformers/pull/37168
2,963,088,429
PR_kwDOCUB6oc6Q6ie6
37,168
Idefics2 Fast Image processor
{ "login": "sushmanthreddy", "id": 73489688, "node_id": "MDQ6VXNlcjczNDg5Njg4", "avatar_url": "https://avatars.githubusercontent.com/u/73489688?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sushmanthreddy", "html_url": "https://github.com/sushmanthreddy", "followers_url": "https://api.github.com/users/sushmanthreddy/followers", "following_url": "https://api.github.com/users/sushmanthreddy/following{/other_user}", "gists_url": "https://api.github.com/users/sushmanthreddy/gists{/gist_id}", "starred_url": "https://api.github.com/users/sushmanthreddy/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sushmanthreddy/subscriptions", "organizations_url": "https://api.github.com/users/sushmanthreddy/orgs", "repos_url": "https://api.github.com/users/sushmanthreddy/repos", "events_url": "https://api.github.com/users/sushmanthreddy/events{/privacy}", "received_events_url": "https://api.github.com/users/sushmanthreddy/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-04-01T10:59:06
2025-04-07T20:12:24
null
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37168", "html_url": "https://github.com/huggingface/transformers/pull/37168", "diff_url": "https://github.com/huggingface/transformers/pull/37168.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37168.patch", "merged_at": null }
close #36978
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37168/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37168/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/37167
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37167/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37167/comments
https://api.github.com/repos/huggingface/transformers/issues/37167/events
https://github.com/huggingface/transformers/pull/37167
2,963,079,056
PR_kwDOCUB6oc6Q6gW7
37,167
Add performance-optimized version of to_py_obj that avoids redundant …
{ "login": "sniverty", "id": 10280516, "node_id": "MDQ6VXNlcjEwMjgwNTE2", "avatar_url": "https://avatars.githubusercontent.com/u/10280516?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sniverty", "html_url": "https://github.com/sniverty", "followers_url": "https://api.github.com/users/sniverty/followers", "following_url": "https://api.github.com/users/sniverty/following{/other_user}", "gists_url": "https://api.github.com/users/sniverty/gists{/gist_id}", "starred_url": "https://api.github.com/users/sniverty/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sniverty/subscriptions", "organizations_url": "https://api.github.com/users/sniverty/orgs", "repos_url": "https://api.github.com/users/sniverty/repos", "events_url": "https://api.github.com/users/sniverty/events{/privacy}", "received_events_url": "https://api.github.com/users/sniverty/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T10:55:49
2025-04-07T12:52:29
2025-04-07T12:52:04
NONE
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37167", "html_url": "https://github.com/huggingface/transformers/pull/37167", "diff_url": "https://github.com/huggingface/transformers/pull/37167.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37167.patch", "merged_at": null }
…checks and minimizes data transfers # What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface and @SunMarc - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "sniverty", "id": 10280516, "node_id": "MDQ6VXNlcjEwMjgwNTE2", "avatar_url": "https://avatars.githubusercontent.com/u/10280516?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sniverty", "html_url": "https://github.com/sniverty", "followers_url": "https://api.github.com/users/sniverty/followers", "following_url": "https://api.github.com/users/sniverty/following{/other_user}", "gists_url": "https://api.github.com/users/sniverty/gists{/gist_id}", "starred_url": "https://api.github.com/users/sniverty/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sniverty/subscriptions", "organizations_url": "https://api.github.com/users/sniverty/orgs", "repos_url": "https://api.github.com/users/sniverty/repos", "events_url": "https://api.github.com/users/sniverty/events{/privacy}", "received_events_url": "https://api.github.com/users/sniverty/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37167/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37167/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37166
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37166/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37166/comments
https://api.github.com/repos/huggingface/transformers/issues/37166/events
https://github.com/huggingface/transformers/pull/37166
2,962,911,740
PR_kwDOCUB6oc6Q57fg
37,166
Hugging Face Hub pin to v0.30.0 for Xet
{ "login": "LysandreJik", "id": 30755778, "node_id": "MDQ6VXNlcjMwNzU1Nzc4", "avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4", "gravatar_id": "", "url": "https://api.github.com/users/LysandreJik", "html_url": "https://github.com/LysandreJik", "followers_url": "https://api.github.com/users/LysandreJik/followers", "following_url": "https://api.github.com/users/LysandreJik/following{/other_user}", "gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}", "starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions", "organizations_url": "https://api.github.com/users/LysandreJik/orgs", "repos_url": "https://api.github.com/users/LysandreJik/repos", "events_url": "https://api.github.com/users/LysandreJik/events{/privacy}", "received_events_url": "https://api.github.com/users/LysandreJik/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T09:48:54
2025-04-04T12:58:24
2025-04-04T12:58:22
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37166", "html_url": "https://github.com/huggingface/transformers/pull/37166", "diff_url": "https://github.com/huggingface/transformers/pull/37166.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37166.patch", "merged_at": "2025-04-04T12:58:22" }
Pins `huggingface_hub` to > 0.30.0 so that Xet downloads are faster
{ "login": "LysandreJik", "id": 30755778, "node_id": "MDQ6VXNlcjMwNzU1Nzc4", "avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4", "gravatar_id": "", "url": "https://api.github.com/users/LysandreJik", "html_url": "https://github.com/LysandreJik", "followers_url": "https://api.github.com/users/LysandreJik/followers", "following_url": "https://api.github.com/users/LysandreJik/following{/other_user}", "gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}", "starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions", "organizations_url": "https://api.github.com/users/LysandreJik/orgs", "repos_url": "https://api.github.com/users/LysandreJik/repos", "events_url": "https://api.github.com/users/LysandreJik/events{/privacy}", "received_events_url": "https://api.github.com/users/LysandreJik/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37166/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37166/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37165
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37165/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37165/comments
https://api.github.com/repos/huggingface/transformers/issues/37165/events
https://github.com/huggingface/transformers/issues/37165
2,962,667,045
I_kwDOCUB6oc6wlrYl
37,165
Add QLIP into transformers
{ "login": "lavinal712", "id": 98888959, "node_id": "U_kgDOBeTs_w", "avatar_url": "https://avatars.githubusercontent.com/u/98888959?v=4", "gravatar_id": "", "url": "https://api.github.com/users/lavinal712", "html_url": "https://github.com/lavinal712", "followers_url": "https://api.github.com/users/lavinal712/followers", "following_url": "https://api.github.com/users/lavinal712/following{/other_user}", "gists_url": "https://api.github.com/users/lavinal712/gists{/gist_id}", "starred_url": "https://api.github.com/users/lavinal712/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/lavinal712/subscriptions", "organizations_url": "https://api.github.com/users/lavinal712/orgs", "repos_url": "https://api.github.com/users/lavinal712/repos", "events_url": "https://api.github.com/users/lavinal712/events{/privacy}", "received_events_url": "https://api.github.com/users/lavinal712/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 1843244711, "node_id": "MDU6TGFiZWwxODQzMjQ0NzEx", "url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model", "name": "New model", "color": "fbca04", "default": false, "description": "" } ]
open
false
null
[]
null
[]
2025-04-01T08:16:20
2025-04-01T16:06:47
null
NONE
null
null
null
null
### Model description Introduction: We introduce Quantized Language-Image Pretraining (QLIP), a visual tokenization method that combines state-of-the-art reconstruction quality with state-of-the-art zero-shot image understanding. QLIP trains a binary-spherical-quantization-based autoencoder with reconstruction and language-image alignment objectives. We are the first to show that the two objectives do not need to be at odds. We balance the two loss terms dynamically during training and show that a two-stage training pipeline effectively mixes the large-batch requirements of image-language pre-training with the memory bottleneck imposed by the reconstruction objective. We validate the effectiveness of QLIP for multimodal understanding and text-conditioned image generation with a single model. Specifically, QLIP serves as a drop-in replacement for the visual encoder for LLaVA and the image tokenizer for LlamaGen with comparable or even better performance. Finally, we demonstrate that QLIP enables a unified mixed-modality auto-regressive model for understanding and generation. ### Open source status - [x] The model implementation is available - [x] The model weights are available ### Provide useful links for the implementation Paper: https://arxiv.org/abs/2502.05178 Code: https://github.com/NVlabs/QLIP
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37165/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/huggingface/transformers/issues/37165/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/37164
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37164/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37164/comments
https://api.github.com/repos/huggingface/transformers/issues/37164/events
https://github.com/huggingface/transformers/pull/37164
2,962,415,389
PR_kwDOCUB6oc6Q4PUU
37,164
Add Fast owlvit Processor
{ "login": "keetrap", "id": 103131112, "node_id": "U_kgDOBiWn6A", "avatar_url": "https://avatars.githubusercontent.com/u/103131112?v=4", "gravatar_id": "", "url": "https://api.github.com/users/keetrap", "html_url": "https://github.com/keetrap", "followers_url": "https://api.github.com/users/keetrap/followers", "following_url": "https://api.github.com/users/keetrap/following{/other_user}", "gists_url": "https://api.github.com/users/keetrap/gists{/gist_id}", "starred_url": "https://api.github.com/users/keetrap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/keetrap/subscriptions", "organizations_url": "https://api.github.com/users/keetrap/orgs", "repos_url": "https://api.github.com/users/keetrap/repos", "events_url": "https://api.github.com/users/keetrap/events{/privacy}", "received_events_url": "https://api.github.com/users/keetrap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T06:37:05
2025-04-14T15:58:09
2025-04-14T15:58:09
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37164", "html_url": "https://github.com/huggingface/transformers/pull/37164", "diff_url": "https://github.com/huggingface/transformers/pull/37164.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37164.patch", "merged_at": "2025-04-14T15:58:09" }
related #36978 cc @yonigozlan
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37164/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37164/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37163
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37163/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37163/comments
https://api.github.com/repos/huggingface/transformers/issues/37163/events
https://github.com/huggingface/transformers/pull/37163
2,962,338,205
PR_kwDOCUB6oc6Q395f
37,163
Add Optional to types
{ "login": "cyyever", "id": 17618148, "node_id": "MDQ6VXNlcjE3NjE4MTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cyyever", "html_url": "https://github.com/cyyever", "followers_url": "https://api.github.com/users/cyyever/followers", "following_url": "https://api.github.com/users/cyyever/following{/other_user}", "gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}", "starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cyyever/subscriptions", "organizations_url": "https://api.github.com/users/cyyever/orgs", "repos_url": "https://api.github.com/users/cyyever/repos", "events_url": "https://api.github.com/users/cyyever/events{/privacy}", "received_events_url": "https://api.github.com/users/cyyever/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T06:08:06
2025-04-04T04:05:23
2025-04-03T15:38:01
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37163", "html_url": "https://github.com/huggingface/transformers/pull/37163", "diff_url": "https://github.com/huggingface/transformers/pull/37163.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37163.patch", "merged_at": "2025-04-03T15:38:01" }
# What does this PR do? Add Optional to more types so that we can enable more RUFF rules on type checking.
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37163/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37163/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37162
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37162/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37162/comments
https://api.github.com/repos/huggingface/transformers/issues/37162/events
https://github.com/huggingface/transformers/pull/37162
2,962,141,656
PR_kwDOCUB6oc6Q3TMB
37,162
Fix load of rng state for resuming training from checkpoint
{ "login": "winglian", "id": 381258, "node_id": "MDQ6VXNlcjM4MTI1OA==", "avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4", "gravatar_id": "", "url": "https://api.github.com/users/winglian", "html_url": "https://github.com/winglian", "followers_url": "https://api.github.com/users/winglian/followers", "following_url": "https://api.github.com/users/winglian/following{/other_user}", "gists_url": "https://api.github.com/users/winglian/gists{/gist_id}", "starred_url": "https://api.github.com/users/winglian/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/winglian/subscriptions", "organizations_url": "https://api.github.com/users/winglian/orgs", "repos_url": "https://api.github.com/users/winglian/repos", "events_url": "https://api.github.com/users/winglian/events{/privacy}", "received_events_url": "https://api.github.com/users/winglian/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T03:41:36
2025-04-24T14:55:48
2025-04-24T14:55:34
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37162", "html_url": "https://github.com/huggingface/transformers/pull/37162", "diff_url": "https://github.com/huggingface/transformers/pull/37162.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37162.patch", "merged_at": "2025-04-24T14:55:34" }
# What does this PR do? Fixes a regression from https://github.com/huggingface/transformers/pull/36991 whereby resuming training from a checkpoint would fail when attempting to load the rng state. The regression only affects torch versions < 2.6.0. see https://github.com/axolotl-ai-cloud/axolotl/actions/runs/14185726834/job/39740935566 <!-- Remove if not applicable --> ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? @zach-huggingface and @SunMarc <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. 
Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface and @SunMarc - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
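The PR above restores loading of the RNG state when resuming training from a checkpoint. As a minimal sketch of the general pattern (using Python's stdlib `random` module rather than torch, purely to illustrate the idea), persisting the generator state at checkpoint time and restoring it on resume makes the random stream continue exactly as an uninterrupted run would:

```python
import random

random.seed(1234)
_ = [random.random() for _ in range(10)]  # training steps before the checkpoint

# At checkpoint time, persist the generator state alongside the model weights.
saved_state = random.getstate()
expected = [random.random() for _ in range(3)]  # what an uninterrupted run draws next

# On resume, restoring the saved state reproduces the same draws.
random.setstate(saved_state)
resumed = [random.random() for _ in range(3)]
print(resumed == expected)  # True
```

The Trainer does the analogous thing with torch, NumPy, and CUDA generator states; the failure fixed here was in deserializing that saved state on older torch versions.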
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37162/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37162/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37161
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37161/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37161/comments
https://api.github.com/repos/huggingface/transformers/issues/37161/events
https://github.com/huggingface/transformers/issues/37161
2,962,021,719
I_kwDOCUB6oc6wjN1X
37,161
Swinv2Model reports an error when using the parameter use_absolute_embeddings
{ "login": "SCP-KAKA", "id": 27934067, "node_id": "MDQ6VXNlcjI3OTM0MDY3", "avatar_url": "https://avatars.githubusercontent.com/u/27934067?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SCP-KAKA", "html_url": "https://github.com/SCP-KAKA", "followers_url": "https://api.github.com/users/SCP-KAKA/followers", "following_url": "https://api.github.com/users/SCP-KAKA/following{/other_user}", "gists_url": "https://api.github.com/users/SCP-KAKA/gists{/gist_id}", "starred_url": "https://api.github.com/users/SCP-KAKA/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SCP-KAKA/subscriptions", "organizations_url": "https://api.github.com/users/SCP-KAKA/orgs", "repos_url": "https://api.github.com/users/SCP-KAKA/repos", "events_url": "https://api.github.com/users/SCP-KAKA/events{/privacy}", "received_events_url": "https://api.github.com/users/SCP-KAKA/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T02:24:44
2025-05-09T08:03:11
2025-05-09T08:03:11
NONE
null
null
null
null
I want to use `position_embeddings` when creating the Swinv2 model, so I set `use_absolute_embeddings` to `True` in the config and fed random input data into the model, which raised an error:

```
File "/home/software/anaconda2019/envs/TrOCR/lib/python3.8/site-packages/transformers/models/swinv2/modeling_swinv2.py", line 362, in forward
    embeddings = embeddings + self.position_embeddings
RuntimeError: The size of tensor a (9216) must match the size of tensor b (9217) at non-singleton dimension 1
```

The reason is that at line 291 of `modeling_swinv2.py`, `position_embeddings` is created with dimension `num_patches + 1`, which differs from the dimension of the patch embeddings, so the `+` operation at line 362 fails. Is this a bug? If not, how should I correctly use position embeddings with the Swinv2 model? Here is my code:

```python
import torch
from transformers import Swinv2Config, Swinv2Model

swinv2_config = {
    "architectures": ["Swinv2TinyLayers3"],
    "attention_probs_dropout_prob": 0.0,
    "drop_rate": 0.0,
    "depths": [2, 2, 6],
    "drop_path_rate": 0.1,
    "embed_dim": 96,
    "encoder_stride": 32,
    "hidden_act": "gelu",
    "hidden_dropout_prob": 0.0,
    "hidden_size": 384,
    "image_size": 384,
    "initializer_range": 0.02,
    "layer_norm_eps": 1e-05,
    "mlp_ratio": 4.0,
    "model_type": "swinv2",
    "num_channels": 3,
    "num_heads": [3, 6, 12],
    "num_layers": 3,
    "patch_size": 4,
    "qkv_bias": True,
    "use_absolute_embeddings": True,
    "window_size": 16,
    "patch_norm": True,
    "use_checkpoint": False,
    "pretrained_window_sizes": [0, 0, 0],
}


class CustomSwinv2Encoder(Swinv2Model):
    def __init__(self, config):
        super().__init__(config)

    def forward(self, pixel_values):
        outputs = super().forward(pixel_values)
        return outputs.last_hidden_state


swinv2_config = Swinv2Config.from_dict(swinv2_config)
encoder = CustomSwinv2Encoder(swinv2_config)
dummy_input = torch.randn(2, 3, 384, 384)
out = encoder(dummy_input)
print(out.shape)
```
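The shape mismatch described in the report can be reproduced in isolation. As a minimal sketch (using NumPy broadcasting, which follows the same alignment rule as torch's elementwise addition, with the shapes taken from the error message above), adding a position-embedding table allocated with `num_patches + 1` rows to a batch of `num_patches` patch tokens fails:

```python
import numpy as np

# Shapes from the report: batch 2, (384 / 4) ** 2 = 9216 patch tokens, embed_dim 96
patch_embeddings = np.zeros((2, 9216, 96))

# The position table is allocated with num_patches + 1 rows (a ViT-style extra slot),
# so its second dimension is 9217 and cannot broadcast against 9216.
position_embeddings = np.zeros((1, 9217, 96))

try:
    _ = patch_embeddings + position_embeddings
except ValueError as err:
    print("broadcast failed:", err)

# With matching row counts the addition broadcasts over the batch dimension as intended.
summed = patch_embeddings + np.zeros((1, 9216, 96))
print(summed.shape)  # (2, 9216, 96)
```

The sketch only illustrates the broadcasting rule; it does not claim anything about where the `+ 1` belongs in the Swinv2 implementation.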
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers", "following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}", "gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}", "starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions", "organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs", "repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos", "events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}", "received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events", "type": "Bot", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37161/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37161/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37160
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37160/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37160/comments
https://api.github.com/repos/huggingface/transformers/issues/37160/events
https://github.com/huggingface/transformers/issues/37160
2,962,020,830
I_kwDOCUB6oc6wjNne
37,160
Loading the DeepSeek R1 model takes an extremely long time
{ "login": "Neo9061", "id": 8206465, "node_id": "MDQ6VXNlcjgyMDY0NjU=", "avatar_url": "https://avatars.githubusercontent.com/u/8206465?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Neo9061", "html_url": "https://github.com/Neo9061", "followers_url": "https://api.github.com/users/Neo9061/followers", "following_url": "https://api.github.com/users/Neo9061/following{/other_user}", "gists_url": "https://api.github.com/users/Neo9061/gists{/gist_id}", "starred_url": "https://api.github.com/users/Neo9061/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Neo9061/subscriptions", "organizations_url": "https://api.github.com/users/Neo9061/orgs", "repos_url": "https://api.github.com/users/Neo9061/repos", "events_url": "https://api.github.com/users/Neo9061/events{/privacy}", "received_events_url": "https://api.github.com/users/Neo9061/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3081136536, "node_id": "MDU6TGFiZWwzMDgxMTM2NTM2", "url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20Difficult%20Issue", "name": "Good Difficult Issue", "color": "684CC7", "default": false, "description": "" }, { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-04-01T02:24:00
2025-04-30T12:30:57
2025-04-30T12:30:55
NONE
null
null
null
null
### System Info

Following the recently merged PR and [release note](https://github.com/huggingface/transformers/releases/tag/v4.50.3-DeepSeek-3), I tried to load the DeepSeek R1 model using the code snippet below on a single P5EN instance (8x H200 GPUs).

1. The first issue is that loading takes a very long time; the estimate is ~10 hours.
2. I then modified `config.json` and `model.safetensors.index.json` to load only the first 10 layers plus the embed_token and lm_head modules, but hit the following error. The issue goes away if I use the DeepSeek [conversion script](https://huggingface.co/deepseek-ai/DeepSeek-V3/tree/main/inference) to convert the checkpoint from FP8 to BF16.

```
Some parameters are on the meta device because they were offloaded to the cpu.
The attention mask and the pad token id were not set. As a consequence, you may observe unexpected behavior. Please pass your input's `attention_mask` to obtain reliable results.
Setting `pad_token_id` to `eos_token_id`:1 for open-end generation.
The attention mask is not set and cannot be inferred from input because pad token is same as eos token. As a consequence, you may observe unexpected behavior. Please pass your input's `attention_mask` to obtain reliable results.
Traceback (most recent call last):
  File "/iofsx/sds3/models/DeepSeekV3/test.py", line 18, in <module>
    outputs = model.generate(inputs, max_new_tokens=50)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/transformers/generation/utils.py", line 2370, in generate
    result = self._sample(
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/transformers/generation/utils.py", line 3331, in _sample
    outputs = self(**model_inputs, return_dict=True)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
    return forward_call(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/accelerate/hooks.py", line 176, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/transformers/models/deepseek_v3/modeling_deepseek_v3.py", line 1025, in forward
    outputs = self.model(
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
    return forward_call(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/transformers/models/deepseek_v3/modeling_deepseek_v3.py", line 773, in forward
    layer_outputs = decoder_layer(
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
    return forward_call(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/accelerate/hooks.py", line 176, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/transformers/models/deepseek_v3/modeling_deepseek_v3.py", line 513, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
    return forward_call(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/accelerate/hooks.py", line 176, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/transformers/models/deepseek_v3/modeling_deepseek_v3.py", line 423, in forward
    q_states = self.q_b_proj(self.q_a_layernorm(self.q_a_proj(hidden_states))).view(query_shape).transpose(1, 2)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
    return forward_call(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/accelerate/hooks.py", line 176, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "/opt/conda/envs/fix/lib/python3.10/site-packages/torch/nn/modules/linear.py", line 125, in forward
    return F.linear(input, self.weight, self.bias)
RuntimeError: expected mat1 and mat2 to have the same dtype, but got: c10::BFloat16 != c10::Float8_e4m3fn
```

### Who can help?
_No response_

### Information

- [x] The official example scripts
- [ ] My own modified scripts

### Tasks

- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)

### Reproduction

Code snippet to load the model and generate text:

```python
# `run_deepseek_v1.py`
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

torch.manual_seed(30)

model_path = "MYMODEL_PATH"
tokenizer = AutoTokenizer.from_pretrained(model_path)

chat = [
    {"role": "user", "content": "Hello, how are you?"},
    {"role": "assistant", "content": "I'm doing great. How can I help you today?"},
    {"role": "user", "content": "I'd like to show off how chat templating works!"},
]

model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto", torch_dtype=torch.bfloat16)
inputs = tokenizer.apply_chat_template(chat, tokenize=True, add_generation_prompt=True, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=50)
print(tokenizer.batch_decode(outputs))
```

### Expected behavior

NA
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37160/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37160/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37159
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37159/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37159/comments
https://api.github.com/repos/huggingface/transformers/issues/37159/events
https://github.com/huggingface/transformers/pull/37159
2,961,966,204
PR_kwDOCUB6oc6Q2tYr
37,159
Add "selecting a quantization method" doc
{ "login": "DerekLiu35", "id": 91234588, "node_id": "MDQ6VXNlcjkxMjM0NTg4", "avatar_url": "https://avatars.githubusercontent.com/u/91234588?v=4", "gravatar_id": "", "url": "https://api.github.com/users/DerekLiu35", "html_url": "https://github.com/DerekLiu35", "followers_url": "https://api.github.com/users/DerekLiu35/followers", "following_url": "https://api.github.com/users/DerekLiu35/following{/other_user}", "gists_url": "https://api.github.com/users/DerekLiu35/gists{/gist_id}", "starred_url": "https://api.github.com/users/DerekLiu35/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/DerekLiu35/subscriptions", "organizations_url": "https://api.github.com/users/DerekLiu35/orgs", "repos_url": "https://api.github.com/users/DerekLiu35/repos", "events_url": "https://api.github.com/users/DerekLiu35/events{/privacy}", "received_events_url": "https://api.github.com/users/DerekLiu35/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-04-01T01:47:56
2025-04-09T13:51:37
2025-04-09T13:51:37
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37159", "html_url": "https://github.com/huggingface/transformers/pull/37159", "diff_url": "https://github.com/huggingface/transformers/pull/37159.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37159.patch", "merged_at": "2025-04-09T13:51:37" }
# What does this PR do? Modify docs to help users determine which quantization methods to use ## Who can review? @SunMarc @stevhliu
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37159/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37159/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37158
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37158/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37158/comments
https://api.github.com/repos/huggingface/transformers/issues/37158/events
https://github.com/huggingface/transformers/pull/37158
2,961,752,394
PR_kwDOCUB6oc6Q1_gZ
37,158
Add fast image processor for ZoeDepth
{ "login": "samrae7", "id": 4126146, "node_id": "MDQ6VXNlcjQxMjYxNDY=", "avatar_url": "https://avatars.githubusercontent.com/u/4126146?v=4", "gravatar_id": "", "url": "https://api.github.com/users/samrae7", "html_url": "https://github.com/samrae7", "followers_url": "https://api.github.com/users/samrae7/followers", "following_url": "https://api.github.com/users/samrae7/following{/other_user}", "gists_url": "https://api.github.com/users/samrae7/gists{/gist_id}", "starred_url": "https://api.github.com/users/samrae7/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/samrae7/subscriptions", "organizations_url": "https://api.github.com/users/samrae7/orgs", "repos_url": "https://api.github.com/users/samrae7/repos", "events_url": "https://api.github.com/users/samrae7/events{/privacy}", "received_events_url": "https://api.github.com/users/samrae7/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T22:55:22
2025-06-05T00:34:57
2025-06-05T00:34:57
CONTRIBUTOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37158", "html_url": "https://github.com/huggingface/transformers/pull/37158", "diff_url": "https://github.com/huggingface/transformers/pull/37158.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37158.patch", "merged_at": null }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Attempt to fix #36978: add a fast image processor for the ZoeDepth model (https://github.com/huggingface/transformers/issues/36978). ZoeDepth allows the parameters `keep_aspect_ratio` and `ensure_multiple_of`. In order to maintain that functionality I overrode the base `ImageProcessorFast` class, but took a wrong turn and now three tests are failing, including `test_slow_fast_equivalence`. As this was my first Hugging Face issue I'm admitting defeat and will try to find an easier model to adapt 😓 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes?
Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface and @SunMarc - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37158/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37158/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37157
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37157/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37157/comments
https://api.github.com/repos/huggingface/transformers/issues/37157/events
https://github.com/huggingface/transformers/pull/37157
2,961,675,485
PR_kwDOCUB6oc6Q1uqE
37,157
Updated model card for distilbert
{ "login": "ChathuminaVimukthi", "id": 31965817, "node_id": "MDQ6VXNlcjMxOTY1ODE3", "avatar_url": "https://avatars.githubusercontent.com/u/31965817?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ChathuminaVimukthi", "html_url": "https://github.com/ChathuminaVimukthi", "followers_url": "https://api.github.com/users/ChathuminaVimukthi/followers", "following_url": "https://api.github.com/users/ChathuminaVimukthi/following{/other_user}", "gists_url": "https://api.github.com/users/ChathuminaVimukthi/gists{/gist_id}", "starred_url": "https://api.github.com/users/ChathuminaVimukthi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ChathuminaVimukthi/subscriptions", "organizations_url": "https://api.github.com/users/ChathuminaVimukthi/orgs", "repos_url": "https://api.github.com/users/ChathuminaVimukthi/repos", "events_url": "https://api.github.com/users/ChathuminaVimukthi/events{/privacy}", "received_events_url": "https://api.github.com/users/ChathuminaVimukthi/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T22:01:41
2025-04-04T22:22:46
2025-04-04T22:22:46
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37157", "html_url": "https://github.com/huggingface/transformers/pull/37157", "diff_url": "https://github.com/huggingface/transformers/pull/37157.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37157.patch", "merged_at": "2025-04-04T22:22:46" }
# What does this PR do?

<!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. -->

<!-- Remove if not applicable -->

As suggested in this issue - https://github.com/huggingface/transformers/issues/36979#issue-2947704577 - this PR updates the documentation of the DistilBERT model, which will now be aligned with the standardized format for all the docs.

## Who can review?

@stevhliu, please let me know if any changes are needed.

<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people.
-->
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37157/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37157/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37156
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37156/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37156/comments
https://api.github.com/repos/huggingface/transformers/issues/37156/events
https://github.com/huggingface/transformers/pull/37156
2,961,474,187
PR_kwDOCUB6oc6Q1DPs
37,156
updated model card for Mistral
{ "login": "NahieliV", "id": 54726691, "node_id": "MDQ6VXNlcjU0NzI2Njkx", "avatar_url": "https://avatars.githubusercontent.com/u/54726691?v=4", "gravatar_id": "", "url": "https://api.github.com/users/NahieliV", "html_url": "https://github.com/NahieliV", "followers_url": "https://api.github.com/users/NahieliV/followers", "following_url": "https://api.github.com/users/NahieliV/following{/other_user}", "gists_url": "https://api.github.com/users/NahieliV/gists{/gist_id}", "starred_url": "https://api.github.com/users/NahieliV/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/NahieliV/subscriptions", "organizations_url": "https://api.github.com/users/NahieliV/orgs", "repos_url": "https://api.github.com/users/NahieliV/repos", "events_url": "https://api.github.com/users/NahieliV/events{/privacy}", "received_events_url": "https://api.github.com/users/NahieliV/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T20:11:53
2025-04-07T17:05:37
2025-04-07T17:05:37
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37156", "html_url": "https://github.com/huggingface/transformers/pull/37156", "diff_url": "https://github.com/huggingface/transformers/pull/37156.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37156.patch", "merged_at": "2025-04-07T17:05:36" }
This PR updates the model card for Mistral and follows the template outlined in the issue. Please let me know if any changes need to be made.

Fixes https://github.com/huggingface/transformers/issues/36979

### Before submitting

- [x] This PR improves the docs.

### Who can review?

@stevhliu
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37156/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37156/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37155
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37155/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37155/comments
https://api.github.com/repos/huggingface/transformers/issues/37155/events
https://github.com/huggingface/transformers/pull/37155
2,961,260,430
PR_kwDOCUB6oc6Q0UnS
37,155
docs: Update LayoutLMv3 model card with standardized format and impro…
{ "login": "carrycooldude", "id": 41143496, "node_id": "MDQ6VXNlcjQxMTQzNDk2", "avatar_url": "https://avatars.githubusercontent.com/u/41143496?v=4", "gravatar_id": "", "url": "https://api.github.com/users/carrycooldude", "html_url": "https://github.com/carrycooldude", "followers_url": "https://api.github.com/users/carrycooldude/followers", "following_url": "https://api.github.com/users/carrycooldude/following{/other_user}", "gists_url": "https://api.github.com/users/carrycooldude/gists{/gist_id}", "starred_url": "https://api.github.com/users/carrycooldude/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/carrycooldude/subscriptions", "organizations_url": "https://api.github.com/users/carrycooldude/orgs", "repos_url": "https://api.github.com/users/carrycooldude/repos", "events_url": "https://api.github.com/users/carrycooldude/events{/privacy}", "received_events_url": "https://api.github.com/users/carrycooldude/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-03-31T18:33:29
2025-10-18T16:27:27
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37155", "html_url": "https://github.com/huggingface/transformers/pull/37155", "diff_url": "https://github.com/huggingface/transformers/pull/37155.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37155.patch", "merged_at": null }
# Update LayoutLMv3 Model Card Documentation

This PR updates the LayoutLMv3 model card documentation to follow the standardized format as requested in #36979. The changes improve the documentation's clarity and usability while maintaining consistency with other model cards in the repository.

# What does this PR do?

This PR enhances the LayoutLMv3 model card documentation by:

- Adding badges for framework support (PyTorch, TensorFlow, Flax) and optimizations (Flash Attention, SDPA)
- Reorganizing code examples into clear sections:
  - Quick Start (basic usage)
  - Pipeline API examples
  - AutoModel examples
  - transformers-cli examples
- Adding quantization examples for large models (8-bit and 4-bit)
- Adding attention visualization examples using AttentionMaskVisualizer
- Maintaining existing functionality while improving documentation structure

The changes make the documentation more accessible and provide ready-to-use examples for different use cases, following the standardized format used in other model cards like Gemma 3, PaliGemma, and ViT. #36979

## Before submitting

- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?

## Who can review?

Anyone in the community is free to review the PR once the tests have passed.
Feel free to tag members/contributors who may be interested in your PR. Since this is a documentation update for a vision-language model, I would suggest tagging:

- @amyeroberts (vision models)
- @stevhliu (documentation)
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37155/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37155/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/37154
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37154/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37154/comments
https://api.github.com/repos/huggingface/transformers/issues/37154/events
https://github.com/huggingface/transformers/pull/37154
2,961,191,333
PR_kwDOCUB6oc6Q0FT_
37,154
Add Fast LeViT Processor
{ "login": "keetrap", "id": 103131112, "node_id": "U_kgDOBiWn6A", "avatar_url": "https://avatars.githubusercontent.com/u/103131112?v=4", "gravatar_id": "", "url": "https://api.github.com/users/keetrap", "html_url": "https://github.com/keetrap", "followers_url": "https://api.github.com/users/keetrap/followers", "following_url": "https://api.github.com/users/keetrap/following{/other_user}", "gists_url": "https://api.github.com/users/keetrap/gists{/gist_id}", "starred_url": "https://api.github.com/users/keetrap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/keetrap/subscriptions", "organizations_url": "https://api.github.com/users/keetrap/orgs", "repos_url": "https://api.github.com/users/keetrap/repos", "events_url": "https://api.github.com/users/keetrap/events{/privacy}", "received_events_url": "https://api.github.com/users/keetrap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T18:04:21
2025-04-14T15:07:36
2025-04-14T15:07:36
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37154", "html_url": "https://github.com/huggingface/transformers/pull/37154", "diff_url": "https://github.com/huggingface/transformers/pull/37154.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37154.patch", "merged_at": "2025-04-14T15:07:36" }
Related #36978 cc @yonigozlan
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37154/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37154/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37153
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37153/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37153/comments
https://api.github.com/repos/huggingface/transformers/issues/37153/events
https://github.com/huggingface/transformers/pull/37153
2,961,143,053
PR_kwDOCUB6oc6Qz6gG
37,153
Fixes the inconsistency of the optionality of attention_mask
{ "login": "Zephyr271828", "id": 109715540, "node_id": "U_kgDOBoogVA", "avatar_url": "https://avatars.githubusercontent.com/u/109715540?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Zephyr271828", "html_url": "https://github.com/Zephyr271828", "followers_url": "https://api.github.com/users/Zephyr271828/followers", "following_url": "https://api.github.com/users/Zephyr271828/following{/other_user}", "gists_url": "https://api.github.com/users/Zephyr271828/gists{/gist_id}", "starred_url": "https://api.github.com/users/Zephyr271828/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Zephyr271828/subscriptions", "organizations_url": "https://api.github.com/users/Zephyr271828/orgs", "repos_url": "https://api.github.com/users/Zephyr271828/repos", "events_url": "https://api.github.com/users/Zephyr271828/events{/privacy}", "received_events_url": "https://api.github.com/users/Zephyr271828/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T17:44:46
2025-04-01T15:40:29
2025-04-01T14:31:10
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37153", "html_url": "https://github.com/huggingface/transformers/pull/37153", "diff_url": "https://github.com/huggingface/transformers/pull/37153.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37153.patch", "merged_at": "2025-04-01T14:31:10" }
# What does this PR do?

This PR fixes issue #37046, which reports an inconsistency in the optionality of the `attention_mask` parameter across functions. Specifically, the type of `attention_mask` in `LlamaForCausalLM`, `LlamaModel`, and `LlamaDecoderLayer` is `Optional[torch.Tensor] = None`. In [`LlamaAttention`](https://github.com/huggingface/transformers/blob/d6b3c7486b441296366f788fde57109337f63bca/src/transformers/models/llama/modeling_llama.py#L269) (the class that wraps the attention interface), the type is `Optional[torch.Tensor]`, because `attention_mask` is passed down from `LlamaModel` whether it is a `torch.Tensor` or `None`; the attention interface therefore receives `attention_mask` in either case. The key problem lies in the type annotations in the flash-attention integration.

In [`flash_attention_forward`](https://github.com/huggingface/transformers/blob/d6b3c7486b441296366f788fde57109337f63bca/src/transformers/integrations/flash_attention.py#L17), the type of `attention_mask` is still `Optional[torch.Tensor]`, whereas in [`_flash_attention_forward`](https://github.com/huggingface/transformers/blob/d6b3c7486b441296366f788fde57109337f63bca/src/transformers/modeling_flash_attention_utils.py#L234), a function called by `flash_attention_forward`, the type is `torch.Tensor`. This is unreasonable because:

1. it is inconsistent with the annotation in `flash_attention_forward`, and
2. `_flash_attention_forward` is usable even if `attention_mask` is None.

This PR therefore fixes the annotation in `_flash_attention_forward`, as well as the docstring, to address the aforementioned issue.

## Before submitting

- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?

## Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.

<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
-->
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37153/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37153/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37152
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37152/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37152/comments
https://api.github.com/repos/huggingface/transformers/issues/37152/events
https://github.com/huggingface/transformers/pull/37152
2,961,114,025
PR_kwDOCUB6oc6Qz0IU
37,152
Update Model Card for Jamba
{ "login": "ParagEkbote", "id": 69567729, "node_id": "MDQ6VXNlcjY5NTY3NzI5", "avatar_url": "https://avatars.githubusercontent.com/u/69567729?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParagEkbote", "html_url": "https://github.com/ParagEkbote", "followers_url": "https://api.github.com/users/ParagEkbote/followers", "following_url": "https://api.github.com/users/ParagEkbote/following{/other_user}", "gists_url": "https://api.github.com/users/ParagEkbote/gists{/gist_id}", "starred_url": "https://api.github.com/users/ParagEkbote/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ParagEkbote/subscriptions", "organizations_url": "https://api.github.com/users/ParagEkbote/orgs", "repos_url": "https://api.github.com/users/ParagEkbote/repos", "events_url": "https://api.github.com/users/ParagEkbote/events{/privacy}", "received_events_url": "https://api.github.com/users/ParagEkbote/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T17:30:49
2025-04-07T18:05:11
2025-04-07T18:02:59
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37152", "html_url": "https://github.com/huggingface/transformers/pull/37152", "diff_url": "https://github.com/huggingface/transformers/pull/37152.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37152.patch", "merged_at": "2025-04-07T18:02:59" }
# What does this PR do?

As described in the issue, this PR updates the model card for Jamba. I've chosen to also add the newer Jamba-Large-1.6 and Mini models to the model card, as well as notes for loading the model using accelerate and BnB. Please let me know if any modifications are required and I will make the necessary changes. #36979

## Before submitting

- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).

## Who can review?

@stevhliu
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37152/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37152/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37151
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37151/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37151/comments
https://api.github.com/repos/huggingface/transformers/issues/37151/events
https://github.com/huggingface/transformers/issues/37151
2,961,066,229
I_kwDOCUB6oc6wfkj1
37,151
Error when using trainer with default data parallelism enabled: RuntimeError: chunk expects at least a 1-dimensional tensor
{ "login": "Mekadrom", "id": 1545535, "node_id": "MDQ6VXNlcjE1NDU1MzU=", "avatar_url": "https://avatars.githubusercontent.com/u/1545535?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Mekadrom", "html_url": "https://github.com/Mekadrom", "followers_url": "https://api.github.com/users/Mekadrom/followers", "following_url": "https://api.github.com/users/Mekadrom/following{/other_user}", "gists_url": "https://api.github.com/users/Mekadrom/gists{/gist_id}", "starred_url": "https://api.github.com/users/Mekadrom/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Mekadrom/subscriptions", "organizations_url": "https://api.github.com/users/Mekadrom/orgs", "repos_url": "https://api.github.com/users/Mekadrom/repos", "events_url": "https://api.github.com/users/Mekadrom/events{/privacy}", "received_events_url": "https://api.github.com/users/Mekadrom/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-03-31T17:07:00
2025-04-03T15:39:18
2025-04-03T15:36:00
NONE
null
null
null
null
### System Info

`transformers-cli env` output:

- `transformers` version: 4.50.3
- Platform: Linux-6.8.0-52-generic-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.29.3
- Safetensors version: 0.5.3
- Accelerate version: 1.5.2
- Accelerate config: not found
- DeepSpeed version: 0.16.4
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: yes, implicitly
- Using GPU in script?: yes
- GPU type: NVIDIA GeForce RTX 4090

Other pertinent versions:

```
(venv) user@vm:~/dev/projects/project$ python3 -c 'import torch; print(torch.version.cuda)'
12.4
(venv) user@vm:~/dev/projects/project$ nvidia-smi
...
| NVIDIA-SMI 550.54.14    Driver Version: 550.54.14    CUDA Version: 12.4 |
...
```

And it doesn't show except in the extended nvidia-smi output, but there are two 4090s on this device, and I can tell that the trainer is using them both by default because of a spike in memory usage and GPU% when the script starts.

### Who can help?

@zach-huggingface @ArthurZucker

### Information

- [ ] The official example scripts
- [x] My own modified scripts

### Tasks

- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)

### Reproduction

First off, the docs bot is not working as of the creation of this issue, so I did not have that as a resource.
Full traceback: ``` Traceback (most recent call last): File "/home/user/dev/projects/project/min_repro.py", line 56, in <module> trainer.train() File "/home/user/dev/projects/project/venv/lib/python3.10/site-packages/transformers/trainer.py", line 2245, in train return inner_training_loop( File "/home/user/dev/projects/project/venv/lib/python3.10/site-packages/transformers/trainer.py", line 2556, in _inner_training_loop tr_loss_step = self.training_step(model, inputs, num_items_in_batch) File "/home/user/dev/projects/project/venv/lib/python3.10/site-packages/transformers/trainer.py", line 3718, in training_step loss = self.compute_loss(model, inputs, num_items_in_batch=num_items_in_batch) File "/home/user/dev/projects/project/venv/lib/python3.10/site-packages/transformers/trainer.py", line 3783, in compute_loss outputs = model(**inputs) File "/home/user/dev/projects/project/venv/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl return self._call_impl(*args, **kwargs) File "/home/user/dev/projects/project/venv/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl return forward_call(*args, **kwargs) File "/home/user/dev/projects/project/venv/lib/python3.10/site-packages/torch/nn/parallel/data_parallel.py", line 183, in forward inputs, module_kwargs = self.scatter(inputs, kwargs, self.device_ids) File "/home/user/dev/projects/project/venv/lib/python3.10/site-packages/torch/nn/parallel/data_parallel.py", line 207, in scatter return scatter_kwargs(inputs, kwargs, device_ids, dim=self.dim) File "/home/user/dev/projects/project/venv/lib/python3.10/site-packages/torch/nn/parallel/scatter_gather.py", line 89, in scatter_kwargs scattered_kwargs = scatter(kwargs, target_gpus, dim) if kwargs else [] File "/home/user/dev/projects/project/venv/lib/python3.10/site-packages/torch/nn/parallel/scatter_gather.py", line 75, in scatter res = scatter_map(inputs) File 
"/home/user/dev/projects/project/venv/lib/python3.10/site-packages/torch/nn/parallel/scatter_gather.py", line 66, in scatter_map return [type(obj)(i) for i in zip(*map(scatter_map, obj.items()))] File "/home/user/dev/projects/project/venv/lib/python3.10/site-packages/torch/nn/parallel/scatter_gather.py", line 62, in scatter_map return list(zip(*map(scatter_map, obj))) File "/home/user/dev/projects/project/venv/lib/python3.10/site-packages/torch/nn/parallel/scatter_gather.py", line 58, in scatter_map return Scatter.apply(target_gpus, None, dim, obj) File "/home/user/dev/projects/project/venv/lib/python3.10/site-packages/torch/autograd/function.py", line 575, in apply return super().apply(*args, **kwargs) # type: ignore[misc] File "/home/user/dev/projects/project/venv/lib/python3.10/site-packages/torch/nn/parallel/_functions.py", line 103, in forward outputs = comm.scatter(input, target_gpus, chunk_sizes, ctx.dim, streams) File "/home/user/dev/projects/project/venv/lib/python3.10/site-packages/torch/nn/parallel/comm.py", line 205, in scatter return tuple(torch._C._scatter(tensor, devices, chunk_sizes, dim, streams)) RuntimeError: chunk expects at least a 1-dimensional tensor ``` Here is the minimal reproduction code that I used to produce this output: ```python from datasets import load_dataset from transformers import AutoConfig, GPT2LMHeadModel, AutoTokenizer, DataCollatorForLanguageModeling, TrainingArguments, Trainer tokenizer = AutoTokenizer.from_pretrained("gpt2") if tokenizer.pad_token is None: tokenizer.pad_token = tokenizer.eos_token config = AutoConfig.from_pretrained( "gpt2", vocab_size=len(tokenizer), n_ctx=512, bos_token_id=tokenizer.bos_token_id, eos_token_id=tokenizer.eos_token_id, ) training_args = TrainingArguments( output_dir="./logs", per_device_train_batch_size=32, per_device_eval_batch_size=32, evaluation_strategy="steps", eval_steps=5_000, logging_steps=5_000, gradient_accumulation_steps=8, num_train_epochs=1, weight_decay=0.1, 
warmup_steps=1_000, lr_scheduler_type="cosine", learning_rate=5e-4, save_steps=5_000, fp16=True, ) model = GPT2LMHeadModel(config) dataset = load_dataset("wikitext", "wikitext-2-v1", streaming=False) def tokenize_mapping(examples): outputs = tokenizer(examples["text"], truncation=True, max_length=512, return_overflowing_tokens=True, return_length=True) input_batch = [] for length, input_ids in zip(outputs["length"], outputs["input_ids"]): if length == 512: input_batch.append(input_ids) return {"input_ids": input_batch} dataset = dataset.map(tokenize_mapping, batched=True, remove_columns=dataset["train"].column_names) trainer = Trainer( model=model, args=training_args, data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False), train_dataset=dataset["train"], eval_dataset=dataset["validation"], processing_class=tokenizer, ) print(f"Starting training with {sum(p.numel() for p in model.parameters()):,} parameters") print(f"Model: {model}") print(f"Tokenizer: {tokenizer}") trainer.train() ``` Using [this](https://huggingface.co/learn/nlp-course/en/chapter7/6?fw=pt#training-a-causal-language-model-from-scratch) official doc as a guide. ### Expected behavior I would expect the very slightly modified training script to create and train a gpt2 model on the wikitext-2-v1 dataset. Instead, an exception is raised indicating some issue with the data parallelization integration. Also potentially of note is that I tried loading the pretrained gpt2 model and finetuning it with a modified version of this script (both with and without `device_map='balanced'` in the call to `from_pretrained`) and it still produces this chunking error.
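For context, the `chunk expects at least a 1-dimensional tensor` error comes from `DataParallel`'s scatter step, which splits every tensor keyword argument along dim 0, one chunk per GPU; a 0-dim scalar tensor (such as `num_items_in_batch`) has no dim 0 to split. The following is a minimal pure-Python sketch of that failure mode — the function name and shapes are illustrative, not the actual torch implementation:

```python
def scatter_dim0(shape, n_devices):
    """Mimic torch.chunk along dim 0: a tensor needs at least one
    dimension before it can be split across devices."""
    if len(shape) == 0:
        raise RuntimeError("chunk expects at least a 1-dimensional tensor")
    # Split dim 0 as evenly as possible, one piece per device.
    base, rem = divmod(shape[0], n_devices)
    sizes = [base + (1 if i < rem else 0) for i in range(n_devices)]
    return [(s, *shape[1:]) for s in sizes if s > 0]

# A batched input splits cleanly across two GPUs...
print(scatter_dim0((32, 512), 2))   # [(16, 512), (16, 512)]

# ...but a 0-dim scalar like `num_items_in_batch` cannot be chunked.
try:
    scatter_dim0((), 2)
except RuntimeError as e:
    print(e)                        # chunk expects at least a 1-dimensional tensor
```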
{ "login": "Mekadrom", "id": 1545535, "node_id": "MDQ6VXNlcjE1NDU1MzU=", "avatar_url": "https://avatars.githubusercontent.com/u/1545535?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Mekadrom", "html_url": "https://github.com/Mekadrom", "followers_url": "https://api.github.com/users/Mekadrom/followers", "following_url": "https://api.github.com/users/Mekadrom/following{/other_user}", "gists_url": "https://api.github.com/users/Mekadrom/gists{/gist_id}", "starred_url": "https://api.github.com/users/Mekadrom/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Mekadrom/subscriptions", "organizations_url": "https://api.github.com/users/Mekadrom/orgs", "repos_url": "https://api.github.com/users/Mekadrom/repos", "events_url": "https://api.github.com/users/Mekadrom/events{/privacy}", "received_events_url": "https://api.github.com/users/Mekadrom/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37151/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37151/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37150
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37150/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37150/comments
https://api.github.com/repos/huggingface/transformers/issues/37150/events
https://github.com/huggingface/transformers/pull/37150
2,960,991,158
PR_kwDOCUB6oc6QzaJ7
37,150
[New Model Addition] Baichuan M1
{ "login": "Vaibhavs10", "id": 18682411, "node_id": "MDQ6VXNlcjE4NjgyNDEx", "avatar_url": "https://avatars.githubusercontent.com/u/18682411?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Vaibhavs10", "html_url": "https://github.com/Vaibhavs10", "followers_url": "https://api.github.com/users/Vaibhavs10/followers", "following_url": "https://api.github.com/users/Vaibhavs10/following{/other_user}", "gists_url": "https://api.github.com/users/Vaibhavs10/gists{/gist_id}", "starred_url": "https://api.github.com/users/Vaibhavs10/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Vaibhavs10/subscriptions", "organizations_url": "https://api.github.com/users/Vaibhavs10/orgs", "repos_url": "https://api.github.com/users/Vaibhavs10/repos", "events_url": "https://api.github.com/users/Vaibhavs10/events{/privacy}", "received_events_url": "https://api.github.com/users/Vaibhavs10/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-03-31T16:34:05
2025-03-31T16:34:21
null
MEMBER
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37150", "html_url": "https://github.com/huggingface/transformers/pull/37150", "diff_url": "https://github.com/huggingface/transformers/pull/37150.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37150.patch", "merged_at": null }
# What does this PR do? Adds support for Baichuan M1 Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37150/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37150/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/37149
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37149/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37149/comments
https://api.github.com/repos/huggingface/transformers/issues/37149/events
https://github.com/huggingface/transformers/pull/37149
2,960,975,542
PR_kwDOCUB6oc6QzWyC
37,149
Remove deprecation warning for `num_logits_to_keep`
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T16:26:32
2025-04-14T17:08:47
2025-04-14T17:08:45
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37149", "html_url": "https://github.com/huggingface/transformers/pull/37149", "diff_url": "https://github.com/huggingface/transformers/pull/37149.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37149.patch", "merged_at": "2025-04-14T17:08:45" }
# What does this PR do?
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37149/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37149/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37148
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37148/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37148/comments
https://api.github.com/repos/huggingface/transformers/issues/37148/events
https://github.com/huggingface/transformers/pull/37148
2,960,962,208
PR_kwDOCUB6oc6QzT3c
37,148
Remove deprecation cycle for `logits_to_keep`
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T16:20:11
2025-03-31T16:22:13
2025-03-31T16:21:47
MEMBER
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37148", "html_url": "https://github.com/huggingface/transformers/pull/37148", "diff_url": "https://github.com/huggingface/transformers/pull/37148.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37148.patch", "merged_at": null }
# What does this PR do?
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37148/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37148/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37147
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37147/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37147/comments
https://api.github.com/repos/huggingface/transformers/issues/37147/events
https://github.com/huggingface/transformers/pull/37147
2,960,937,339
PR_kwDOCUB6oc6QzOsn
37,147
Disable delay_optimizer_creation in `Trainer` to support fsdp2
{ "login": "byi8220", "id": 24833703, "node_id": "MDQ6VXNlcjI0ODMzNzAz", "avatar_url": "https://avatars.githubusercontent.com/u/24833703?v=4", "gravatar_id": "", "url": "https://api.github.com/users/byi8220", "html_url": "https://github.com/byi8220", "followers_url": "https://api.github.com/users/byi8220/followers", "following_url": "https://api.github.com/users/byi8220/following{/other_user}", "gists_url": "https://api.github.com/users/byi8220/gists{/gist_id}", "starred_url": "https://api.github.com/users/byi8220/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/byi8220/subscriptions", "organizations_url": "https://api.github.com/users/byi8220/orgs", "repos_url": "https://api.github.com/users/byi8220/repos", "events_url": "https://api.github.com/users/byi8220/events{/privacy}", "received_events_url": "https://api.github.com/users/byi8220/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T16:08:36
2025-04-04T18:11:37
2025-04-04T18:11:37
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37147", "html_url": "https://github.com/huggingface/transformers/pull/37147", "diff_url": "https://github.com/huggingface/transformers/pull/37147.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37147.patch", "merged_at": "2025-04-04T18:11:37" }
# What does this PR do? In order to get Trainer support for FSDP2 in `accelerate`, we have to pass the model and optimizer into `Accelerator.prepare()` at the same time (https://github.com/huggingface/accelerate/pull/3394/files#r2017637611). **Note:** This may not be sufficient for full FSDP2 support in Trainer. This PR might be scrapped and replaced with something more complete if some issues arise. ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [X] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [X] Did you write any new necessary tests? ### Testing 2 unit tests were added to `tests/fsdp/test_fsdp.py` 1. `test_fsdp2_cpu_offloading`, which matches `test_fsdp_cpu_offloading`, but is **skipped** since the second test appears to be non functioning 2. `test_accelerate_fsdp2_integration` which matches `test_training_and_can_resume_normally` with SHARDED_STATE_DICT (as this is the only config that can run accelerate), and passes. 
This will not be caught by CI, as it depends on an unreleased feature, but was instead manually run with the command `RUN_SLOW=1 pytest tests/fsdp/test_fsdp.py ` ``` # RUN_SLOW=1 pytest tests/fsdp/test_fsdp.py tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_basic_run_full_shard_bf16 PASSED [ 4%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_basic_run_full_shard_fp16 PASSED [ 8%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_basic_run_shard_grad_op_bf16 PASSED [ 13%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_basic_run_shard_grad_op_fp16 PASSED [ 17%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_basic_run_with_cpu_offload_0_bf16 PASSED [ 21%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_basic_run_with_cpu_offload_1_fp16 PASSED [ 26%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_basic_run_with_gradient_accumulation_full_shard_bf16 PASSED [ 30%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_basic_run_with_gradient_accumulation_full_shard_fp16 PASSED [ 34%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_basic_run_with_gradient_accumulation_shard_grad_op_bf16 PASSED [ 39%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_basic_run_with_gradient_accumulation_shard_grad_op_fp16 PASSED [ 43%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_fsdp_cpu_offloading SKIPPED (FSDP CPU offloading script not found!) [ 47%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_training_and_can_resume_normally_FULL_STATE_DICT PASSED [ 52%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_training_and_can_resume_normally_SHARDED_STATE_DICT PASSED [ 56%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_accelerate_fsdp2_integration PASSED [ 60%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_fsdp2_cpu_offloading SKIPPED (FSDP 2 CPU offloading script not found!) 
[ 65%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_fsdp_config_full_shard_bf16 PASSED [ 69%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_fsdp_config_full_shard_fp16 PASSED [ 73%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_fsdp_config_shard_grad_op_bf16 PASSED [ 78%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_fsdp_config_shard_grad_op_fp16 PASSED [ 82%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_fsdp_config_transformers_auto_wrap_full_shard_bf16 PASSED [ 86%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_fsdp_config_transformers_auto_wrap_full_shard_fp16 PASSED [ 91%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_fsdp_config_transformers_auto_wrap_shard_grad_op_bf16 PASSED [ 95%] tests/fsdp/test_fsdp.py::TrainerIntegrationFSDP::test_fsdp_config_transformers_auto_wrap_shard_grad_op_fp16 PASSED [100%] ``` ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. 
Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface and @SunMarc - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37147/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37147/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37146
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37146/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37146/comments
https://api.github.com/repos/huggingface/transformers/issues/37146/events
https://github.com/huggingface/transformers/pull/37146
2,960,704,202
PR_kwDOCUB6oc6QycM1
37,146
[chat-template] fix video loading
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T14:38:59
2025-04-02T09:27:50
2025-04-02T09:27:50
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37146", "html_url": "https://github.com/huggingface/transformers/pull/37146", "diff_url": "https://github.com/huggingface/transformers/pull/37146.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37146.patch", "merged_at": "2025-04-02T09:27:50" }
# What does this PR do? The video loading logic was broken after the last feature, `load_audio_from_video`. The condition should not be applied to loading videos, only to loading audio from a video. The CI didn't catch anything because we don't have `pyav` in the general dependencies. The fix is to apply the `if/else` only to the small block where audio is loaded.
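As a rough sketch of the control-flow fix described here (all names below are hypothetical, not the actual transformers internals), the backend check ends up gating only the audio-extraction block, while video frames load unconditionally:

```python
def load_media(path, load_audio_from_video=False, pyav_available=True):
    """Hypothetical sketch: the `if/else` gates only audio extraction."""
    frames = f"frames:{path}"  # video frames are always loaded
    audio = None
    if load_audio_from_video:  # condition applies only to this small block
        if not pyav_available:
            raise ImportError("pyav is required to load audio from a video")
        audio = f"audio:{path}"
    return frames, audio

# Video loading works even without pyav installed:
print(load_media("clip.mp4", pyav_available=False))  # ('frames:clip.mp4', None)
```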
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37146/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37146/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37145
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37145/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37145/comments
https://api.github.com/repos/huggingface/transformers/issues/37145/events
https://github.com/huggingface/transformers/pull/37145
2,960,669,067
PR_kwDOCUB6oc6QyUqj
37,145
🌐 [i18n-KO] Translated `siglip.md` to Korean
{ "login": "devxaitist", "id": 65713225, "node_id": "MDQ6VXNlcjY1NzEzMjI1", "avatar_url": "https://avatars.githubusercontent.com/u/65713225?v=4", "gravatar_id": "", "url": "https://api.github.com/users/devxaitist", "html_url": "https://github.com/devxaitist", "followers_url": "https://api.github.com/users/devxaitist/followers", "following_url": "https://api.github.com/users/devxaitist/following{/other_user}", "gists_url": "https://api.github.com/users/devxaitist/gists{/gist_id}", "starred_url": "https://api.github.com/users/devxaitist/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/devxaitist/subscriptions", "organizations_url": "https://api.github.com/users/devxaitist/orgs", "repos_url": "https://api.github.com/users/devxaitist/repos", "events_url": "https://api.github.com/users/devxaitist/events{/privacy}", "received_events_url": "https://api.github.com/users/devxaitist/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T14:26:19
2025-04-22T19:23:19
2025-04-22T19:23:19
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37145", "html_url": "https://github.com/huggingface/transformers/pull/37145", "diff_url": "https://github.com/huggingface/transformers/pull/37145.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37145.patch", "merged_at": "2025-04-22T19:23:19" }
# What does this PR do? Translated the `siglip.md` file of the documentation to Korean. Thank you in advance for your review. Part of https://github.com/huggingface/transformers/issues/20179 ## Before reviewing - [x] Check for missing / redundant translations (번역 누락/중복 검사) - [x] Grammar Check (맞춤법 검사) - [x] Review or Add new terms to glossary (용어 확인 및 추가) - [x] Check Inline TOC (e.g. `[[lowercased-header]]`) - [x] Check live-preview for gotchas (live-preview로 정상작동 확인) ## Who can review? (Initial) <!-- 1. 위 체크가 모두 완료된 뒤에만 본인의 조 팀원들에게 리뷰 요청하는 아래 주석을 노출해주세요!--> May you please review this PR? @cjfghk5697, @yijun-lee, @rlaalsrl0922 , @MinJu-Ha ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? (Final) <!-- 2. N조 팀원들과 리뷰가 끝난 후에 아래 주석을 노출해주세요! --> <!-- @stevhliu May you please review this PR? -->
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37145/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37145/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37144
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37144/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37144/comments
https://api.github.com/repos/huggingface/transformers/issues/37144/events
https://github.com/huggingface/transformers/pull/37144
2,960,612,732
PR_kwDOCUB6oc6QyITk
37,144
No more dtype_byte_size()
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T14:06:49
2025-04-02T13:58:39
2025-04-02T13:58:38
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37144", "html_url": "https://github.com/huggingface/transformers/pull/37144", "diff_url": "https://github.com/huggingface/transformers/pull/37144.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37144.patch", "merged_at": "2025-04-02T13:58:38" }
Our code used a `dtype_byte_size()` function, but this was unnecessary and also incorrect. In particular, it listed the byte size for `torch.bool` as `1/8`, which is not actually correct; `bool` is implemented internally in Torch/Numpy/TF/Flax as an 8-bit value. This PR just removes the function entirely and uses methods like `element_size()` instead, which are simpler and more correct. Fixes #37074
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37144/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37144/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37143
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37143/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37143/comments
https://api.github.com/repos/huggingface/transformers/issues/37143/events
https://github.com/huggingface/transformers/pull/37143
2,960,607,850
PR_kwDOCUB6oc6QyHRh
37,143
Add Fast Image Processor for mobileViT
{ "login": "MinJu-Ha", "id": 101788861, "node_id": "U_kgDOBhEsvQ", "avatar_url": "https://avatars.githubusercontent.com/u/101788861?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MinJu-Ha", "html_url": "https://github.com/MinJu-Ha", "followers_url": "https://api.github.com/users/MinJu-Ha/followers", "following_url": "https://api.github.com/users/MinJu-Ha/following{/other_user}", "gists_url": "https://api.github.com/users/MinJu-Ha/gists{/gist_id}", "starred_url": "https://api.github.com/users/MinJu-Ha/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MinJu-Ha/subscriptions", "organizations_url": "https://api.github.com/users/MinJu-Ha/orgs", "repos_url": "https://api.github.com/users/MinJu-Ha/repos", "events_url": "https://api.github.com/users/MinJu-Ha/events{/privacy}", "received_events_url": "https://api.github.com/users/MinJu-Ha/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T14:05:07
2025-06-27T14:40:25
2025-06-27T14:40:24
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37143", "html_url": "https://github.com/huggingface/transformers/pull/37143", "diff_url": "https://github.com/huggingface/transformers/pull/37143.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37143.patch", "merged_at": "2025-06-27T14:40:24" }
related to #36978 cc: @yonigozlan I added Fast image processor for mobileViT and I noticed a noticeable difference between the outputs after preprocessing. Here’s the code I used to compare them: ``` diff = (encoding_slow.pixel_values - encoding_fast.pixel_values).abs() print(f"\n📊 Difference statistics:") print(f" Max difference: {diff.max().item():.10f}") print(f" Mean difference: {diff.mean().item():.10f}") print(f" Slow min/max: {encoding_slow.pixel_values.min().item():.10f} ~ {encoding_slow.pixel_values.max().item():.10f}") print(f" Fast min/max: {encoding_fast.pixel_values.min().item():.10f} ~ {encoding_fast.pixel_values.max().item():.10f}") print(f"Slow implementation dtype: {encoding_slow.pixel_values.dtype}") print(f"Fast implementation dtype: {encoding_fast.pixel_values.dtype}") ``` results: ``` 📊 Difference statistics: Max difference: 0.3411765397 Mean difference: 0.1117687449 Slow min/max: 0.0313725509 ~ 0.9764705896 Fast min/max: 0.0313725509 ~ 0.9764706492 Slow implementation dtype: torch.float32 Fast implementation dtype: torch.float32 ``` Even though the size configs look the same ({'shortest_edge': 20}), and both use torch.float32, the output difference seems quite significant for a slow/fast equivalence test.
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37143/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37143/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37142
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37142/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37142/comments
https://api.github.com/repos/huggingface/transformers/issues/37142/events
https://github.com/huggingface/transformers/pull/37142
2,960,562,414
PR_kwDOCUB6oc6Qx9aD
37,142
[qwen3] fix generation tests
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T13:48:19
2025-03-31T14:55:08
2025-03-31T14:33:42
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37142", "html_url": "https://github.com/huggingface/transformers/pull/37142", "diff_url": "https://github.com/huggingface/transformers/pull/37142.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37142.patch", "merged_at": "2025-03-31T14:33:42" }
# What does this PR do?
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37142/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37142/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37141
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37141/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37141/comments
https://api.github.com/repos/huggingface/transformers/issues/37141/events
https://github.com/huggingface/transformers/pull/37141
2,960,533,766
PR_kwDOCUB6oc6Qx3K8
37,141
skip
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T13:37:27
2025-03-31T14:03:11
2025-03-31T13:38:40
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37141", "html_url": "https://github.com/huggingface/transformers/pull/37141", "diff_url": "https://github.com/huggingface/transformers/pull/37141.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37141.patch", "merged_at": "2025-03-31T13:38:40" }
# What does this PR do? skip for now
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37141/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37141/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37140
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37140/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37140/comments
https://api.github.com/repos/huggingface/transformers/issues/37140/events
https://github.com/huggingface/transformers/pull/37140
2,960,531,333
PR_kwDOCUB6oc6Qx2ok
37,140
Add Fast Image Processor for Chameleon
{ "login": "farrosalferro", "id": 127369839, "node_id": "U_kgDOB5eCbw", "avatar_url": "https://avatars.githubusercontent.com/u/127369839?v=4", "gravatar_id": "", "url": "https://api.github.com/users/farrosalferro", "html_url": "https://github.com/farrosalferro", "followers_url": "https://api.github.com/users/farrosalferro/followers", "following_url": "https://api.github.com/users/farrosalferro/following{/other_user}", "gists_url": "https://api.github.com/users/farrosalferro/gists{/gist_id}", "starred_url": "https://api.github.com/users/farrosalferro/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/farrosalferro/subscriptions", "organizations_url": "https://api.github.com/users/farrosalferro/orgs", "repos_url": "https://api.github.com/users/farrosalferro/repos", "events_url": "https://api.github.com/users/farrosalferro/events{/privacy}", "received_events_url": "https://api.github.com/users/farrosalferro/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T13:36:40
2025-06-27T15:26:58
2025-06-27T15:26:57
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37140", "html_url": "https://github.com/huggingface/transformers/pull/37140", "diff_url": "https://github.com/huggingface/transformers/pull/37140.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37140.patch", "merged_at": "2025-06-27T15:26:57" }
# What does this PR do? Add Fast Image Processor for Chameleon (Issue [#36978](https://github.com/huggingface/transformers/issues/36978)) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [X] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? @yonigozlan Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. ## Note I found a problem during the resizing operation as it seems the `InterpolationMode.LANCZOS` resampling, which is the default resampling for Chameleon, is not supported within the torchvision resizing method. For instance, when running this code: ``` tensor = torch.randn(4, 3, 224, 224) interpolation = PIL.Image.LANCZOS size = (112, 112) antialias = True new_tensor = F.resize( tensor, size=size, interpolation=interpolation, antialias=antialias, ) ``` I got this error `NotImplementedError: Input Error: Only 3D, 4D and 5D input Tensors supported (got 4D) for the modes: nearest | linear | bilinear | bicubic | trilinear | area | nearest-exact (got lanczos)` My workaround is to transform it to Numpy array first, then apply the resizing function similar to the Chameleon's slow image processor.
However, this results in a longer processing time than the slow image processor, which does not pass the `test_fast_is_faster_than_slow` test. Do you have any suggestion on how to solve this problem? Any advice would be appreciated. Thank you.
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37140/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37140/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37139
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37139/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37139/comments
https://api.github.com/repos/huggingface/transformers/issues/37139/events
https://github.com/huggingface/transformers/pull/37139
2,960,459,054
PR_kwDOCUB6oc6Qxmta
37,139
convert float for yarn related arguments in rope_scaling
{ "login": "bzantium", "id": 19511788, "node_id": "MDQ6VXNlcjE5NTExNzg4", "avatar_url": "https://avatars.githubusercontent.com/u/19511788?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bzantium", "html_url": "https://github.com/bzantium", "followers_url": "https://api.github.com/users/bzantium/followers", "following_url": "https://api.github.com/users/bzantium/following{/other_user}", "gists_url": "https://api.github.com/users/bzantium/gists{/gist_id}", "starred_url": "https://api.github.com/users/bzantium/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bzantium/subscriptions", "organizations_url": "https://api.github.com/users/bzantium/orgs", "repos_url": "https://api.github.com/users/bzantium/repos", "events_url": "https://api.github.com/users/bzantium/events{/privacy}", "received_events_url": "https://api.github.com/users/bzantium/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T13:09:57
2025-04-08T11:58:23
2025-04-08T11:58:22
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37139", "html_url": "https://github.com/huggingface/transformers/pull/37139", "diff_url": "https://github.com/huggingface/transformers/pull/37139.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37139.patch", "merged_at": "2025-04-08T11:58:22" }
# What does this PR do? Remove rope related warnings when loading DeepSeek-V3 checkpoints. <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes #37134 ## Who can review? @ArthurZucker @Rocketknight1
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37139/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37139/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37138
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37138/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37138/comments
https://api.github.com/repos/huggingface/transformers/issues/37138/events
https://github.com/huggingface/transformers/pull/37138
2,960,434,286
PR_kwDOCUB6oc6QxhNC
37,138
[don't merge] debug audio pipeline
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T13:01:26
2025-03-31T13:58:36
2025-03-31T13:58:36
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37138", "html_url": "https://github.com/huggingface/transformers/pull/37138", "diff_url": "https://github.com/huggingface/transformers/pull/37138.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37138.patch", "merged_at": null }
# What does this PR do? debug
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37138/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37138/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37137
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37137/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37137/comments
https://api.github.com/repos/huggingface/transformers/issues/37137/events
https://github.com/huggingface/transformers/pull/37137
2,960,420,937
PR_kwDOCUB6oc6QxeS3
37,137
bump Torch 2.1 with broken compatibility `torch.compile`
{ "login": "Borda", "id": 6035284, "node_id": "MDQ6VXNlcjYwMzUyODQ=", "avatar_url": "https://avatars.githubusercontent.com/u/6035284?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Borda", "html_url": "https://github.com/Borda", "followers_url": "https://api.github.com/users/Borda/followers", "following_url": "https://api.github.com/users/Borda/following{/other_user}", "gists_url": "https://api.github.com/users/Borda/gists{/gist_id}", "starred_url": "https://api.github.com/users/Borda/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Borda/subscriptions", "organizations_url": "https://api.github.com/users/Borda/orgs", "repos_url": "https://api.github.com/users/Borda/repos", "events_url": "https://api.github.com/users/Borda/events{/privacy}", "received_events_url": "https://api.github.com/users/Borda/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T12:55:47
2025-04-10T19:16:38
2025-04-04T13:47:26
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37137", "html_url": "https://github.com/huggingface/transformers/pull/37137", "diff_url": "https://github.com/huggingface/transformers/pull/37137.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37137.patch", "merged_at": null }
# What does this PR do? This PR updates the minimum required version of Torch to 2.1 due to an issue with `torch.compile` in 2.0. The problem was identified through Dependabot in [#3027](https://github.com/Lightning-AI/torchmetrics/pull/3027), where all versions passed except for 2.0. Fixes #37238 Issue Details When running tests with Torch 2.0, we encountered the following error: ```py @torch.compiler.disable(recursive=False) AttributeError: module 'torch' has no attribute 'compiler' ``` This happens because `torch.compiler` does not exist in 2.0, causing failures in dependencies like transformers. **Changes** - Set the minimum Torch version to 2.1 to ensure compatibility with `torch.compile`. - Updated relevant dependencies accordingly. **Impact** - This update ensures that `torch.compile` is available and avoids compatibility issues. - Users running Torch 2.0 will need to upgrade to 2.1 or higher. ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. well, this is eventually affecting multiple domains... 
Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37137/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37137/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37136
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37136/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37136/comments
https://api.github.com/repos/huggingface/transformers/issues/37136/events
https://github.com/huggingface/transformers/pull/37136
2,960,364,835
PR_kwDOCUB6oc6QxSA1
37,136
Fix meta state dict loading with quantizers
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T12:31:57
2025-04-01T16:46:01
2025-04-01T16:45:59
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37136", "html_url": "https://github.com/huggingface/transformers/pull/37136", "diff_url": "https://github.com/huggingface/transformers/pull/37136.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37136.patch", "merged_at": "2025-04-01T16:45:59" }
# What does this PR do? This is a temporary fix while we see whether hqq and bitsandbytes 4-bit can be refactored to work without the full state dict, using only a given param one at a time.
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37136/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37136/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37135
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37135/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37135/comments
https://api.github.com/repos/huggingface/transformers/issues/37135/events
https://github.com/huggingface/transformers/pull/37135
2,960,267,436
PR_kwDOCUB6oc6Qw8vA
37,135
Add Fast Image Processor for Flava
{ "login": "rootonchair", "id": 23548268, "node_id": "MDQ6VXNlcjIzNTQ4MjY4", "avatar_url": "https://avatars.githubusercontent.com/u/23548268?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rootonchair", "html_url": "https://github.com/rootonchair", "followers_url": "https://api.github.com/users/rootonchair/followers", "following_url": "https://api.github.com/users/rootonchair/following{/other_user}", "gists_url": "https://api.github.com/users/rootonchair/gists{/gist_id}", "starred_url": "https://api.github.com/users/rootonchair/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rootonchair/subscriptions", "organizations_url": "https://api.github.com/users/rootonchair/orgs", "repos_url": "https://api.github.com/users/rootonchair/repos", "events_url": "https://api.github.com/users/rootonchair/events{/privacy}", "received_events_url": "https://api.github.com/users/rootonchair/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T11:46:25
2025-04-15T18:02:16
2025-04-14T13:05:31
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37135", "html_url": "https://github.com/huggingface/transformers/pull/37135", "diff_url": "https://github.com/huggingface/transformers/pull/37135.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37135.patch", "merged_at": "2025-04-14T13:05:31" }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Related #36978 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface and @SunMarc - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37135/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37135/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37134
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37134/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37134/comments
https://api.github.com/repos/huggingface/transformers/issues/37134/events
https://github.com/huggingface/transformers/issues/37134
2,960,194,623
I_kwDOCUB6oc6wcPw_
37,134
Warnings when loading Deepseek-V3 without custom code
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-03-31T11:09:11
2025-04-08T11:58:23
2025-04-08T11:58:23
MEMBER
null
null
null
null
### System Info I tried loading Deepseek-V3 with the library code following #35926 but I got the following errors when loading the config: ``` `rope_scaling`'s factor field must be a float >= 1, got 40 `rope_scaling`'s beta_fast field must be a float, got 32 `rope_scaling`'s beta_slow field must be a float, got 1 ``` I don't get these errors when loading with `trust_remote_code=True`. Are we sure the RoPE implementation in Transformers matches the original model?
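The warnings quoted above come from integer values (`40`, `32`, `1`) appearing where the config validator expects floats. A minimal illustrative workaround, assuming the standard `rope_scaling` dict shape (the helper name `normalize_rope_scaling` is hypothetical, not Transformers API):

```python
def normalize_rope_scaling(rope_scaling: dict) -> dict:
    # Coerce the int-valued fields that trigger the validator warnings
    # ("factor", "beta_fast", "beta_slow") to float, leaving the rest as-is.
    out = dict(rope_scaling)
    for key in ("factor", "beta_fast", "beta_slow"):
        if key in out and isinstance(out[key], int):
            out[key] = float(out[key])
    return out

cfg = {"rope_type": "yarn", "factor": 40, "beta_fast": 32, "beta_slow": 1}
print(normalize_rope_scaling(cfg))
# → {'rope_type': 'yarn', 'factor': 40.0, 'beta_fast': 32.0, 'beta_slow': 1.0}
```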
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37134/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37134/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37133
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37133/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37133/comments
https://api.github.com/repos/huggingface/transformers/issues/37133/events
https://github.com/huggingface/transformers/pull/37133
2,960,076,311
PR_kwDOCUB6oc6QwSxS
37,133
Allow quantizers to work with a state dict on meta device
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T10:15:46
2025-03-31T16:11:12
2025-03-31T16:11:12
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37133", "html_url": "https://github.com/huggingface/transformers/pull/37133", "diff_url": "https://github.com/huggingface/transformers/pull/37133.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37133.patch", "merged_at": null }
# What does this PR do? Following https://github.com/huggingface/transformers/pull/35926 and https://github.com/huggingface/transformers/pull/37086, make the quantizers work with a state dict on meta device. See https://github.com/huggingface/transformers/pull/37086 for the issue at hand. Currently, this will lead to inconsistencies and params initialized with random weights. cc @SunMarc @MekkCyber is there any way to replace params with only `param_value`, instead of doing it for a whole layer at once with the `state_dict`, for hqq and bnb 4-bit? I believe this should be possible, but it would require me to know a bit more about the internals of each lib. Otherwise, if it is truly not possible, we can add an exception for these 2 types of quantizers, but it would be so much simpler if we could standardize and just replace one param after the other instead of doing a full layer at once.
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37133/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37133/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37132
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37132/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37132/comments
https://api.github.com/repos/huggingface/transformers/issues/37132/events
https://github.com/huggingface/transformers/pull/37132
2,960,070,707
PR_kwDOCUB6oc6QwRiW
37,132
Add XPU case to is_torch_bf16_gpu_available
{ "login": "cyyever", "id": 17618148, "node_id": "MDQ6VXNlcjE3NjE4MTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cyyever", "html_url": "https://github.com/cyyever", "followers_url": "https://api.github.com/users/cyyever/followers", "following_url": "https://api.github.com/users/cyyever/following{/other_user}", "gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}", "starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cyyever/subscriptions", "organizations_url": "https://api.github.com/users/cyyever/orgs", "repos_url": "https://api.github.com/users/cyyever/repos", "events_url": "https://api.github.com/users/cyyever/events{/privacy}", "received_events_url": "https://api.github.com/users/cyyever/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T10:13:28
2025-04-11T16:48:21
2025-04-11T16:28:47
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37132", "html_url": "https://github.com/huggingface/transformers/pull/37132", "diff_url": "https://github.com/huggingface/transformers/pull/37132.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37132.patch", "merged_at": "2025-04-11T16:28:47" }
torch.xpu also provides an `is_bf16_supported` function.
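The fallback this PR adds can be illustrated with a small sketch. This is not the actual `is_torch_bf16_gpu_available` implementation; it only shows the idea of consulting each backend's own `is_bf16_supported()`, and it uses a simulated torch-like module so it runs without a GPU:

```python
from types import SimpleNamespace

def bf16_gpu_available(torch_like) -> bool:
    # Check CUDA first, then fall back to the XPU backend's own
    # is_bf16_supported() when an XPU device is present.
    for backend in ("cuda", "xpu"):
        mod = getattr(torch_like, backend, None)
        if mod is not None and mod.is_available():
            return mod.is_bf16_supported()
    return False

# Simulated module: no CUDA, but an XPU backend that supports bf16.
fake_torch = SimpleNamespace(
    cuda=SimpleNamespace(is_available=lambda: False, is_bf16_supported=lambda: False),
    xpu=SimpleNamespace(is_available=lambda: True, is_bf16_supported=lambda: True),
)
print(bf16_gpu_available(fake_torch))  # → True
```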
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37132/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37132/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37131
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37131/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37131/comments
https://api.github.com/repos/huggingface/transformers/issues/37131/events
https://github.com/huggingface/transformers/pull/37131
2,960,019,439
PR_kwDOCUB6oc6QwGRr
37,131
Remove deprecated use_flash_attention_2 parameter
{ "login": "cyyever", "id": 17618148, "node_id": "MDQ6VXNlcjE3NjE4MTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cyyever", "html_url": "https://github.com/cyyever", "followers_url": "https://api.github.com/users/cyyever/followers", "following_url": "https://api.github.com/users/cyyever/following{/other_user}", "gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}", "starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cyyever/subscriptions", "organizations_url": "https://api.github.com/users/cyyever/orgs", "repos_url": "https://api.github.com/users/cyyever/repos", "events_url": "https://api.github.com/users/cyyever/events{/privacy}", "received_events_url": "https://api.github.com/users/cyyever/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T09:51:22
2025-06-02T09:26:59
2025-06-02T09:06:25
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37131", "html_url": "https://github.com/huggingface/transformers/pull/37131", "diff_url": "https://github.com/huggingface/transformers/pull/37131.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37131.patch", "merged_at": "2025-06-02T09:06:25" }
Remove use_flash_attention_2
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37131/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37131/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37130
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37130/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37130/comments
https://api.github.com/repos/huggingface/transformers/issues/37130/events
https://github.com/huggingface/transformers/pull/37130
2,960,015,141
PR_kwDOCUB6oc6QwFUm
37,130
Fix llava xpu tests.
{ "login": "jiqing-feng", "id": 107918818, "node_id": "U_kgDOBm614g", "avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jiqing-feng", "html_url": "https://github.com/jiqing-feng", "followers_url": "https://api.github.com/users/jiqing-feng/followers", "following_url": "https://api.github.com/users/jiqing-feng/following{/other_user}", "gists_url": "https://api.github.com/users/jiqing-feng/gists{/gist_id}", "starred_url": "https://api.github.com/users/jiqing-feng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jiqing-feng/subscriptions", "organizations_url": "https://api.github.com/users/jiqing-feng/orgs", "repos_url": "https://api.github.com/users/jiqing-feng/repos", "events_url": "https://api.github.com/users/jiqing-feng/events{/privacy}", "received_events_url": "https://api.github.com/users/jiqing-feng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T09:49:50
2025-07-02T05:22:50
2025-04-01T09:10:13
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37130", "html_url": "https://github.com/huggingface/transformers/pull/37130", "diff_url": "https://github.com/huggingface/transformers/pull/37130.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37130.patch", "merged_at": "2025-04-01T09:10:13" }
To reproduce: `TRANSFORMERS_TEST_DEVICE=xpu ZE_AFFINITY_MASK=0 RUN_SLOW=1 pytest -rA tests/models/llava/test_modeling_llava.py::LlavaForConditionalGenerationIntegrationTest::test_pixtral_4bit` Hi @SunMarc. This PR fixes the llava tests on XPU, where the output differs slightly from CUDA. Please review it. Thanks!
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37130/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37130/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37129
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37129/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37129/comments
https://api.github.com/repos/huggingface/transformers/issues/37129/events
https://github.com/huggingface/transformers/issues/37129
2,959,927,073
I_kwDOCUB6oc6wbOch
37,129
Whether transformers Trainer support pipeline parallelism?
{ "login": "liuheng0111", "id": 18352727, "node_id": "MDQ6VXNlcjE4MzUyNzI3", "avatar_url": "https://avatars.githubusercontent.com/u/18352727?v=4", "gravatar_id": "", "url": "https://api.github.com/users/liuheng0111", "html_url": "https://github.com/liuheng0111", "followers_url": "https://api.github.com/users/liuheng0111/followers", "following_url": "https://api.github.com/users/liuheng0111/following{/other_user}", "gists_url": "https://api.github.com/users/liuheng0111/gists{/gist_id}", "starred_url": "https://api.github.com/users/liuheng0111/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/liuheng0111/subscriptions", "organizations_url": "https://api.github.com/users/liuheng0111/orgs", "repos_url": "https://api.github.com/users/liuheng0111/repos", "events_url": "https://api.github.com/users/liuheng0111/events{/privacy}", "received_events_url": "https://api.github.com/users/liuheng0111/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T09:17:18
2025-05-11T08:03:19
2025-05-11T08:03:18
NONE
null
null
null
null
Hi, DeepSpeed supports pipeline parallelism and tensor parallelism. I want to know whether the Transformers Trainer supports pipeline parallelism. Is there any reference code for using pipeline parallelism to train large language models?
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers", "following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}", "gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}", "starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions", "organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs", "repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos", "events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}", "received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events", "type": "Bot", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37129/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37129/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37128
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37128/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37128/comments
https://api.github.com/repos/huggingface/transformers/issues/37128/events
https://github.com/huggingface/transformers/issues/37128
2,959,791,802
I_kwDOCUB6oc6wata6
37,128
TFAutoModelForSequenceClassification support for ModernBertConfig
{ "login": "nkaccounting", "id": 39125861, "node_id": "MDQ6VXNlcjM5MTI1ODYx", "avatar_url": "https://avatars.githubusercontent.com/u/39125861?v=4", "gravatar_id": "", "url": "https://api.github.com/users/nkaccounting", "html_url": "https://github.com/nkaccounting", "followers_url": "https://api.github.com/users/nkaccounting/followers", "following_url": "https://api.github.com/users/nkaccounting/following{/other_user}", "gists_url": "https://api.github.com/users/nkaccounting/gists{/gist_id}", "starred_url": "https://api.github.com/users/nkaccounting/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nkaccounting/subscriptions", "organizations_url": "https://api.github.com/users/nkaccounting/orgs", "repos_url": "https://api.github.com/users/nkaccounting/repos", "events_url": "https://api.github.com/users/nkaccounting/events{/privacy}", "received_events_url": "https://api.github.com/users/nkaccounting/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-03-31T08:20:25
2025-03-31T13:26:22
2025-03-31T13:26:21
NONE
null
null
null
null
### System Info - `transformers` version: 4.50.3 - Platform: Linux-5.15.0-91-generic-x86_64-with-glibc2.35 - Python version: 3.11.10 - Huggingface_hub version: 0.26.2 - Safetensors version: 0.4.5 - Accelerate version: 1.0.1 - Accelerate config: - compute_environment: LOCAL_MACHINE - distributed_type: MULTI_GPU - mixed_precision: no - use_cpu: False - debug: False - num_processes: 4 - machine_rank: 0 - num_machines: 1 - rdzv_backend: static - same_network: False - main_training_function: main - enable_cpu_affinity: False - downcast_bf16: False - tpu_use_cluster: False - tpu_use_sudo: False - DeepSpeed version: not installed - PyTorch version (GPU?): 2.5.1+cu124 (True) - Tensorflow version (GPU?): 2.18.0 (True) - Flax version (CPU?/GPU?/TPU?): not installed (NA) - Jax version: not installed - JaxLib version: not installed - Using distributed or parallel set-up in script?: <fill in> - Using GPU in script?: <fill in> - GPU type: NVIDIA A100-SXM4-80GB ### Who can help? _No response_ ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction I want to convert modern BERT from PyTorch version to TensorFlow version. My code is below: from transformers import TFAutoModelForSequenceClassification import tensorflow as tf in_path = "" out_path = "" with tf.device('/GPU:0'): tf_model = TFAutoModelForSequenceClassification.from_pretrained(in_path, from_pt=True) tf.keras.models.save_model(tf_model, out_path) I get the result : ValueError: Unrecognized configuration class <class 'transformers.models.modernbert.configuration_modernbert.ModernBertConfig'> for this kind of AutoModel: TFAutoModelForSequenceClassification. 
Model type should be one of AlbertConfig, BartConfig, BertConfig, CamembertConfig, ConvBertConfig, CTRLConfig, DebertaConfig, DebertaV2Config, DistilBertConfig, ElectraConfig, EsmConfig, FlaubertConfig, FunnelConfig, GPT2Config, GPT2Config, GPTJConfig, LayoutLMConfig, LayoutLMv3Config, LongformerConfig, MistralConfig, MobileBertConfig, MPNetConfig, OpenAIGPTConfig, RemBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoFormerConfig, TapasConfig, TransfoXLConfig, XLMConfig, XLMRobertaConfig, XLNetConfig. ### Expected behavior All PyTorch model weights were used when initializing TFAutoModelForSequenceClassification. All the weights of TFAutoModelForSequenceClassification were initialized from the PyTorch model. If your task is similar to the task the model of the checkpoint was trained on, you can already use TFBertForSequenceClassification for predictions without further training.
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37128/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37128/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37127
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37127/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37127/comments
https://api.github.com/repos/huggingface/transformers/issues/37127/events
https://github.com/huggingface/transformers/pull/37127
2,959,533,370
PR_kwDOCUB6oc6Qufib
37,127
refactor(audio_processing): replace pipe with temp files for FFmpeg p…
{ "login": "joeyhacker", "id": 2774637, "node_id": "MDQ6VXNlcjI3NzQ2Mzc=", "avatar_url": "https://avatars.githubusercontent.com/u/2774637?v=4", "gravatar_id": "", "url": "https://api.github.com/users/joeyhacker", "html_url": "https://github.com/joeyhacker", "followers_url": "https://api.github.com/users/joeyhacker/followers", "following_url": "https://api.github.com/users/joeyhacker/following{/other_user}", "gists_url": "https://api.github.com/users/joeyhacker/gists{/gist_id}", "starred_url": "https://api.github.com/users/joeyhacker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/joeyhacker/subscriptions", "organizations_url": "https://api.github.com/users/joeyhacker/orgs", "repos_url": "https://api.github.com/users/joeyhacker/repos", "events_url": "https://api.github.com/users/joeyhacker/events{/privacy}", "received_events_url": "https://api.github.com/users/joeyhacker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-03-31T05:50:52
2025-03-31T13:24:38
null
NONE
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37127", "html_url": "https://github.com/huggingface/transformers/pull/37127", "diff_url": "https://github.com/huggingface/transformers/pull/37127.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37127.patch", "merged_at": null }
…rocessing This change replaces the previous pipe-based FFmpeg audio decoding approach with temporary file operations to improve reliability, especially for container formats like MP4/m4a. Key reasons for this change: 1. MP4 container format requires random access to metadata (moov atoms) which is often located at the end of the file. Pipe streaming makes this access pattern impossible. 2. FFmpeg's format detection works more reliably with physical files compared to streamed input via pipes. 3. Temporary files provide better error diagnostics since the input can be preserved for debugging when failures occur. 4. Some FFmpeg codecs and filters behave differently with pipe input versus file input due to buffering differences. The new implementation: - Creates properly named temporary files with correct extensions - Uses atomic write operations with flush() - Implements comprehensive cleanup in finally blocks - Provides better error messages when failures occur This fixes issues with partial file errors ("offset 0x3f9: partial file") that occurred during demuxing of m4a files in the pipe-based approach # What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. 
--> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. 
Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface and @SunMarc - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
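The temp-file decoding approach described in the PR body can be sketched as below. This is a minimal illustration, not the PR's actual implementation: the function name, parameters, and the injectable `run_ffmpeg` hook are all hypothetical, added so the file-handling mechanics (correct extension, flush, cleanup in `finally`) can be shown and tested without a real FFmpeg binary.

```python
import os
import subprocess
import tempfile

def decode_audio_via_tempfile(audio_bytes, extension=".m4a", run_ffmpeg=None):
    """Decode audio by writing the bytes to a named temporary file first.

    Unlike pipe streaming, a real file lets FFmpeg seek to the moov atom
    at the end of MP4/m4a containers. `run_ffmpeg` is injectable for
    testing; by default it shells out to ffmpeg. (Function and parameter
    names are illustrative, not the PR's actual API.)
    """
    if run_ffmpeg is None:
        def run_ffmpeg(path):
            # -f f32le: raw 32-bit float mono PCM at 16 kHz on stdout
            out = subprocess.run(
                ["ffmpeg", "-i", path, "-f", "f32le", "-ac", "1", "-ar", "16000", "-"],
                capture_output=True, check=True,
            )
            return out.stdout

    tmp_path = None
    try:
        # A correctly named extension helps FFmpeg's format detection.
        with tempfile.NamedTemporaryFile(suffix=extension, delete=False) as f:
            f.write(audio_bytes)
            f.flush()  # make sure every byte is on disk before FFmpeg reads it
            tmp_path = f.name
        return run_ffmpeg(tmp_path)
    finally:
        # Cleanup runs even if decoding raised, so no temp files leak.
        if tmp_path is not None and os.path.exists(tmp_path):
            os.unlink(tmp_path)
```

For debugging, the `finally` block is the single place to relax if one wants to preserve the failing input, which is part of the diagnosability argument made above.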
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37127/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37127/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/37126
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37126/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37126/comments
https://api.github.com/repos/huggingface/transformers/issues/37126/events
https://github.com/huggingface/transformers/pull/37126
2,959,498,074
PR_kwDOCUB6oc6QuX7N
37,126
enable 2 llama UT cases on xpu
{ "login": "yao-matrix", "id": 7245027, "node_id": "MDQ6VXNlcjcyNDUwMjc=", "avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yao-matrix", "html_url": "https://github.com/yao-matrix", "followers_url": "https://api.github.com/users/yao-matrix/followers", "following_url": "https://api.github.com/users/yao-matrix/following{/other_user}", "gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}", "starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions", "organizations_url": "https://api.github.com/users/yao-matrix/orgs", "repos_url": "https://api.github.com/users/yao-matrix/repos", "events_url": "https://api.github.com/users/yao-matrix/events{/privacy}", "received_events_url": "https://api.github.com/users/yao-matrix/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T05:19:38
2025-04-08T00:22:03
2025-04-07T14:02:14
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37126", "html_url": "https://github.com/huggingface/transformers/pull/37126", "diff_url": "https://github.com/huggingface/transformers/pull/37126.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37126.patch", "merged_at": "2025-04-07T14:02:14" }
case 1: pytest -rA tests/models/llama/test_modeling_llama.py::LlamaIntegrationTest::test_model_7b_logits case 2: pytest -rA tests/models/llama/test_modeling_llama.py::LlamaIntegrationTest::test_model_7b_logits_bf16 Neither case has XPU-specific expected values, so they are put under key 0 and reuse the A100 ground truth. Both pass on Ponte Vecchio XPU.
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37126/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37126/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37125
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37125/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37125/comments
https://api.github.com/repos/huggingface/transformers/issues/37125/events
https://github.com/huggingface/transformers/issues/37125
2,959,462,062
I_kwDOCUB6oc6wZc6u
37,125
Possible to move HybridCache from GPU to CPU?
{ "login": "tianhaoz95", "id": 16887772, "node_id": "MDQ6VXNlcjE2ODg3Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/16887772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/tianhaoz95", "html_url": "https://github.com/tianhaoz95", "followers_url": "https://api.github.com/users/tianhaoz95/followers", "following_url": "https://api.github.com/users/tianhaoz95/following{/other_user}", "gists_url": "https://api.github.com/users/tianhaoz95/gists{/gist_id}", "starred_url": "https://api.github.com/users/tianhaoz95/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/tianhaoz95/subscriptions", "organizations_url": "https://api.github.com/users/tianhaoz95/orgs", "repos_url": "https://api.github.com/users/tianhaoz95/repos", "events_url": "https://api.github.com/users/tianhaoz95/events{/privacy}", "received_events_url": "https://api.github.com/users/tianhaoz95/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" }, { "id": 6735206706, "node_id": "LA_kwDOCUB6oc8AAAABkXMZMg", "url": "https://api.github.com/repos/huggingface/transformers/labels/Cache", "name": "Cache", "color": "1EA506", "default": false, "description": "" } ]
closed
false
null
[]
null
[]
2025-03-31T04:46:06
2025-04-03T22:14:32
2025-04-03T22:14:31
NONE
null
null
null
null
### Feature request I'm working on PD disaggregation, where I'm trying to move the KV cache produced on GPU to CPU before sending it to the decode process. It looks like the following implementation doesn't have a way to move the cache once it is created; is that the case? If so, is it possible to add an API to do that? https://github.com/huggingface/transformers/blob/0d6a60fe55fe051a1a68f2026d19223ed57b3c75/src/transformers/cache_utils.py#L1612 ### Motivation The motivation is that in different stages of generation, we might want to utilize different devices. ### Your contribution I'm happy to open a PR to add it if it's confirmed missing and we would like something to do this.
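The requested API could look roughly like the sketch below. This is a toy stand-in, not transformers code: the real `HybridCache` stores `torch` tensors in `key_cache`/`value_cache` lists, and the real method body would be `tensor.to(device)` per layer. Here a "tensor" is just a `(device, data)` tuple so the shape of the proposed `.to()` method can be shown without a GPU or torch.

```python
from dataclasses import dataclass, field

@dataclass
class ToyHybridCache:
    """Toy stand-in for a per-layer KV cache (hypothetical, for illustration)."""
    key_cache: list = field(default_factory=list)
    value_cache: list = field(default_factory=list)

    def to(self, device):
        """Proposed API: move every cached layer to `device` in place.

        With real tensors each entry would become `tensor.to(device)`;
        the method itself is the feature being requested, not an
        existing transformers API.
        """
        self.key_cache = [(device, data) for _, data in self.key_cache]
        self.value_cache = [(device, data) for _, data in self.value_cache]
        return self
```

Usage in the disaggregation scenario would then be `cache.to("cpu")` after prefill, and `cache.to("cuda:0")` again in the decode process.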
{ "login": "tianhaoz95", "id": 16887772, "node_id": "MDQ6VXNlcjE2ODg3Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/16887772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/tianhaoz95", "html_url": "https://github.com/tianhaoz95", "followers_url": "https://api.github.com/users/tianhaoz95/followers", "following_url": "https://api.github.com/users/tianhaoz95/following{/other_user}", "gists_url": "https://api.github.com/users/tianhaoz95/gists{/gist_id}", "starred_url": "https://api.github.com/users/tianhaoz95/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/tianhaoz95/subscriptions", "organizations_url": "https://api.github.com/users/tianhaoz95/orgs", "repos_url": "https://api.github.com/users/tianhaoz95/repos", "events_url": "https://api.github.com/users/tianhaoz95/events{/privacy}", "received_events_url": "https://api.github.com/users/tianhaoz95/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37125/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37125/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37124
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37124/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37124/comments
https://api.github.com/repos/huggingface/transformers/issues/37124/events
https://github.com/huggingface/transformers/pull/37124
2,959,411,068
PR_kwDOCUB6oc6QuFAN
37,124
Make canine model exportable by removing unnecessarily complicated logic
{ "login": "tugsbayasgalan", "id": 16603271, "node_id": "MDQ6VXNlcjE2NjAzMjcx", "avatar_url": "https://avatars.githubusercontent.com/u/16603271?v=4", "gravatar_id": "", "url": "https://api.github.com/users/tugsbayasgalan", "html_url": "https://github.com/tugsbayasgalan", "followers_url": "https://api.github.com/users/tugsbayasgalan/followers", "following_url": "https://api.github.com/users/tugsbayasgalan/following{/other_user}", "gists_url": "https://api.github.com/users/tugsbayasgalan/gists{/gist_id}", "starred_url": "https://api.github.com/users/tugsbayasgalan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/tugsbayasgalan/subscriptions", "organizations_url": "https://api.github.com/users/tugsbayasgalan/orgs", "repos_url": "https://api.github.com/users/tugsbayasgalan/repos", "events_url": "https://api.github.com/users/tugsbayasgalan/events{/privacy}", "received_events_url": "https://api.github.com/users/tugsbayasgalan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T04:01:36
2025-04-01T11:31:13
2025-04-01T11:31:13
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37124", "html_url": "https://github.com/huggingface/transformers/pull/37124", "diff_url": "https://github.com/huggingface/transformers/pull/37124.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37124.patch", "merged_at": "2025-04-01T11:31:13" }
# What does this PR do? In this code, char_seq_length is guaranteed to be an integer, so there is no need to cast it to tensor. By removing this, canine model is now exportable. <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. 
Feel free to tag members/contributors who may be interested in your PR. <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface and @SunMarc - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37124/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37124/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37123
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37123/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37123/comments
https://api.github.com/repos/huggingface/transformers/issues/37123/events
https://github.com/huggingface/transformers/pull/37123
2,959,403,727
PR_kwDOCUB6oc6QuDam
37,123
Correctly drop tokens in SwitchTransformer
{ "login": "mario-aws", "id": 172859788, "node_id": "U_kgDOCk2hjA", "avatar_url": "https://avatars.githubusercontent.com/u/172859788?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mario-aws", "html_url": "https://github.com/mario-aws", "followers_url": "https://api.github.com/users/mario-aws/followers", "following_url": "https://api.github.com/users/mario-aws/following{/other_user}", "gists_url": "https://api.github.com/users/mario-aws/gists{/gist_id}", "starred_url": "https://api.github.com/users/mario-aws/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mario-aws/subscriptions", "organizations_url": "https://api.github.com/users/mario-aws/orgs", "repos_url": "https://api.github.com/users/mario-aws/repos", "events_url": "https://api.github.com/users/mario-aws/events{/privacy}", "received_events_url": "https://api.github.com/users/mario-aws/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T03:54:28
2025-04-10T14:58:58
2025-04-10T14:58:58
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37123", "html_url": "https://github.com/huggingface/transformers/pull/37123", "diff_url": "https://github.com/huggingface/transformers/pull/37123.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37123.patch", "merged_at": "2025-04-10T14:58:58" }
# What does this PR do? Previously, the identity function was used for dropped tokens, with a weight from the expert that was not applied to the hidden states. This was misleading, because dropping means the expert weight is zero. Instead of trying to fix the weight, we take the easier approach of initializing with zeros. Fixes #37017 ## Related work https://github.com/tensorflow/mesh/blob/e6798a2610a2c2f4c4cd236d8214422cb1ecc00a/mesh_tensorflow/transformer/moe.py#L1144 mentions that it needs to be zeroed out. https://github.com/tensorflow/mesh/blob/master/mesh_tensorflow/transformer/moe.py#L507C18-L507C31 combines the results without any clone initialization beforehand. ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests? ## Who can review? text models: @ArthurZucker last major changing person: @zucchini-nlp person who requested PR: @Rocketknight1
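The zero-initialization fix described above can be sketched as follows. This is a simplified single-expert-per-token routing loop, not the SwitchTransformer modeling code; all names are illustrative. The key point is that the output buffer starts as zeros, so a token dropped for exceeding expert capacity contributes exactly zero, matching the mesh-tensorflow reference.

```python
import numpy as np

def moe_forward_with_drops(hidden, expert_index, expert_weights, experts, capacity_mask):
    """Combine expert outputs, zeroing out dropped tokens.

    hidden: (tokens, dim) input states; expert_index: (tokens,) chosen
    expert per token; expert_weights: (tokens,) router probability of
    that expert; capacity_mask: (tokens,) True if the token fit within
    expert capacity. Names are illustrative, not the modeling code's.
    """
    # Zeros, not a clone of `hidden`: dropped tokens stay exactly zero,
    # which is what "dropping" means for the expert contribution.
    out = np.zeros_like(hidden)
    for t in range(hidden.shape[0]):
        if capacity_mask[t]:
            out[t] = expert_weights[t] * experts[expert_index[t]](hidden[t])
    return out
```

With the previous identity-based scheme, a dropped token would instead have passed through its hidden state scaled by a router weight that the expert never actually applied.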
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37123/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37123/timeline
null
null
null
null
true
true
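The fix described in the record above (initializing the combined expert output with zeros, so that dropped tokens contribute nothing) can be illustrated with a framework-free toy sketch. The token layout, weights, and helper name below are invented for illustration and are not the actual Switch Transformers code:

```python
def combine_expert_outputs(num_tokens, dim, routed):
    # routed: list of (token_index, router_weight, expert_output_vector).
    # Tokens dropped by capacity limits simply do not appear in `routed`.
    # Starting from zeros means a dropped token's output is zero, rather
    # than accidentally passing its input through with an unapplied weight.
    combined = [[0.0] * dim for _ in range(num_tokens)]
    for idx, weight, vec in routed:
        for d in range(dim):
            combined[idx][d] += weight * vec[d]
    return combined

# Token 1 is "dropped" (no expert processed it): its row stays all zeros.
out = combine_expert_outputs(3, 2, [(0, 0.5, [2.0, 4.0]), (2, 1.0, [1.0, 1.0])])
```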
https://api.github.com/repos/huggingface/transformers/issues/37122
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37122/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37122/comments
https://api.github.com/repos/huggingface/transformers/issues/37122/events
https://github.com/huggingface/transformers/issues/37122
2,959,388,719
I_kwDOCUB6oc6wZLAv
37,122
Bug in Phi4 processor
{ "login": "insujang", "id": 4559373, "node_id": "MDQ6VXNlcjQ1NTkzNzM=", "avatar_url": "https://avatars.githubusercontent.com/u/4559373?v=4", "gravatar_id": "", "url": "https://api.github.com/users/insujang", "html_url": "https://github.com/insujang", "followers_url": "https://api.github.com/users/insujang/followers", "following_url": "https://api.github.com/users/insujang/following{/other_user}", "gists_url": "https://api.github.com/users/insujang/gists{/gist_id}", "starred_url": "https://api.github.com/users/insujang/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/insujang/subscriptions", "organizations_url": "https://api.github.com/users/insujang/orgs", "repos_url": "https://api.github.com/users/insujang/repos", "events_url": "https://api.github.com/users/insujang/events{/privacy}", "received_events_url": "https://api.github.com/users/insujang/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-03-31T03:38:22
2025-04-15T13:52:16
2025-04-15T13:52:16
CONTRIBUTOR
null
null
null
null
### System Info - `transformers` version: 4.51.0.dev0 (Commit 0d6a60f) - Platform: Linux-5.14.0-503.22.1.el9_5.x86_64-x86_64-with-glibc2.35 - Python version: 3.11.11 - Huggingface_hub version: 0.29.3 - Safetensors version: 0.5.3 - Accelerate version: 1.5.1 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (GPU?): 2.6.0+cu126 (True) - Tensorflow version (GPU?): not installed (NA) - Flax version (CPU?/GPU?/TPU?): not installed (NA) - Jax version: not installed - JaxLib version: not installed ### Who can help? @Cyrilvallez https://github.com/huggingface/transformers/blob/0d6a60fe55fe051a1a68f2026d19223ed57b3c75/src/transformers/models/phi4_multimodal/processing_phi4_multimodal.py#L135-L136 How does `Phi4MultimodalProcessor` configure the `image_token` and `audio_token` on the tokenizer? The lines above try to retrieve the tokens from the tokenizer, but I cannot find where they are initialized. As a result, when I run the following code, the processor cannot properly generate an input: ```python processor = Phi4MultimodalProcessor.from_pretrained("microsoft/Phi-4-multimodal-instruct") image = Image.fromarray(generate_random_image(resolution=(720, 480))) inputs = processor(text="<|image_1|>", images=image, return_tensors="pt").to(dtype=torch.bfloat16, device="cuda") ``` ``` image = Image.fromarray(generate_random_image(resolution=(720, 480))) ---> inputs = processor(text="<|audio_1|>", audios=audio, return_tensors="pt").to(dtype=torch.bfloat16, device="cuda") inputs["labels"] = inputs["input_ids"].clone() outputs = model(**inputs) File /opt/conda/lib/python3.11/site-packages/transformers/models/phi4_multimodal/processing_phi4_multimodal.py:135, in Phi4MultimodalProcessor.__call__(self, text, images, audios, **kwargs) elif not isinstance(text, list) and not isinstance(text[0], str): raise ValueError("Invalid input text. Please provide a string, or a list of strings") --> image_token = self.tokenizer.image_token audio_token = self.tokenizer.audio_token processed_text = [re.sub(self.fake_image_token_pattern, image_token, t) for t in text] File /opt/conda/lib/python3.11/site-packages/transformers/tokenization_utils_base.py:1108, in SpecialTokensMixin.__getattr__(self, key) return self.convert_tokens_to_ids(attr_as_tokens) if attr_as_tokens is not None else None if key not in self.__dict__: -> raise AttributeError(f"{self.__class__.__name__} has no attribute {key}") else: return super().__getattr__(key) AttributeError: GPT2TokenizerFast has no attribute image_token ``` ### Information - [ ] The official example scripts - [x] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [x] My own task or dataset (give details below) ### Reproduction - Use the following function to generate a fake image: ```python import numpy as np from PIL import Image def generate_random_image(resolution:tuple[int, int]): width, height = resolution image = Image.fromarray(np.random.randint(0, 256, size=(height, width, 3), dtype=np.uint8)) return image ``` - Call processor ```python processor = Phi4MultimodalProcessor.from_pretrained("microsoft/Phi-4-multimodal-instruct") image = generate_random_image(resolution=(720, 480)) inputs = processor(text="<|image_1|>", images=image, return_tensors="pt").to(dtype=torch.bfloat16, device="cuda") ``` ### Expected behavior Return an output with `input_ids`, `attention_mask`, `pixel_values`, etc., without an error.
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37122/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37122/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
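The `AttributeError` in the record above comes from attribute-style lookup of special tokens on the tokenizer. A minimal stand-in (the class name and token string below are invented for illustration, not the actual `transformers` implementation) shows why the lookup fails when the token was never registered:

```python
class ToyTokenizer:
    # Mimics attribute-style special-token access: registered extra special
    # tokens resolve to their string, while unknown names raise
    # AttributeError, roughly like SpecialTokensMixin.__getattr__.
    def __init__(self, extra_special_tokens=None):
        self._extra = dict(extra_special_tokens or {})

    def __getattr__(self, key):
        # Only called when normal attribute lookup fails.
        extra = object.__getattribute__(self, "_extra")
        if key in extra:
            return extra[key]
        raise AttributeError(f"{type(self).__name__} has no attribute {key}")

tok = ToyTokenizer()  # image_token never configured -> same failure mode
try:
    tok.image_token
    failed = False
except AttributeError:
    failed = True

# When the token IS registered, the lookup succeeds.
tok_ok = ToyTokenizer({"image_token": "<|image_placeholder|>"})
```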
https://api.github.com/repos/huggingface/transformers/issues/37121
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37121/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37121/comments
https://api.github.com/repos/huggingface/transformers/issues/37121/events
https://github.com/huggingface/transformers/pull/37121
2,959,364,914
PR_kwDOCUB6oc6Qt7Ey
37,121
fix XPU UT error case brought by RNG difference between XPU and CUDA
{ "login": "yao-matrix", "id": 7245027, "node_id": "MDQ6VXNlcjcyNDUwMjc=", "avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yao-matrix", "html_url": "https://github.com/yao-matrix", "followers_url": "https://api.github.com/users/yao-matrix/followers", "following_url": "https://api.github.com/users/yao-matrix/following{/other_user}", "gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}", "starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions", "organizations_url": "https://api.github.com/users/yao-matrix/orgs", "repos_url": "https://api.github.com/users/yao-matrix/repos", "events_url": "https://api.github.com/users/yao-matrix/events{/privacy}", "received_events_url": "https://api.github.com/users/yao-matrix/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T03:13:47
2025-04-01T23:41:16
2025-04-01T12:52:55
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37121", "html_url": "https://github.com/huggingface/transformers/pull/37121", "diff_url": "https://github.com/huggingface/transformers/pull/37121.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37121.patch", "merged_at": "2025-04-01T12:52:55" }
**Symptom** `pytest -rA tests/generation/test_logits_process.py::LogitsProcessorTest::test_watermarking_processor` asserts out with the logs below while running on XPU. > self.assertTrue((out[:, 1] == scores_wo_bias + watermark.bias).all()) > E AssertionError: tensor(False, device='xpu:0') is not true **Root cause** `WatermarkLogitsProcessor` uses a device RNG (`self.rng = torch.Generator(device=device)`) to generate random numbers. Since the RNG is device-dependent, the same seed produces different random numbers on different devices. In this case, CUDA's generated greenlist_ids is `greenlist_ids: tensor([19, 10, 8, 9, 1], device='cuda:0')`, while XPU's is `greenlist_ids: tensor([10, 17, 5, 4, 3], device='xpu:0')`. But the UT only checks token id 1, which leads to failure on XPU. **How to fix** Set the to-be-checked `greenlist_id` on XPU to 3: `greenlist_id = 3 if torch_device == "xpu" else 1`.
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/followers", "following_url": "https://api.github.com/users/gante/following{/other_user}", "gists_url": "https://api.github.com/users/gante/gists{/gist_id}", "starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gante/subscriptions", "organizations_url": "https://api.github.com/users/gante/orgs", "repos_url": "https://api.github.com/users/gante/repos", "events_url": "https://api.github.com/users/gante/events{/privacy}", "received_events_url": "https://api.github.com/users/gante/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37121/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37121/timeline
null
null
null
null
true
true
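The root cause in the record above, that the same seed does not guarantee the same random stream across different RNG implementations, can be illustrated without torch using two toy linear congruential generators (the constants below are arbitrary and merely stand in for the distinct CUDA and XPU device generators):

```python
def lcg(seed, a, c, m=2**31):
    # A tiny linear congruential generator; different (a, c) constants
    # stand in for different device-specific RNG implementations.
    state = seed
    while True:
        state = (a * state + c) % m
        yield state % 20  # map into a small "vocabulary" of token ids

def greenlist(gen, n=5):
    return [next(gen) for _ in range(n)]

# Same seed, different implementations -> different "greenlist" ids,
# so a unit test must not hard-code one device's sampled values.
ids_a = greenlist(lcg(42, a=1103515245, c=12345))
ids_b = greenlist(lcg(42, a=22695477, c=1))
```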
https://api.github.com/repos/huggingface/transformers/issues/37120
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37120/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37120/comments
https://api.github.com/repos/huggingface/transformers/issues/37120/events
https://github.com/huggingface/transformers/pull/37120
2,959,276,217
PR_kwDOCUB6oc6QtoUI
37,120
enable test_assisted_decoding_in_different_gpu UT on XPU
{ "login": "yao-matrix", "id": 7245027, "node_id": "MDQ6VXNlcjcyNDUwMjc=", "avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yao-matrix", "html_url": "https://github.com/yao-matrix", "followers_url": "https://api.github.com/users/yao-matrix/followers", "following_url": "https://api.github.com/users/yao-matrix/following{/other_user}", "gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}", "starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions", "organizations_url": "https://api.github.com/users/yao-matrix/orgs", "repos_url": "https://api.github.com/users/yao-matrix/repos", "events_url": "https://api.github.com/users/yao-matrix/events{/privacy}", "received_events_url": "https://api.github.com/users/yao-matrix/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-31T01:46:46
2025-04-01T23:39:26
2025-04-01T09:22:59
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37120", "html_url": "https://github.com/huggingface/transformers/pull/37120", "diff_url": "https://github.com/huggingface/transformers/pull/37120.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37120.patch", "merged_at": "2025-04-01T09:22:59" }
pytest -rA tests/generation/test_utils.py::GenerationIntegrationTests::test_assisted_decoding_in_different_gpu **passed**
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37120/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37120/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37119
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37119/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37119/comments
https://api.github.com/repos/huggingface/transformers/issues/37119/events
https://github.com/huggingface/transformers/pull/37119
2,959,202,318
PR_kwDOCUB6oc6QtY0x
37,119
Add FastImageProcessor for EfficientNet
{ "login": "chewyuenrachael", "id": 115143647, "node_id": "U_kgDOBtzz3w", "avatar_url": "https://avatars.githubusercontent.com/u/115143647?v=4", "gravatar_id": "", "url": "https://api.github.com/users/chewyuenrachael", "html_url": "https://github.com/chewyuenrachael", "followers_url": "https://api.github.com/users/chewyuenrachael/followers", "following_url": "https://api.github.com/users/chewyuenrachael/following{/other_user}", "gists_url": "https://api.github.com/users/chewyuenrachael/gists{/gist_id}", "starred_url": "https://api.github.com/users/chewyuenrachael/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/chewyuenrachael/subscriptions", "organizations_url": "https://api.github.com/users/chewyuenrachael/orgs", "repos_url": "https://api.github.com/users/chewyuenrachael/repos", "events_url": "https://api.github.com/users/chewyuenrachael/events{/privacy}", "received_events_url": "https://api.github.com/users/chewyuenrachael/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-03-31T00:26:31
2025-04-16T17:50:45
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37119", "html_url": "https://github.com/huggingface/transformers/pull/37119", "diff_url": "https://github.com/huggingface/transformers/pull/37119.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37119.patch", "merged_at": null }
# Add FastImageProcessor for EfficientNet ## What does this PR do? This PR implements a `FastImageProcessor` for the `EfficientNet` model using `BaseImageProcessorFast`. - Added `EfficientNetImageProcessorFast` to support fast GPU-based preprocessing using PyTorch/Torchvision ops - Verified default values (e.g., size, crop_size, normalization) match the existing slow processor - Cleaned out unused attributes (`default_to_square`, `do_convert_rgb`) after comparing both implementations - Updated the test file `test_image_processing_efficientnet.py` to support both fast and slow processors using the `image_processor_list` pattern - Verified that all relevant tests pass with `RUN_SLOW=1` This PR contributes to [#36978](https://github.com/huggingface/transformers/issues/36978) ## Before submitting - [x] I have read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request) - [x] This PR was initiated via a GitHub issue: #36978 - [x] I have matched default attributes between the slow and fast image processors - [x] I have updated test coverage to loop over both slow and fast processors - [x] I have verified all tests pass with `RUN_SLOW=1` - [x] I ran `black` and `make fixup` to format the code - [ ] Documentation did not need updating (no new public methods introduced) - [x] All new methods follow the docstring conventions ## Who can review? This contribution affects **vision models**. Tagging: @qubvel @amyeroberts for review. Thanks in advance! 🤗
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37119/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37119/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/37118
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37118/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37118/comments
https://api.github.com/repos/huggingface/transformers/issues/37118/events
https://github.com/huggingface/transformers/issues/37118
2,959,199,438
I_kwDOCUB6oc6wYczO
37,118
FastAPI with LLM inference does not release accumulated VRAM
{ "login": "variable", "id": 558175, "node_id": "MDQ6VXNlcjU1ODE3NQ==", "avatar_url": "https://avatars.githubusercontent.com/u/558175?v=4", "gravatar_id": "", "url": "https://api.github.com/users/variable", "html_url": "https://github.com/variable", "followers_url": "https://api.github.com/users/variable/followers", "following_url": "https://api.github.com/users/variable/following{/other_user}", "gists_url": "https://api.github.com/users/variable/gists{/gist_id}", "starred_url": "https://api.github.com/users/variable/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/variable/subscriptions", "organizations_url": "https://api.github.com/users/variable/orgs", "repos_url": "https://api.github.com/users/variable/repos", "events_url": "https://api.github.com/users/variable/events{/privacy}", "received_events_url": "https://api.github.com/users/variable/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-03-31T00:23:11
2025-04-02T13:11:03
2025-04-02T13:11:02
NONE
null
null
null
null
this is my fastapi snippet ``` model_name = "meta-llama/Llama-3.2-1B-Instruct" tokenizer = AutoTokenizer.from_pretrained(model_name) model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto", torch_dtype=torch.float16) @app.post("/unfluff") async def unfluff_llm(unfluff_request: UnfluffRequest): messages = [ {"role": "system", "content": "Summarise the given service order ticket content, answer should include action performed by helpdesk, don't include the meta info of the ticket such as status, type and priority, don't include when techician is added. Answer directly without any introductory phrases, just answer should include all key points in a concise way"}, {"role": "user", "content": unfluff_request.text} ] formatted_prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True) inputs = tokenizer(formatted_prompt, return_tensors="pt").to("cuda") with torch.no_grad(): output = model.generate(**inputs, temperature=0.5, top_p=0.9, max_new_tokens=150) # trim the user input from response new_tokens = output[0][inputs['input_ids'].shape[-1]:] response = tokenizer.decode(new_tokens, skip_special_tokens=True) return {'result': response} ``` Initially it started at around 3GB of VRAM (along with other unrelated things); after the LLM endpoint gets called, the VRAM starts to climb, peaking at about 22GB before OOM, given the video card only has 24GB. If no further calls are made to the endpoint, the VRAM is also not released. ### Who can help? _No response_ ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction keep calling the endpoint that serves the LLM query. ### Expected behavior Expected the VRAM would not increase over time.
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37118/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37118/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
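One common cause of the pattern in the record above (memory that climbs per request and is never released) is Python references keeping allocations alive across requests. A framework-free toy sketch, where `BigBuffer` and the module-level `history` list are invented stand-ins for GPU tensors and accidental global state, shows the difference:

```python
import gc
import weakref

class BigBuffer:
    """Stands in for a large GPU tensor."""

def make_handler():
    history = []  # state that outlives individual requests

    def handle_request(keep_reference):
        buf = BigBuffer()
        if keep_reference:
            history.append(buf)  # leaks: buffer survives the request
        # Return a weak reference so we can observe whether the buffer
        # was actually freed after the request finished.
        return weakref.ref(buf)

    return handle_request

handle_request = make_handler()
leaked = handle_request(keep_reference=True)
released = handle_request(keep_reference=False)
gc.collect()
# leaked() is still alive via `history`; released() has been freed.
```

In the actual FastAPI case, the analogous hygiene would be dropping references to `inputs`/`output` after each request and, if needed, calling `torch.cuda.empty_cache()`; this is an assumption about the cause, not a verified resolution of this issue.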
https://api.github.com/repos/huggingface/transformers/issues/37117
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37117/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37117/comments
https://api.github.com/repos/huggingface/transformers/issues/37117/events
https://github.com/huggingface/transformers/issues/37117
2,958,999,992
I_kwDOCUB6oc6wXsG4
37,117
Paligemma connector details
{ "login": "ytwang13", "id": 154277968, "node_id": "U_kgDOCTIYUA", "avatar_url": "https://avatars.githubusercontent.com/u/154277968?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ytwang13", "html_url": "https://github.com/ytwang13", "followers_url": "https://api.github.com/users/ytwang13/followers", "following_url": "https://api.github.com/users/ytwang13/following{/other_user}", "gists_url": "https://api.github.com/users/ytwang13/gists{/gist_id}", "starred_url": "https://api.github.com/users/ytwang13/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ytwang13/subscriptions", "organizations_url": "https://api.github.com/users/ytwang13/orgs", "repos_url": "https://api.github.com/users/ytwang13/repos", "events_url": "https://api.github.com/users/ytwang13/events{/privacy}", "received_events_url": "https://api.github.com/users/ytwang13/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-30T17:34:04
2025-03-31T09:00:19
2025-03-31T09:00:18
NONE
null
null
null
null
https://github.com/huggingface/transformers/blob/348f3285c5114159d2ff4933b4b8ae36866d01a7/src/transformers/models/paligemma/modeling_paligemma.py#L421 Here I noticed a normalization by sqrt(text_hidden_dim); why is this? I do not see this line in the official big_vision implementation.
{ "login": "molbap", "id": 39954772, "node_id": "MDQ6VXNlcjM5OTU0Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/molbap", "html_url": "https://github.com/molbap", "followers_url": "https://api.github.com/users/molbap/followers", "following_url": "https://api.github.com/users/molbap/following{/other_user}", "gists_url": "https://api.github.com/users/molbap/gists{/gist_id}", "starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/molbap/subscriptions", "organizations_url": "https://api.github.com/users/molbap/orgs", "repos_url": "https://api.github.com/users/molbap/repos", "events_url": "https://api.github.com/users/molbap/events{/privacy}", "received_events_url": "https://api.github.com/users/molbap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37117/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37117/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
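The scaling asked about in the record above divides the projected image features by sqrt(hidden_size). A toy numeric check (the hidden size and feature values below are invented; the reading that this counterbalances Gemma's multiplication of token embeddings by sqrt(hidden_size) is a plausible interpretation, not confirmed here) shows the operation:

```python
import math

hidden_size = 2048  # illustrative stand-in for text_config.hidden_size
image_features = [1.0, -2.0, 0.5]

# Divide each projected image feature by sqrt(hidden_size), so the image
# tokens end up on a scale comparable to the text embeddings (which Gemma
# multiplies by sqrt(hidden_size) on input).
scaled = [x / math.sqrt(hidden_size) for x in image_features]
```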
https://api.github.com/repos/huggingface/transformers/issues/37116
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37116/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37116/comments
https://api.github.com/repos/huggingface/transformers/issues/37116/events
https://github.com/huggingface/transformers/issues/37116
2,958,972,148
I_kwDOCUB6oc6wXlT0
37,116
modernBERT Duplicate Template Name
{ "login": "kpdowney", "id": 82908087, "node_id": "MDQ6VXNlcjgyOTA4MDg3", "avatar_url": "https://avatars.githubusercontent.com/u/82908087?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kpdowney", "html_url": "https://github.com/kpdowney", "followers_url": "https://api.github.com/users/kpdowney/followers", "following_url": "https://api.github.com/users/kpdowney/following{/other_user}", "gists_url": "https://api.github.com/users/kpdowney/gists{/gist_id}", "starred_url": "https://api.github.com/users/kpdowney/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kpdowney/subscriptions", "organizations_url": "https://api.github.com/users/kpdowney/orgs", "repos_url": "https://api.github.com/users/kpdowney/repos", "events_url": "https://api.github.com/users/kpdowney/events{/privacy}", "received_events_url": "https://api.github.com/users/kpdowney/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-03-30T16:41:54
2025-04-30T12:09:34
2025-04-30T12:09:33
NONE
null
null
null
null
### System Info transformers 4.50.3 torch 2.6.0 python 3.11 MacOS 15.3.2 Trying to run a simple test in a notebook I get the following error: RuntimeError: Failed to import transformers.models.modernbert.modeling_modernbert because of the following error (look up to see its traceback): duplicate template name I have completely uninstalled transformers and re-installed the latest version. I even tried to compile from git on the 4.51.0dev version to see if that resolved the issue. I also updated and reinstalled torch. Note that I tested this on both the base modernbert model and this simple code: from transformers import pipeline classifier = pipeline("zero-shot-classification",model="tasksource/ModernBERT-base-nli") The exact same result each time. Any assistance would be appreciated. Here is the full trace: config.json: 0%| | 0.00/5.54k [00:00<?, ?B/s] --------------------------------------------------------------------------- AssertionError Traceback (most recent call last) File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/transformers/utils/import_utils.py:1976, in _LazyModule._get_module(self, module_name)  1975 try: -> 1976 return importlib.import_module("." + module_name, self.__name__)  1977 except Exception as e: File /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/importlib/__init__.py:126, in import_module(name, package)  125 level += 1 --> 126 return _bootstrap._gcd_import(name[level:], package, level) File <frozen importlib._bootstrap>:1206, in _gcd_import(name, package, level) File <frozen importlib._bootstrap>:1178, in _find_and_load(name, import_) File <frozen importlib._bootstrap>:1149, in _find_and_load_unlocked(name, import_) File <frozen importlib._bootstrap>:690, in _load_unlocked(spec) File <frozen importlib._bootstrap_external>:940, in exec_module(self, module) File <frozen importlib._bootstrap>:241, in _call_with_frames_removed(f, *args, **kwds) File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/transformers/models/modernbert/modeling_modernbert.py:200  197 return f"dim={self.dim}, base={self.base}, scale_base={self.scale_base}" --> 200 class ModernBertEmbeddings(nn.Module):  201  """  202  Same as BertEmbeddings with a tiny tweak for positional embeddings indexing.  203  """ File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/transformers/models/modernbert/modeling_modernbert.py:212, in ModernBertEmbeddings()  210 self.drop = nn.Dropout(config.embedding_dropout) --> 212 @torch.compile(dynamic=True)  213 def compiled_embeddings(self, input_ids: torch.LongTensor) -> torch.Tensor:  214  return self.drop(self.norm(self.tok_embeddings(input_ids))) File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/torch/__init__.py:2536, in compile.<locals>.fn(model)  2535 raise RuntimeError("Model can't be None") -> 2536 return compile(  2537  model,  2538  fullgraph=fullgraph,  2539  dynamic=dynamic,  2540  backend=backend,  2541  mode=mode,  2542  options=options,  2543  disable=disable,  2544 ) File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/torch/__init__.py:2565, in compile(model, fullgraph, dynamic, backend, mode, options, disable)  2563 backend = _TorchCompileWrapper(backend, mode, options, dynamic) -> 2565 return torch._dynamo.optimize(  2566  backend=backend,  2567  nopython=fullgraph,  2568  dynamic=dynamic,  2569  disable=disable,  2570 )(model) File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/torch/_dynamo/eval_frame.py:842, in optimize(*args, **kwargs)  840 return optimize(*args, **kwargs) --> 842 return _optimize(rebuild_ctx, *args, **kwargs) File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/torch/_dynamo/eval_frame.py:917, in _optimize(rebuild_ctx, backend, nopython, guard_export_fn, guard_fail_fn, disable, dynamic)  908 # The backend function is stashed in the callable returned by  909 # _optimize_catch_errors in the field _torchdynamo_orig_callable. This can  910 # be used by eval_frame.c to insert a guard on the backend.  911 return _optimize_catch_errors(  912 convert_frame.convert_frame(backend, hooks=hooks),  913 hooks,  914 backend_ctx_ctor,  915 dynamic=dynamic,  916 compiler_config=( --> 917 backend.get_compiler_config()  918 if hasattr(backend, "get_compiler_config")  919 else None  920 ),  921 rebuild_ctx=rebuild_ctx,  922 ) File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/torch/__init__.py:2343, in _TorchCompileInductorWrapper.get_compiler_config(self)  2342 def get_compiler_config(self): -> 2343 from torch._inductor.compile_fx import get_patched_config_dict  2345 return get_patched_config_dict(config_patches=self.config) File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/torch/_inductor/compile_fx.py:97  96 from .decomposition import select_decomp_table ---> 97 from .fx_passes.joint_graph import joint_graph_passes  98 from .fx_passes.post_grad import post_grad_passes, view_to_reshape File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/torch/_inductor/fx_passes/joint_graph.py:22  21 from .. import config ---> 22 from ..pattern_matcher import (  23 CallFunction,  24 init_once_fakemode,  25 KeywordArg,  26 Match,  27 MULTIPLE,  28 PatternMatcherPass,  29 register_graph_pattern,  30 stable_topological_sort,  31 )  32 from .replace_random import replace_random_passes File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/torch/_inductor/pattern_matcher.py:95  94 from .decomposition import select_decomp_table ---> 95 from .lowering import fallback_node_due_to_unsupported_type  98 log = logging.getLogger(__name__) File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/torch/_inductor/lowering.py:6518  6515 from . import kernel -> 6518 import_submodule(kernel)  6520 from . import quantized_lowerings File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/torch/_dynamo/utils.py:2691, in import_submodule(mod)  2690 if filename.endswith(".py") and filename[0] != "_": -> 2691 importlib.import_module(f"{mod.__name__}.{filename[:-3]}") File /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/importlib/__init__.py:126, in import_module(name, package)  125 level += 1 --> 126 return _bootstrap._gcd_import(name[level:], package, level) File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/torch/_inductor/kernel/bmm.py:52  49 return mm_configs(m, n, k) ---> 52 bmm_template = TritonTemplate(  53  name="bmm",  54  grid=bmm_grid,  55  source=r"""  56 {{def_kernel("A", "B")}}  57  M = {{size("A", -2)}}  58  N = {{size("B", -1)}}  59  K = {{size("A", -1)}}  60  61  stride_aq = {{stride("A", 0)}}  62  stride_am = {{stride("A", 1)}}  63  stride_ak = {{stride("A", 2)}}  64  65  stride_bq = {{stride("B", 0)}}  66  stride_bk = {{stride("B", 1)}}  67  stride_bn = {{stride("B", 2)}}  68  69  # based on triton.ops.matmul  70  pid = tl.program_id(0)  71  grid_m = (M + BLOCK_M - 1) // BLOCK_M  72  grid_n = (N + BLOCK_N - 1) // BLOCK_N  73  74  # re-order program ID for better L2 performance  75  width = GROUP_M * grid_n  76  group_id = pid // width  77  group_size = min(grid_m - group_id * GROUP_M, GROUP_M)  78  pid_m = group_id * GROUP_M + (pid % group_size)  79  pid_n = (pid % width) // (group_size)  80  81  rm = pid_m * BLOCK_M + tl.arange(0, BLOCK_M)  82  rn = pid_n * BLOCK_N + tl.arange(0, BLOCK_N)  83  if (stride_am == 1 and stride_ak == M) or (stride_am == K and stride_ak == 1):  84  ram = tl.max_contiguous(tl.multiple_of(rm % M, BLOCK_M), BLOCK_M)  85  else:  86  ram = rm % M  87  if (stride_bk == 1 and stride_bn == K) or (stride_bk == N and stride_bn == 1):  88  rbn = tl.max_contiguous(tl.multiple_of(rn % N, BLOCK_N), BLOCK_N)  89  else:  90  rbn = rn % N  
91  92  rk = tl.arange(0, BLOCK_K)  93  94  idx_q = tl.program_id(1) # batch dimension for BMM  95  A = A + (ram[:, None] * stride_am + rk[None, :] * stride_ak + idx_q*stride_aq)  96  B = B + (rk[:, None] * stride_bk + rbn[None, :] * stride_bn + idx_q*stride_bq)  97  98  acc = tl.zeros((BLOCK_M, BLOCK_N), dtype=ACC_TYPE)  99  for k in range(K, 0, -BLOCK_K):  100  if EVEN_K:  101  a = tl.load(A)  102  b = tl.load(B)  103  else:  104  a = tl.load(A, mask=rk[None, :] < k, other=0.)  105  b = tl.load(B, mask=rk[:, None] < k, other=0.)  106  acc += tl.dot(a, b, allow_tf32=ALLOW_TF32)  107  A += BLOCK_K * stride_ak  108  B += BLOCK_K * stride_bk  109  110  # rematerialize rm and rn to save registers  111  rm = pid_m * BLOCK_M + tl.arange(0, BLOCK_M)  112  rn = pid_n * BLOCK_N + tl.arange(0, BLOCK_N)  113  idx_q = tl.program_id(1) # batch dimension for BMM  114  idx_m = rm[:, None]  115  idx_n = rn[None, :]  116  mask = (idx_m < M) & (idx_n < N)  117  118  # inductor generates a suffix  119  {{store_output(("idx_q", "idx_m", "idx_n"), "acc", "mask")}}  120 """,  121 )  123 aten_bmm = ExternKernelChoice(torch.bmm, "at::bmm_out") File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/torch/_inductor/select_algorithm.py:767, in TritonTemplate.__init__(self, name, grid, source, debug)  766 self.template = self._template_from_string(source) --> 767 assert name not in self.all_templates, "duplicate template name"  768 self.all_templates[name] = self AssertionError: duplicate template name The above exception was the direct cause of the following exception: RuntimeError Traceback (most recent call last) Cell In [2], line 4  2 import torch  3 from transformers import pipeline ----> 4 classifier = pipeline("zero-shot-classification",model="tasksource/ModernBERT-base-nli") File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/transformers/pipelines/__init__.py:942, in pipeline(task, model, config, 
tokenizer, feature_extractor, image_processor, processor, framework, revision, use_fast, token, device, device_map, torch_dtype, trust_remote_code, model_kwargs, pipeline_class, **kwargs)  940 if isinstance(model, str) or framework is None:  941 model_classes = {"tf": targeted_task["tf"], "pt": targeted_task["pt"]} --> 942 framework, model = infer_framework_load_model(  943  adapter_path if adapter_path is not None else model,  944  model_classes=model_classes,  945  config=config,  946  framework=framework,  947  task=task,  948  **hub_kwargs,  949  **model_kwargs,  950  )  952 model_config = model.config  953 hub_kwargs["_commit_hash"] = model.config._commit_hash File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/transformers/pipelines/base.py:262, in infer_framework_load_model(model, config, model_classes, task, framework, **model_kwargs)  260 transformers_module = importlib.import_module("transformers")  261 if look_pt: --> 262 _class = getattr(transformers_module, architecture, None)  263 if _class is not None:  264 classes.append(_class) File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/transformers/utils/import_utils.py:1965, in _LazyModule.__getattr__(self, name)  1963 elif name in self._class_to_module.keys():  1964 module = self._get_module(self._class_to_module[name]) -> 1965 value = getattr(module, name)  1966 elif name in self._modules:  1967 value = self._get_module(name) File ~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/transformers/utils/import_utils.py:1964, in _LazyModule.__getattr__(self, name)  1962 value = Placeholder  1963 elif name in self._class_to_module.keys(): -> 1964 module = self._get_module(self._class_to_module[name])  1965 value = getattr(module, name)  1966 elif name in self._modules: File 
~/Documents/Companies/Sentinel/Codebase/Python/2023_Code/venv/lib/python3.11/site-packages/transformers/utils/import_utils.py:1978, in _LazyModule._get_module(self, module_name)  1976 return importlib.import_module("." + module_name, self.__name__)  1977 except Exception as e: -> 1978 raise RuntimeError(  1979 f"Failed to import {self.__name__}.{module_name} because of the following error (look up to see its"  1980 f" traceback):\n{e}"  1981 ) from e RuntimeError: Failed to import transformers.models.modernbert.modeling_modernbert because of the following error (look up to see its traceback): duplicate template name ### Who can help? _No response_ ### Information - [x] The official example scripts - [ ] My own modified scripts ### Tasks - [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction To reproduce this I simply did: from transformers import pipeline classifier = pipeline("zero-shot-classification",model="tasksource/ModernBERT-base-nli") ### Expected behavior The model would download from source.
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37116/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37116/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/37115
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37115/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37115/comments
https://api.github.com/repos/huggingface/transformers/issues/37115/events
https://github.com/huggingface/transformers/pull/37115
2,958,861,221
PR_kwDOCUB6oc6QsWOs
37,115
chore: Update model doc for code_llama
{ "login": "AbhishekRP2002", "id": 86261428, "node_id": "MDQ6VXNlcjg2MjYxNDI4", "avatar_url": "https://avatars.githubusercontent.com/u/86261428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/AbhishekRP2002", "html_url": "https://github.com/AbhishekRP2002", "followers_url": "https://api.github.com/users/AbhishekRP2002/followers", "following_url": "https://api.github.com/users/AbhishekRP2002/following{/other_user}", "gists_url": "https://api.github.com/users/AbhishekRP2002/gists{/gist_id}", "starred_url": "https://api.github.com/users/AbhishekRP2002/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/AbhishekRP2002/subscriptions", "organizations_url": "https://api.github.com/users/AbhishekRP2002/orgs", "repos_url": "https://api.github.com/users/AbhishekRP2002/repos", "events_url": "https://api.github.com/users/AbhishekRP2002/events{/privacy}", "received_events_url": "https://api.github.com/users/AbhishekRP2002/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-30T12:56:11
2025-04-03T17:09:41
2025-04-03T17:09:41
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37115", "html_url": "https://github.com/huggingface/transformers/pull/37115", "diff_url": "https://github.com/huggingface/transformers/pull/37115.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37115.patch", "merged_at": "2025-04-03T17:09:41" }
# What does this PR do?

Aims to handle https://github.com/huggingface/transformers/issues/36979#issuecomment-2758560598, a sub-part of https://github.com/huggingface/transformers/issues/36979.

([36979](https://github.com/huggingface/transformers/issues/36979))

## Before submitting

- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?

## Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. @stevhliu
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37115/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37115/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37114
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37114/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37114/comments
https://api.github.com/repos/huggingface/transformers/issues/37114/events
https://github.com/huggingface/transformers/pull/37114
2,958,854,190
PR_kwDOCUB6oc6QsU3D
37,114
RWKV: fix mask warning typo
{ "login": "RobinKa", "id": 2614101, "node_id": "MDQ6VXNlcjI2MTQxMDE=", "avatar_url": "https://avatars.githubusercontent.com/u/2614101?v=4", "gravatar_id": "", "url": "https://api.github.com/users/RobinKa", "html_url": "https://github.com/RobinKa", "followers_url": "https://api.github.com/users/RobinKa/followers", "following_url": "https://api.github.com/users/RobinKa/following{/other_user}", "gists_url": "https://api.github.com/users/RobinKa/gists{/gist_id}", "starred_url": "https://api.github.com/users/RobinKa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/RobinKa/subscriptions", "organizations_url": "https://api.github.com/users/RobinKa/orgs", "repos_url": "https://api.github.com/users/RobinKa/repos", "events_url": "https://api.github.com/users/RobinKa/events{/privacy}", "received_events_url": "https://api.github.com/users/RobinKa/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-30T12:40:18
2025-03-31T09:07:51
2025-03-31T09:07:51
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37114", "html_url": "https://github.com/huggingface/transformers/pull/37114", "diff_url": "https://github.com/huggingface/transformers/pull/37114.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37114.patch", "merged_at": "2025-03-31T09:07:51" }
`RwkvModel.forward` is supposed to warn if `attention_mask` is passed. Right now the check is `attention_mask is None`, which was presumably meant to be `attention_mask is not None`; this PR fixes that. @gante
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37114/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37114/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37113
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37113/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37113/comments
https://api.github.com/repos/huggingface/transformers/issues/37113/events
https://github.com/huggingface/transformers/pull/37113
2,958,776,006
PR_kwDOCUB6oc6QsFUD
37,113
Add Fast Mobilenet-V2 Processor
{ "login": "keetrap", "id": 103131112, "node_id": "U_kgDOBiWn6A", "avatar_url": "https://avatars.githubusercontent.com/u/103131112?v=4", "gravatar_id": "", "url": "https://api.github.com/users/keetrap", "html_url": "https://github.com/keetrap", "followers_url": "https://api.github.com/users/keetrap/followers", "following_url": "https://api.github.com/users/keetrap/following{/other_user}", "gists_url": "https://api.github.com/users/keetrap/gists{/gist_id}", "starred_url": "https://api.github.com/users/keetrap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/keetrap/subscriptions", "organizations_url": "https://api.github.com/users/keetrap/orgs", "repos_url": "https://api.github.com/users/keetrap/repos", "events_url": "https://api.github.com/users/keetrap/events{/privacy}", "received_events_url": "https://api.github.com/users/keetrap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-30T09:52:56
2025-04-14T15:08:48
2025-04-14T15:08:47
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37113", "html_url": "https://github.com/huggingface/transformers/pull/37113", "diff_url": "https://github.com/huggingface/transformers/pull/37113.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37113.patch", "merged_at": "2025-04-14T15:08:47" }
Related #36978 cc @yonigozlan
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37113/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37113/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/37112
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/37112/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/37112/comments
https://api.github.com/repos/huggingface/transformers/issues/37112/events
https://github.com/huggingface/transformers/pull/37112
2,958,718,081
PR_kwDOCUB6oc6Qr6Gl
37,112
Add Fast Image Processors for mobileViT
{ "login": "MinJu-Ha", "id": 101788861, "node_id": "U_kgDOBhEsvQ", "avatar_url": "https://avatars.githubusercontent.com/u/101788861?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MinJu-Ha", "html_url": "https://github.com/MinJu-Ha", "followers_url": "https://api.github.com/users/MinJu-Ha/followers", "following_url": "https://api.github.com/users/MinJu-Ha/following{/other_user}", "gists_url": "https://api.github.com/users/MinJu-Ha/gists{/gist_id}", "starred_url": "https://api.github.com/users/MinJu-Ha/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MinJu-Ha/subscriptions", "organizations_url": "https://api.github.com/users/MinJu-Ha/orgs", "repos_url": "https://api.github.com/users/MinJu-Ha/repos", "events_url": "https://api.github.com/users/MinJu-Ha/events{/privacy}", "received_events_url": "https://api.github.com/users/MinJu-Ha/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-03-30T07:31:58
2025-03-31T13:43:43
2025-03-31T13:37:29
CONTRIBUTOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/37112", "html_url": "https://github.com/huggingface/transformers/pull/37112", "diff_url": "https://github.com/huggingface/transformers/pull/37112.diff", "patch_url": "https://github.com/huggingface/transformers/pull/37112.patch", "merged_at": null }
related to #36978 cc: @yonigozlan

I added a fast image processor for MobileViT and noticed a significant difference between the outputs after preprocessing. Here's the code I used to compare them:

```
diff = (encoding_slow.pixel_values - encoding_fast.pixel_values).abs()
print(f"\n📊 Difference statistics:")
print(f"   Max difference: {diff.max().item():.10f}")
print(f"   Mean difference: {diff.mean().item():.10f}")
print(f"   Slow min/max: {encoding_slow.pixel_values.min().item():.10f} ~ {encoding_slow.pixel_values.max().item():.10f}")
print(f"   Fast min/max: {encoding_fast.pixel_values.min().item():.10f} ~ {encoding_fast.pixel_values.max().item():.10f}")
print(f"Slow implementation dtype: {encoding_slow.pixel_values.dtype}")
print(f"Fast implementation dtype: {encoding_fast.pixel_values.dtype}")
```

results:

```
📊 Difference statistics:
   Max difference: 0.3411765397
   Mean difference: 0.1117687449
   Slow min/max: 0.0313725509 ~ 0.9764705896
   Fast min/max: 0.0313725509 ~ 0.9764706492
Slow implementation dtype: torch.float32
Fast implementation dtype: torch.float32
```

Even though the size configs look the same (`{'shortest_edge': 20}`) and both use torch.float32, the output difference seems quite significant for a slow/fast equivalence test.
{ "login": "MinJu-Ha", "id": 101788861, "node_id": "U_kgDOBhEsvQ", "avatar_url": "https://avatars.githubusercontent.com/u/101788861?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MinJu-Ha", "html_url": "https://github.com/MinJu-Ha", "followers_url": "https://api.github.com/users/MinJu-Ha/followers", "following_url": "https://api.github.com/users/MinJu-Ha/following{/other_user}", "gists_url": "https://api.github.com/users/MinJu-Ha/gists{/gist_id}", "starred_url": "https://api.github.com/users/MinJu-Ha/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MinJu-Ha/subscriptions", "organizations_url": "https://api.github.com/users/MinJu-Ha/orgs", "repos_url": "https://api.github.com/users/MinJu-Ha/repos", "events_url": "https://api.github.com/users/MinJu-Ha/events{/privacy}", "received_events_url": "https://api.github.com/users/MinJu-Ha/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/37112/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/37112/timeline
null
null
null
null
true
true