url: string
repository_url: string
labels_url: string
comments_url: string
events_url: string
html_url: string
id: int64
node_id: string
number: int64
title: string
user: dict
labels: list
state: string
locked: bool
assignee: dict
assignees: list
milestone: null
comments: list
created_at: timestamp[ms]
updated_at: timestamp[ms]
closed_at: timestamp[ms]
author_association: string
type: dict
active_lock_reason: null
draft: bool
pull_request: dict
body: string
closed_by: dict
reactions: dict
timeline_url: string
performed_via_github_app: null
state_reason: string
sub_issues_summary: dict
issue_dependencies_summary: dict
is_pull_request: bool
is_closed: bool
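The last two columns of the schema (`is_pull_request`, `is_closed`) are derived from the raw API fields rather than returned by GitHub. A minimal sketch of that derivation in Python, assuming each record is a plain dict shaped like the rows below (the helper name is hypothetical, not part of any library):

```python
def add_derived_fields(record: dict) -> dict:
    """Attach the two derived flags used by this dataset's schema.

    A GitHub issue object doubles as a pull request exactly when the
    API response carries a non-null "pull_request" key; "state" is
    either "open" or "closed" for both issues and PRs.
    """
    record = dict(record)  # shallow copy; leave the input untouched
    record["is_pull_request"] = record.get("pull_request") is not None
    record["is_closed"] = record.get("state") == "closed"
    return record

# Example mirroring the first row below: an open issue, not a PR.
row = {"state": "open", "pull_request": None}
print(add_derived_fields(row))
```

The same rule explains why the rows below show `is_pull_request: true` whenever `pull_request` holds a JSON object and `false` when it is null.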

url: https://api.github.com/repos/huggingface/transformers/issues/41643
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/41643/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/41643/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/41643/events
html_url: https://github.com/huggingface/transformers/issues/41643
id: 3520608507
node_id: I_kwDOCUB6oc7R2Dj7
number: 41643
title: Qwen3VL input_ids = input_ids[attention_mask[i] == 1]
user: { "login": "bent1e", "id": 50535454, "node_id": "MDQ6VXNlcjUwNTM1NDU0", "avatar_url": "https://avatars.githubusercontent.com/u/50535454?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bent1e", "html_url": "https://github.com/bent1e", "followers_url": "https://api.github.com/users/bent1e/followers", "following_url": "https://api.github.com/users/bent1e/following{/other_user}", "gists_url": "https://api.github.com/users/bent1e/gists{/gist_id}", "starred_url": "https://api.github.com/users/bent1e/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bent1e/subscriptions", "organizations_url": "https://api.github.com/users/bent1e/orgs", "repos_url": "https://api.github.com/users/bent1e/repos", "events_url": "https://api.github.com/users/bent1e/events{/privacy}", "received_events_url": "https://api.github.com/users/bent1e/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
labels: [ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null }, { "id": 5769473378, "node_id": "LA_kwDOCUB6oc8AAAABV-MtYg", "url": "https://api.github.com/repos/huggingface/transformers/labels/Vision", "name": "Vision", "color": "C079EF", "default": false, "description": "" } ]
state: open
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-10-16T07:10:10
updated_at: 2025-10-22T09:46:46
closed_at: null
author_association: NONE
type: null
active_lock_reason: null
draft: null
pull_request: null
body: ### System Info `transformers==4.57.1` ### Who can help? _No response_ ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction ``` [rank0]: File "/root/miniconda3/lib/python3.11/site-packages/transformers/generation/utils.py", line 2787, in _sample [rank0]: outputs = model_forward(**model_inputs, return_dict=True) [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: File "/root/miniconda3/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl [rank0]: return self._call_impl(*args, **kwargs) [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: File "/root/miniconda3/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1845, in _call_impl [rank0]: return inner() [rank0]: ^^^^^^^ [rank0]: File "/root/miniconda3/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1793, in inner [rank0]: result = forward_call(*args, **kwargs) [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: File "/root/miniconda3/lib/python3.11/site-packages/transformers/utils/generic.py", line 1064, in wrapper [rank0]: outputs = func(self, *args, **kwargs) [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: File "/root/miniconda3/lib/python3.11/site-packages/transformers/models/qwen3_vl/modeling_qwen3_vl.py", line 1344, in forward [rank0]: outputs = self.model( [rank0]: ^^^^^^^^^^^ [rank0]: File "/root/miniconda3/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl [rank0]: return self._call_impl(*args, **kwargs) [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: File "/root/miniconda3/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl [rank0]: return forward_call(*args, **kwargs) [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: File "/root/miniconda3/lib/python3.11/site-packages/transformers/utils/generic.py", line 1064, in wrapper [rank0]: outputs = func(self, *args, **kwargs) [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: File "/root/miniconda3/lib/python3.11/site-packages/transformers/models/qwen3_vl/modeling_qwen3_vl.py", line 1201, in forward [rank0]: position_ids, rope_deltas = self.get_rope_index( [rank0]: ^^^^^^^^^^^^^^^^^^^^ [rank0]: File "/root/miniconda3/lib/python3.11/site-packages/transformers/models/qwen3_vl/modeling_qwen3_vl.py", line 949, in get_rope_index [rank0]: input_ids = input_ids[attention_mask[i] == 1] [rank0]: ~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: IndexError: The shape of the mask [719] at index 0 does not match the shape of the indexed tensor [1] at index 0 ``` ### Expected behavior error
closed_by: null
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/41643/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/41643/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
issue_dependencies_summary: { "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
is_pull_request: false
is_closed: false
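The traceback in issue 41643 boils down to boolean-mask indexing where the mask length (719, the full padded sequence) does not match the tensor being indexed (length 1, the single token passed per decoding step). The same failure mode can be reproduced with NumPy standing in for PyTorch; this is an illustrative sketch, not the library's code:

```python
import numpy as np

# During decoding only the newest token is passed, so input_ids has
# length 1, while the attention mask still covers the whole padded
# sequence (719 in the reported traceback).
input_ids = np.array([42])
attention_mask = np.ones(719, dtype=np.int64)

try:
    input_ids[attention_mask == 1]  # boolean index must match indexed shape
except IndexError as e:
    print("IndexError:", e)
```

The fix direction implied by the report is to only apply the mask when `input_ids` still spans the full sequence (i.e. on the prefill step), not on cached decoding steps.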

url: https://api.github.com/repos/huggingface/transformers/issues/41642
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/41642/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/41642/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/41642/events
html_url: https://github.com/huggingface/transformers/pull/41642
id: 3520588066
node_id: PR_kwDOCUB6oc6uBMWy
number: 41642
title: Fix confusing cls assignment
user: { "login": "cyyever", "id": 17618148, "node_id": "MDQ6VXNlcjE3NjE4MTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cyyever", "html_url": "https://github.com/cyyever", "followers_url": "https://api.github.com/users/cyyever/followers", "following_url": "https://api.github.com/users/cyyever/following{/other_user}", "gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}", "starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cyyever/subscriptions", "organizations_url": "https://api.github.com/users/cyyever/orgs", "repos_url": "https://api.github.com/users/cyyever/repos", "events_url": "https://api.github.com/users/cyyever/events{/privacy}", "received_events_url": "https://api.github.com/users/cyyever/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-10-16T07:02:16
updated_at: 2025-10-16T13:19:55
closed_at: 2025-10-16T13:01:08
author_association: CONTRIBUTOR
type: null
active_lock_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/41642", "html_url": "https://github.com/huggingface/transformers/pull/41642", "diff_url": "https://github.com/huggingface/transformers/pull/41642.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41642.patch", "merged_at": "2025-10-16T13:01:08" }
body: # What does this PR do? Don't overwrite `cls` incidentally in `__new__`.
closed_by: { "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/41642/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/41642/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: null
issue_dependencies_summary: null
is_pull_request: true
is_closed: true
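PR 41642 is about not rebinding the `cls` argument inside `__new__`. A hypothetical illustration of the anti-pattern it removes (not the actual transformers code): once `cls` is reassigned to a looked-up class, every later use of `cls` silently refers to the wrong class.

```python
class Base:
    _registry = {"special": dict}

    def __new__(cls, kind=None):
        # Anti-pattern: rebinding `cls` hides which class later code sees.
        if kind in cls._registry:
            cls = cls._registry[kind]  # `cls` is now `dict`, not `Base`
            return cls()
        # Clearer style: keep `cls` intact; use a fresh name for lookups.
        return super().__new__(cls)

print(type(Base("special")).__name__)  # prints "dict"
print(type(Base()).__name__)           # prints "Base"
```

Returning a different class's instance from `__new__` also skips `__init__`, which is exactly the kind of subtlety that gets harder to spot when `cls` no longer means "the class being constructed".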

url: https://api.github.com/repos/huggingface/transformers/issues/41641
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/41641/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/41641/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/41641/events
html_url: https://github.com/huggingface/transformers/pull/41641
id: 3520546902
node_id: PR_kwDOCUB6oc6uBDs_
number: 41641
title: Fix typos in documentation
user: { "login": "cyyever", "id": 17618148, "node_id": "MDQ6VXNlcjE3NjE4MTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cyyever", "html_url": "https://github.com/cyyever", "followers_url": "https://api.github.com/users/cyyever/followers", "following_url": "https://api.github.com/users/cyyever/following{/other_user}", "gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}", "starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cyyever/subscriptions", "organizations_url": "https://api.github.com/users/cyyever/orgs", "repos_url": "https://api.github.com/users/cyyever/repos", "events_url": "https://api.github.com/users/cyyever/events{/privacy}", "received_events_url": "https://api.github.com/users/cyyever/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-10-16T06:45:47
updated_at: 2025-10-16T12:59:23
closed_at: 2025-10-16T12:58:46
author_association: CONTRIBUTOR
type: null
active_lock_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/41641", "html_url": "https://github.com/huggingface/transformers/pull/41641", "diff_url": "https://github.com/huggingface/transformers/pull/41641.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41641.patch", "merged_at": "2025-10-16T12:58:46" }
body: # What does this PR do? As the title says. ## Before submitting - [X] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker @Cyrilvallez - vision models: @yonigozlan @molbap - audio models: @eustlb @ebezzam @vasqu - multimodal models: @zucchini-nlp - graph models: @clefourrier Library: - generate: @zucchini-nlp (visual-language models) or @gante (all others) - continuous batching: @remi-or @ArthurZucker @McPatate - pipelines: @Rocketknight1 - tokenizers: @ArthurZucker and @itazap - trainer: @SunMarc - attention: @vasqu @ArthurZucker @CyrilVallez - model loading (from pretrained, etc): @CyrilVallez - distributed: @3outeille @ArthurZucker - CIs: @ydshieh Integrations: - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization: @SunMarc @MekkCyber - kernels: @MekkCyber @drbh - peft: @BenjaminBossan @githubnemo Devices/Backends: - AMD ROCm: @ivarflakstad - Intel XPU: @IlyasMoutawwakil - Ascend NPU: @ivarflakstad Documentation: @stevhliu Research projects are not maintained and should be taken as is. -->
closed_by: { "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/41641/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/41641/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: null
issue_dependencies_summary: null
is_pull_request: true
is_closed: true

url: https://api.github.com/repos/huggingface/transformers/issues/41640
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/41640/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/41640/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/41640/events
html_url: https://github.com/huggingface/transformers/issues/41640
id: 3520516956
node_id: I_kwDOCUB6oc7R1tNc
number: 41640
title: AttributeError: BartTokenizerFast has no attribute image_token. Did you mean: 'mask_token'?
user: { "login": "conceptofmind", "id": 25208228, "node_id": "MDQ6VXNlcjI1MjA4MjI4", "avatar_url": "https://avatars.githubusercontent.com/u/25208228?v=4", "gravatar_id": "", "url": "https://api.github.com/users/conceptofmind", "html_url": "https://github.com/conceptofmind", "followers_url": "https://api.github.com/users/conceptofmind/followers", "following_url": "https://api.github.com/users/conceptofmind/following{/other_user}", "gists_url": "https://api.github.com/users/conceptofmind/gists{/gist_id}", "starred_url": "https://api.github.com/users/conceptofmind/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/conceptofmind/subscriptions", "organizations_url": "https://api.github.com/users/conceptofmind/orgs", "repos_url": "https://api.github.com/users/conceptofmind/repos", "events_url": "https://api.github.com/users/conceptofmind/events{/privacy}", "received_events_url": "https://api.github.com/users/conceptofmind/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
labels: [ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-10-16T06:34:02
updated_at: 2025-10-17T09:00:36
closed_at: 2025-10-17T09:00:36
author_association: NONE
type: null
active_lock_reason: null
draft: null
pull_request: null
body: ### System Info Ubuntu ### Who can help? _No response_ ### Information - [x] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction ```python import torch import requests from PIL import Image from transformers import AutoProcessor, Florence2ForConditionalGeneration model = Florence2ForConditionalGeneration.from_pretrained( "microsoft/Florence-2-large", dtype=torch.bfloat16, ) processor = AutoProcessor.from_pretrained("microsoft/Florence-2-large") url = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/car.jpg?download=true" image = Image.open(requests.get(url, stream=True).raw).convert("RGB") task_prompt = "<OD>" inputs = processor(text=task_prompt, images=image, return_tensors="pt").to(model.device, torch.bfloat16) generated_ids = model.generate( **inputs, max_new_tokens=1024, num_beams=3, ) generated_text = processor.batch_decode(generated_ids, skip_special_tokens=False)[0] image_size = image.size parsed_answer = processor.post_process_generation(generated_text, task=task_prompt, image_size=image_size) print(parsed_answer) ``` ### Expected behavior ``` raise AttributeError(f"{self.__class__.__name__} has no attribute {key}") AttributeError: BartTokenizerFast has no attribute image_token. Did you mean: 'mask_token'? ```
closed_by: { "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/41640/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/41640/timeline
performed_via_github_app: null
state_reason: completed
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
issue_dependencies_summary: { "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
is_pull_request: false
is_closed: true
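The error in issue 41640 comes from a wrapper forwarding attribute access to an inner tokenizer that lacks the requested attribute. The general pattern, including the "Did you mean" close-match hint, can be sketched with `difflib`; this is a hypothetical illustration (stub class names and all), not the transformers implementation:

```python
import difflib

class TokenizerStub:
    mask_token = "<mask>"
    pad_token = "<pad>"

class Processor:
    def __init__(self, tokenizer):
        self.tokenizer = tokenizer

    def __getattr__(self, key):
        # Called only for attributes not found on the Processor itself.
        # Forward to the wrapped tokenizer; if it lacks the attribute
        # too, fail with a close-match suggestion.
        tok = self.__dict__["tokenizer"]
        if hasattr(tok, key):
            return getattr(tok, key)
        hint = difflib.get_close_matches(key, dir(tok), n=1)
        suffix = f" Did you mean: '{hint[0]}'?" if hint else ""
        raise AttributeError(
            f"{type(tok).__name__} has no attribute {key}.{suffix}"
        )

proc = Processor(TokenizerStub())
try:
    proc.image_token
except AttributeError as e:
    print(e)
```

In the reported bug the Florence-2 processor expects an `image_token` on its tokenizer, but `BartTokenizerFast` defines none, so the delegation bottoms out in exactly this kind of AttributeError.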

url: https://api.github.com/repos/huggingface/transformers/issues/41639
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/41639/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/41639/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/41639/events
html_url: https://github.com/huggingface/transformers/issues/41639
id: 3520084630
node_id: I_kwDOCUB6oc7R0DqW
number: 41639
title: MarianMTModel performance regression due to Bidirectional masks
user: { "login": "jiqing-feng", "id": 107918818, "node_id": "U_kgDOBm614g", "avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jiqing-feng", "html_url": "https://github.com/jiqing-feng", "followers_url": "https://api.github.com/users/jiqing-feng/followers", "following_url": "https://api.github.com/users/jiqing-feng/following{/other_user}", "gists_url": "https://api.github.com/users/jiqing-feng/gists{/gist_id}", "starred_url": "https://api.github.com/users/jiqing-feng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jiqing-feng/subscriptions", "organizations_url": "https://api.github.com/users/jiqing-feng/orgs", "repos_url": "https://api.github.com/users/jiqing-feng/repos", "events_url": "https://api.github.com/users/jiqing-feng/events{/privacy}", "received_events_url": "https://api.github.com/users/jiqing-feng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
labels: [ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
state: open
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-10-16T02:21:38
updated_at: 2025-10-16T16:06:44
closed_at: null
author_association: CONTRIBUTOR
type: null
active_lock_reason: null
draft: null
pull_request: null
body: ### System Info torch 2.10.0.dev20251008+cpu ### Who can help? @vasqu ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction `numactl -C 0-3 --membind 0 python test.py` ```python import time import torch from transformers import pipeline, AutoTokenizer model_id = "Helsinki-NLP/opus-mt-mul-en" tokenizer = AutoTokenizer.from_pretrained(model_id) generator = pipeline("translation_fr_to_en", model=model_id, dtype=torch.float16, tokenizer=tokenizer) generation_config = generator.model.generation_config generation_config.max_new_tokens = 128 generation_config.min_new_tokens = 128 generation_config.do_sample = False generation_config.temperature = 1.0 generation_config.num_beams = 1 generation_config.cache_implementation="static" input_sentence = "C'est fait et soumis. Vous pouvez jouer à « Survival of the Tastiest » sur Android, et sur le web. Le jeu sur le web fonctionne, mais vous devez simuler plusieurs touches pour déplacer les tables, ce qui peut être un peu déroutant. Il y a beaucoup de choses dont j'aimerais parler. Je vais passer en revue chaque sujet, au lieu de faire la liste habituelle de ce qui s'est bien passé et de ce qui s'est mal passé. Concept Travailler sur le thème a probablement été l'une des tâches les plus difficiles auxquelles j'ai dû faire face. A l'origine, j'avais une idée du type de jeu que je voulais développer, du point de vue du gameplay - quelque chose avec beaucoup d'ennemis/acteurs" for _ in range(5): output = generator(input_sentence, generation_config=generation_config) for _ in range(5): start = time.time() output = generator(input_sentence, generation_config=generation_config) end = time.time() print(f"pipeline latency: {(end-start)*1000} ms") ``` ### Expected behavior The performance has 40% regression after PR [41265](https://github.com/huggingface/transformers/pull/41265). Same issue as #41566 . But the fix PR #41586 only fixes the Bert model regression, and didn't fix MarianMTModel performance regression.
closed_by: null
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/41639/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/41639/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
issue_dependencies_summary: { "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
is_pull_request: false
is_closed: false
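The reproduction script in issue 41639 follows the standard latency-benchmark pattern: a few untimed warmup iterations (to amortize cache setup, compilation, and allocator behavior), then timed iterations reported individually so variance is visible. The skeleton, stripped of the model specifics (the `workload` function here is a stand-in for the pipeline call):

```python
import time

def workload():
    # Stand-in for generator(input_sentence, ...); any repeatable work.
    return sum(i * i for i in range(100_000))

# Warmup: the first iterations pay one-time costs and would skew timings.
for _ in range(5):
    workload()

# Timed runs: perf_counter is monotonic and high resolution, so it is
# preferable to time.time() for measuring short intervals.
for _ in range(5):
    start = time.perf_counter()
    workload()
    print(f"latency: {(time.perf_counter() - start) * 1000:.2f} ms")
```

Comparing the per-iteration numbers before and after a suspect commit is how the reported 40% regression was isolated to PR 41265.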

url: https://api.github.com/repos/huggingface/transformers/issues/41638
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/41638/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/41638/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/41638/events
html_url: https://github.com/huggingface/transformers/pull/41638
id: 3519951732
node_id: PR_kwDOCUB6oc6t_J2K
number: 41638
title: Format MarkDown documentation and tiny fixes
user: { "login": "cyyever", "id": 17618148, "node_id": "MDQ6VXNlcjE3NjE4MTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cyyever", "html_url": "https://github.com/cyyever", "followers_url": "https://api.github.com/users/cyyever/followers", "following_url": "https://api.github.com/users/cyyever/following{/other_user}", "gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}", "starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cyyever/subscriptions", "organizations_url": "https://api.github.com/users/cyyever/orgs", "repos_url": "https://api.github.com/users/cyyever/repos", "events_url": "https://api.github.com/users/cyyever/events{/privacy}", "received_events_url": "https://api.github.com/users/cyyever/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-10-16T00:49:55
updated_at: 2025-10-16T12:59:32
closed_at: 2025-10-16T12:58:07
author_association: CONTRIBUTOR
type: null
active_lock_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/41638", "html_url": "https://github.com/huggingface/transformers/pull/41638", "diff_url": "https://github.com/huggingface/transformers/pull/41638.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41638.patch", "merged_at": "2025-10-16T12:58:07" }
body: # What does this PR do? This PR fixes MarkDown document issues detected by markdownlint-cli2. Some duplicated `autodoc` entries are also removed. ## Before submitting - [X] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker @Cyrilvallez - vision models: @yonigozlan @molbap - audio models: @eustlb @ebezzam @vasqu - multimodal models: @zucchini-nlp - graph models: @clefourrier Library: - generate: @zucchini-nlp (visual-language models) or @gante (all others) - continuous batching: @remi-or @ArthurZucker @McPatate - pipelines: @Rocketknight1 - tokenizers: @ArthurZucker and @itazap - trainer: @SunMarc - attention: @vasqu @ArthurZucker @CyrilVallez - model loading (from pretrained, etc): @CyrilVallez - distributed: @3outeille @ArthurZucker - CIs: @ydshieh Integrations: - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization: @SunMarc @MekkCyber - kernels: @MekkCyber @drbh - peft: @BenjaminBossan @githubnemo Devices/Backends: - AMD ROCm: @ivarflakstad - Intel XPU: @IlyasMoutawwakil - Ascend NPU: @ivarflakstad Documentation: @stevhliu Research projects are not maintained and should be taken as is. -->
closed_by: { "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/41638/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/41638/timeline
performed_via_github_app: null
state_reason: null
sub_issues_summary: null
issue_dependencies_summary: null
is_pull_request: true
is_closed: true

url: https://api.github.com/repos/huggingface/transformers/issues/41637
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/41637/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/41637/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/41637/events
html_url: https://github.com/huggingface/transformers/pull/41637
id: 3519769122
node_id: PR_kwDOCUB6oc6t-keq
number: 41637
title: Update benchmark_v2_a10_caller.yml
user: { "login": "raissalovesgudetama", "id": 236662036, "node_id": "U_kgDODhstFA", "avatar_url": "https://avatars.githubusercontent.com/u/236662036?v=4", "gravatar_id": "", "url": "https://api.github.com/users/raissalovesgudetama", "html_url": "https://github.com/raissalovesgudetama", "followers_url": "https://api.github.com/users/raissalovesgudetama/followers", "following_url": "https://api.github.com/users/raissalovesgudetama/following{/other_user}", "gists_url": "https://api.github.com/users/raissalovesgudetama/gists{/gist_id}", "starred_url": "https://api.github.com/users/raissalovesgudetama/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/raissalovesgudetama/subscriptions", "organizations_url": "https://api.github.com/users/raissalovesgudetama/orgs", "repos_url": "https://api.github.com/users/raissalovesgudetama/repos", "events_url": "https://api.github.com/users/raissalovesgudetama/events{/privacy}", "received_events_url": "https://api.github.com/users/raissalovesgudetama/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
labels: [ { "id": 9258341780, "node_id": "LA_kwDOCUB6oc8AAAACJ9cVlA", "url": "https://api.github.com/repos/huggingface/transformers/labels/Code%20agent%20slop", "name": "Code agent slop", "color": "C59579", "default": false, "description": "" } ]
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: []
created_at: 2025-10-15T22:33:40
updated_at: 2025-10-16T12:40:50
closed_at: 2025-10-16T12:40:50
author_association: NONE
type: null
active_lock_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/41637", "html_url": "https://github.com/huggingface/transformers/pull/41637", "diff_url": "https://github.com/huggingface/transformers/pull/41637.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41637.patch", "merged_at": null }
body: optional benchmark_tag input environment variables # What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker @Cyrilvallez - vision models: @yonigozlan @molbap - audio models: @eustlb @ebezzam @vasqu - multimodal models: @zucchini-nlp - graph models: @clefourrier Library: - generate: @zucchini-nlp (visual-language models) or @gante (all others) - continuous batching: @remi-or @ArthurZucker @McPatate - pipelines: @Rocketknight1 - tokenizers: @ArthurZucker and @itazap - trainer: @SunMarc - attention: @vasqu @ArthurZucker @CyrilVallez - model loading (from pretrained, etc): @CyrilVallez - distributed: @3outeille @ArthurZucker - CIs: @ydshieh Integrations: - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization: @SunMarc @MekkCyber - kernels: @MekkCyber @drbh - peft: @BenjaminBossan @githubnemo Devices/Backends: - AMD ROCm: @ivarflakstad - Intel XPU: @IlyasMoutawwakil - Ascend NPU: @ivarflakstad Documentation: @stevhliu Research projects are not maintained and should be taken as is. -->
closed_by: { "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41637/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41637/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41636
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41636/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41636/comments
https://api.github.com/repos/huggingface/transformers/issues/41636/events
https://github.com/huggingface/transformers/pull/41636
3,519,420,123
PR_kwDOCUB6oc6t9YJV
41,636
Adjust device logging level and add minor fixes
{ "login": "mario-koddenbrock", "id": 97628508, "node_id": "U_kgDOBdGxXA", "avatar_url": "https://avatars.githubusercontent.com/u/97628508?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mario-koddenbrock", "html_url": "https://github.com/mario-koddenbrock", "followers_url": "https://api.github.com/users/mario-koddenbrock/followers", "following_url": "https://api.github.com/users/mario-koddenbrock/following{/other_user}", "gists_url": "https://api.github.com/users/mario-koddenbrock/gists{/gist_id}", "starred_url": "https://api.github.com/users/mario-koddenbrock/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mario-koddenbrock/subscriptions", "organizations_url": "https://api.github.com/users/mario-koddenbrock/orgs", "repos_url": "https://api.github.com/users/mario-koddenbrock/repos", "events_url": "https://api.github.com/users/mario-koddenbrock/events{/privacy}", "received_events_url": "https://api.github.com/users/mario-koddenbrock/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T20:15:01
2025-10-16T12:48:22
2025-10-16T12:47:39
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41636", "html_url": "https://github.com/huggingface/transformers/pull/41636", "diff_url": "https://github.com/huggingface/transformers/pull/41636.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41636.patch", "merged_at": "2025-10-16T12:47:39" }
This commit addresses a noisy warning and improves the robustness of the base pipeline implementation. - The device placement message in the pipeline base class has been changed from a `warning` to a `debug` log. This reduces log noise for users who are aware of their device setup, while still providing the information for debugging purposes. - Additionally, potential `UnboundLocalError` exceptions in the `_pad` and `check_model_type` functions have been prevented by initializing variables before their conditional assignment.
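The "initialize variables before their conditional assignment" fix described above can be sketched in plain Python (hedged: the function name and shape here are hypothetical illustrations, not the actual `_pad` or `check_model_type` code from the pipeline base class):

```python
# Hedged illustration of the UnboundLocalError fix described above.
# The idea: initialize a variable before its conditional assignment, so
# later code can reference it even when the condition is never taken.

def pad_sequences(items, pad_value=0):
    padded = []  # initialized up front; if only assigned inside the `if`,
                 # an empty `items` would raise UnboundLocalError on return
    if items:
        width = max(len(x) for x in items)
        padded = [x + [pad_value] * (width - len(x)) for x in items]
    return padded

print(pad_sequences([[1, 2], [3]]))  # [[1, 2], [3, 0]]
print(pad_sequences([]))             # []
```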
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41636/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41636/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41635
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41635/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41635/comments
https://api.github.com/repos/huggingface/transformers/issues/41635/events
https://github.com/huggingface/transformers/pull/41635
3,519,062,999
PR_kwDOCUB6oc6t8Kyk
41,635
Enforce check_auto_docstring
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-15T18:07:22
2025-10-16T21:51:51
null
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41635", "html_url": "https://github.com/huggingface/transformers/pull/41635", "diff_url": "https://github.com/huggingface/transformers/pull/41635.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41635.patch", "merged_at": null }
# What does this PR do? Fix some small issues with auto_docstring and raise an error instead of just warning if something is wrong when running check_auto_docstring
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41635/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41635/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41634
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41634/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41634/comments
https://api.github.com/repos/huggingface/transformers/issues/41634/events
https://github.com/huggingface/transformers/pull/41634
3,518,978,554
PR_kwDOCUB6oc6t74dm
41,634
[MODEL] Nanochat implementation
{ "login": "burtenshaw", "id": 19620375, "node_id": "MDQ6VXNlcjE5NjIwMzc1", "avatar_url": "https://avatars.githubusercontent.com/u/19620375?v=4", "gravatar_id": "", "url": "https://api.github.com/users/burtenshaw", "html_url": "https://github.com/burtenshaw", "followers_url": "https://api.github.com/users/burtenshaw/followers", "following_url": "https://api.github.com/users/burtenshaw/following{/other_user}", "gists_url": "https://api.github.com/users/burtenshaw/gists{/gist_id}", "starred_url": "https://api.github.com/users/burtenshaw/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/burtenshaw/subscriptions", "organizations_url": "https://api.github.com/users/burtenshaw/orgs", "repos_url": "https://api.github.com/users/burtenshaw/repos", "events_url": "https://api.github.com/users/burtenshaw/events{/privacy}", "received_events_url": "https://api.github.com/users/burtenshaw/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-15T17:37:06
2025-10-27T13:37:36
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41634", "html_url": "https://github.com/huggingface/transformers/pull/41634", "diff_url": "https://github.com/huggingface/transformers/pull/41634.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41634.patch", "merged_at": null }
This PR introduces Andrej Karpathy's `nanochat` to transformers. The transformers-compatible weights are currently available in two places: - A [pr](https://huggingface.co/karpathy/nanochat-d32/discussions/1) on Karpathy's repo (large model) - A [repo](https://huggingface.co/nanochat-students/nanochat-d20) on the community org (small model) # ToDo - CI is failing due to tests for `SwiftFormerForQuestionAnswering` etc.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41634/reactions", "total_count": 11, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 11, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41634/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41633
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41633/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41633/comments
https://api.github.com/repos/huggingface/transformers/issues/41633/events
https://github.com/huggingface/transformers/pull/41633
3,518,945,924
PR_kwDOCUB6oc6t7xRI
41,633
[v5] 🚨Refactor subprocessors handling in processors
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-15T17:26:37
2025-10-22T19:44:30
null
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41633", "html_url": "https://github.com/huggingface/transformers/pull/41633", "diff_url": "https://github.com/huggingface/transformers/pull/41633.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41633.patch", "merged_at": null }
# What does this PR do? Refactor the handling of subprocessors in processors. - Main change is that we deduce the subprocessors from the init signature instead of having to manually add "subprocessor_class" attributes. - This means we can remove all `attributes` attributes in processors, along with all `"subprocessor"_class` attributes - We also now have one source of truth to determine which image processor will be loaded by default (the Auto sub processors classes) This PR is a requirement for https://github.com/huggingface/transformers/pull/41388, as otherwise we'd have to manually check that all `image_processor_class` attributes are set to "AutoImageProcessor" Cc @ArthurZucker @Cyrilvallez @zucchini-nlp @molbap (and also @ydshieh as this might break some parts of the CI 👀, although I checked that all processor tests are still passing, except kosmos2.5, but that's because of a PIL.UnidentifiedImageError ;)). Update: I'm seeing some tests breaking in `test_processor_auto.py`, related to registering custom processors and subprocessors in transformers. How widely used is this, and can we break it slightly for v5? 👀 Update 2: It looks like it's not really a problem. The only edge case that will break is if a custom processor was defined by inheriting from `ProcessorMixin` without overriding `__init__`. ProcessorMixin used to have "feature_extractor" and "tokenizer" attributes by default; now it doesn't (which makes more sense imo). Fixed the tests by modifying the custom processor on the hub to add an init. 🚨Breaking change: - If a model was saved with a processor, and another processor is used to load the checkpoint, the subprocessors loaded used to be the ones hardcoded in the other processor class definition, but now they will be the ones that were saved originally.
For example: ```python processor = OwlViTProcessor.from_pretrained("some_owlv2_checkpoint") print(type(processor.image_processor))  # Used to be OwlViTImageProcessor, will now be Owlv2ImageProcessor, which makes more sense in my opinion ```
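The "deduce the subprocessors from the init signature" idea can be sketched in plain Python (hedged: `ToyProcessor` and `deduce_subprocessors` are hypothetical names for illustration, not the actual transformers implementation):

```python
import inspect

class ToyProcessor:
    # Stand-in for a ProcessorMixin subclass: subprocessor slots appear
    # only as __init__ parameters, with no separate `attributes` list.
    def __init__(self, image_processor=None, tokenizer=None):
        self.image_processor = image_processor
        self.tokenizer = tokenizer

def deduce_subprocessors(cls):
    # Treat every __init__ parameter other than `self` as a subprocessor slot.
    params = inspect.signature(cls.__init__).parameters
    return [name for name in params if name != "self"]

print(deduce_subprocessors(ToyProcessor))  # ['image_processor', 'tokenizer']
```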
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41633/reactions", "total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/huggingface/transformers/issues/41633/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41632
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41632/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41632/comments
https://api.github.com/repos/huggingface/transformers/issues/41632/events
https://github.com/huggingface/transformers/pull/41632
3,518,894,966
PR_kwDOCUB6oc6t7mJ8
41,632
[CI] Build translated docs
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T17:09:25
2025-10-16T12:01:35
2025-10-16T12:01:33
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41632", "html_url": "https://github.com/huggingface/transformers/pull/41632", "diff_url": "https://github.com/huggingface/transformers/pull/41632.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41632.patch", "merged_at": "2025-10-16T12:01:33" }
Actually, this might be the reason why the translated docs aren't building when commenting `build-doc`. The `tr` and `te` language codes don't exist 😅
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41632/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41632/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41631
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41631/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41631/comments
https://api.github.com/repos/huggingface/transformers/issues/41631/events
https://github.com/huggingface/transformers/pull/41631
3,518,819,124
PR_kwDOCUB6oc6t7VlP
41,631
Incorrect access of dataset field fixed
{ "login": "vignesh1507", "id": 143084478, "node_id": "U_kgDOCIdLvg", "avatar_url": "https://avatars.githubusercontent.com/u/143084478?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vignesh1507", "html_url": "https://github.com/vignesh1507", "followers_url": "https://api.github.com/users/vignesh1507/followers", "following_url": "https://api.github.com/users/vignesh1507/following{/other_user}", "gists_url": "https://api.github.com/users/vignesh1507/gists{/gist_id}", "starred_url": "https://api.github.com/users/vignesh1507/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vignesh1507/subscriptions", "organizations_url": "https://api.github.com/users/vignesh1507/orgs", "repos_url": "https://api.github.com/users/vignesh1507/repos", "events_url": "https://api.github.com/users/vignesh1507/events{/privacy}", "received_events_url": "https://api.github.com/users/vignesh1507/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-15T16:42:35
2025-10-16T13:45:44
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41631", "html_url": "https://github.com/huggingface/transformers/pull/41631", "diff_url": "https://github.com/huggingface/transformers/pull/41631.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41631.patch", "merged_at": null }
In `test_tokenizer`: `for text in dataset[i]["premise"].values(): test_string(slow, fast, text)` and `# hypothesis, all languages` `for text in dataset[i]["hypothesis"]["translation"]: test_string(slow, fast, text)`. The `premise` field in the XNLI dataset is a string, not a dictionary. Calling `.values()` on a string will raise an error. Similarly, `hypothesis` is also a string (not a dictionary), so `["hypothesis"]["translation"]` will fail.
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41631/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41631/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41630
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41630/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41630/comments
https://api.github.com/repos/huggingface/transformers/issues/41630/events
https://github.com/huggingface/transformers/issues/41630
3,518,806,878
I_kwDOCUB6oc7RvLte
41,630
Incorrect Access of Dataset Fields in check_tokenizers.py
{ "login": "vignesh1507", "id": 143084478, "node_id": "U_kgDOCIdLvg", "avatar_url": "https://avatars.githubusercontent.com/u/143084478?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vignesh1507", "html_url": "https://github.com/vignesh1507", "followers_url": "https://api.github.com/users/vignesh1507/followers", "following_url": "https://api.github.com/users/vignesh1507/following{/other_user}", "gists_url": "https://api.github.com/users/vignesh1507/gists{/gist_id}", "starred_url": "https://api.github.com/users/vignesh1507/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vignesh1507/subscriptions", "organizations_url": "https://api.github.com/users/vignesh1507/orgs", "repos_url": "https://api.github.com/users/vignesh1507/repos", "events_url": "https://api.github.com/users/vignesh1507/events{/privacy}", "received_events_url": "https://api.github.com/users/vignesh1507/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-15T16:38:04
2025-10-17T14:41:40
null
NONE
null
null
null
null
The `premise` field in the XNLI dataset is a string, not a dictionary. Calling `.values()` on a string will raise an error. Similarly, `hypothesis` is also a string (not a dictionary), so `["hypothesis"]["translation"]` will fail.
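The failure mode the issue describes can be reproduced with a minimal snippet (hedged: these are mock values standing in for XNLI records, not data loaded from the actual dataset):

```python
# If the field is a plain string, .values() raises AttributeError;
# it only works on the dict-of-languages form the old code assumed.
premise_as_string = "The cat sat on the mat."
premise_as_dict = {"en": "The cat sat on the mat.", "fr": "Le chat est assis."}

error = None
try:
    premise_as_string.values()
except AttributeError as exc:
    error = type(exc).__name__

texts = list(premise_as_dict.values())  # works only for the dict form
print(error, len(texts))  # AttributeError 2
```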
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41630/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41630/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41629
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41629/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41629/comments
https://api.github.com/repos/huggingface/transformers/issues/41629/events
https://github.com/huggingface/transformers/pull/41629
3,518,799,131
PR_kwDOCUB6oc6t7ROg
41,629
Make Idefics exportable, assume no padding
{ "login": "jackzhxng", "id": 32371937, "node_id": "MDQ6VXNlcjMyMzcxOTM3", "avatar_url": "https://avatars.githubusercontent.com/u/32371937?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jackzhxng", "html_url": "https://github.com/jackzhxng", "followers_url": "https://api.github.com/users/jackzhxng/followers", "following_url": "https://api.github.com/users/jackzhxng/following{/other_user}", "gists_url": "https://api.github.com/users/jackzhxng/gists{/gist_id}", "starred_url": "https://api.github.com/users/jackzhxng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jackzhxng/subscriptions", "organizations_url": "https://api.github.com/users/jackzhxng/orgs", "repos_url": "https://api.github.com/users/jackzhxng/repos", "events_url": "https://api.github.com/users/jackzhxng/events{/privacy}", "received_events_url": "https://api.github.com/users/jackzhxng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-15T16:35:37
2025-10-15T22:35:38
null
CONTRIBUTOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41629", "html_url": "https://github.com/huggingface/transformers/pull/41629", "diff_url": "https://github.com/huggingface/transformers/pull/41629.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41629.patch", "merged_at": null }
# What does this PR do? Makes Idefics 3 exportable by assuming batch size of 1, unrolling the loop to a single iteration. ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. https://github.com/huggingface/optimum-executorch/pull/163#discussion_r2429871579 - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? @zucchini-nlp
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41629/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41629/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41628
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41628/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41628/comments
https://api.github.com/repos/huggingface/transformers/issues/41628/events
https://github.com/huggingface/transformers/issues/41628
3,518,780,612
I_kwDOCUB6oc7RvFTE
41,628
Cannot import name 'AutoImageProcessor' from 'transformers'
{ "login": "Pittmann-XIE", "id": 103981664, "node_id": "U_kgDOBjKiYA", "avatar_url": "https://avatars.githubusercontent.com/u/103981664?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Pittmann-XIE", "html_url": "https://github.com/Pittmann-XIE", "followers_url": "https://api.github.com/users/Pittmann-XIE/followers", "following_url": "https://api.github.com/users/Pittmann-XIE/following{/other_user}", "gists_url": "https://api.github.com/users/Pittmann-XIE/gists{/gist_id}", "starred_url": "https://api.github.com/users/Pittmann-XIE/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Pittmann-XIE/subscriptions", "organizations_url": "https://api.github.com/users/Pittmann-XIE/orgs", "repos_url": "https://api.github.com/users/Pittmann-XIE/repos", "events_url": "https://api.github.com/users/Pittmann-XIE/events{/privacy}", "received_events_url": "https://api.github.com/users/Pittmann-XIE/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-10-15T16:29:20
2025-10-17T07:07:59
2025-10-16T12:37:07
NONE
null
null
null
null
### System Info Intel CPU Nvidia 3090 ubuntu 22.04 python 3.10.12 transformers=5.0.0.dev0 (installed from the official git repo) ### PS: It's also tested with transformers=4.57.1, which is installed using "pip install", the same error persisted while executing "from transformers import AutoImageProcessor, AutoModel". ### Who can help? _No response_ ### Information - [x] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction 1. in any python script: from transformers import AutoImageProcessor, AutoModel ### Expected behavior it shouldn't raise the error: Traceback (most recent call last): File "<stdin>", line 1, in <module> ImportError: cannot import name 'AutoImageProcessor' from 'transformers' (/usr/local/lib/python3.10/dist-packages/transformers/__init__.py)
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41628/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41628/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/41627
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41627/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41627/comments
https://api.github.com/repos/huggingface/transformers/issues/41627/events
https://github.com/huggingface/transformers/pull/41627
3,518,624,861
PR_kwDOCUB6oc6t6q0C
41,627
[`Executorch`] Simplify for encoder models
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/followers", "following_url": "https://api.github.com/users/vasqu/following{/other_user}", "gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}", "starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vasqu/subscriptions", "organizations_url": "https://api.github.com/users/vasqu/orgs", "repos_url": "https://api.github.com/users/vasqu/repos", "events_url": "https://api.github.com/users/vasqu/events{/privacy}", "received_events_url": "https://api.github.com/users/vasqu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T15:49:13
2025-10-20T14:15:16
2025-10-16T11:57:52
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41627", "html_url": "https://github.com/huggingface/transformers/pull/41627", "diff_url": "https://github.com/huggingface/transformers/pull/41627.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41627.patch", "merged_at": "2025-10-16T11:57:52" }
Now that we include a fast path using no vmapping, we can revert the extra treatment. Followup to #41586
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/followers", "following_url": "https://api.github.com/users/vasqu/following{/other_user}", "gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}", "starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vasqu/subscriptions", "organizations_url": "https://api.github.com/users/vasqu/orgs", "repos_url": "https://api.github.com/users/vasqu/repos", "events_url": "https://api.github.com/users/vasqu/events{/privacy}", "received_events_url": "https://api.github.com/users/vasqu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41627/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41627/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41626
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41626/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41626/comments
https://api.github.com/repos/huggingface/transformers/issues/41626/events
https://github.com/huggingface/transformers/pull/41626
3,518,452,917
PR_kwDOCUB6oc6t6Fp6
41,626
[v5] Return a BatchEncoding dict from apply_chat_template by default
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-15T15:00:46
2025-10-22T16:40:31
null
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41626", "html_url": "https://github.com/huggingface/transformers/pull/41626", "diff_url": "https://github.com/huggingface/transformers/pull/41626.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41626.patch", "merged_at": null }
Tokenizers return a BatchEncoding dict by default, but `apply_chat_template` doesn't. This is just an accident of how I wrote it originally, which we were stuck with for backward compatibility reasons. Ideally, I think `apply_chat_template` should return exactly the same format as tokenizers, since it also performs tokenization most of the time. It's now `v5` time, so we can start making that happen :sweat_smile: This PR also updates tests, and removes very old `test_tokenization_for_chat` tests. These model-specific tests don't do anything useful anymore, since the `apply_chat_template` functionality is unified across tokenizers; they're mostly a legacy leftover from when model classes used to need custom chat tokenization functions.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41626/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41626/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41625
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41625/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41625/comments
https://api.github.com/repos/huggingface/transformers/issues/41625/events
https://github.com/huggingface/transformers/pull/41625
3,518,451,207
PR_kwDOCUB6oc6t6FSV
41,625
[`Masks`] Fix mask handling in eager for vision models
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/followers", "following_url": "https://api.github.com/users/vasqu/following{/other_user}", "gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}", "starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vasqu/subscriptions", "organizations_url": "https://api.github.com/users/vasqu/orgs", "repos_url": "https://api.github.com/users/vasqu/repos", "events_url": "https://api.github.com/users/vasqu/events{/privacy}", "received_events_url": "https://api.github.com/users/vasqu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5769473378, "node_id": "LA_kwDOCUB6oc8AAAABV-MtYg", "url": "https://api.github.com/repos/huggingface/transformers/labels/Vision", "name": "Vision", "color": "C079EF", "default": false, "description": "" } ]
closed
false
null
[]
null
[]
2025-10-15T15:00:17
2025-10-16T14:27:30
2025-10-16T14:27:26
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41625", "html_url": "https://github.com/huggingface/transformers/pull/41625", "diff_url": "https://github.com/huggingface/transformers/pull/41625.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41625.patch", "merged_at": "2025-10-16T14:27:26" }
As per title, some vision models use masks; let's sync with BERT this time and reduce errors that could be introduced
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/followers", "following_url": "https://api.github.com/users/vasqu/following{/other_user}", "gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}", "starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vasqu/subscriptions", "organizations_url": "https://api.github.com/users/vasqu/orgs", "repos_url": "https://api.github.com/users/vasqu/repos", "events_url": "https://api.github.com/users/vasqu/events{/privacy}", "received_events_url": "https://api.github.com/users/vasqu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41625/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41625/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41624
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41624/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41624/comments
https://api.github.com/repos/huggingface/transformers/issues/41624/events
https://github.com/huggingface/transformers/pull/41624
3,518,413,874
PR_kwDOCUB6oc6t59M4
41,624
Fix serving continuous batching
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T14:49:47
2025-10-16T15:24:24
2025-10-16T15:24:22
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41624", "html_url": "https://github.com/huggingface/transformers/pull/41624", "diff_url": "https://github.com/huggingface/transformers/pull/41624.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41624.patch", "merged_at": "2025-10-16T15:24:22" }
# What does this PR do? Serving is broken with continuous batching due to recent PRs. This PR fixes it
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41624/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41624/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41623
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41623/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41623/comments
https://api.github.com/repos/huggingface/transformers/issues/41623/events
https://github.com/huggingface/transformers/issues/41623
3,518,391,141
I_kwDOCUB6oc7RtmNl
41,623
Qwen Audio DisjunctiveConstraint missing
{ "login": "Znerual", "id": 22452386, "node_id": "MDQ6VXNlcjIyNDUyMzg2", "avatar_url": "https://avatars.githubusercontent.com/u/22452386?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Znerual", "html_url": "https://github.com/Znerual", "followers_url": "https://api.github.com/users/Znerual/followers", "following_url": "https://api.github.com/users/Znerual/following{/other_user}", "gists_url": "https://api.github.com/users/Znerual/gists{/gist_id}", "starred_url": "https://api.github.com/users/Znerual/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Znerual/subscriptions", "organizations_url": "https://api.github.com/users/Znerual/orgs", "repos_url": "https://api.github.com/users/Znerual/repos", "events_url": "https://api.github.com/users/Znerual/events{/privacy}", "received_events_url": "https://api.github.com/users/Znerual/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
open
false
null
[]
null
[]
2025-10-15T14:44:20
2025-10-15T16:01:03
null
CONTRIBUTOR
null
null
null
null
### System Info - `transformers` version: 4.57.0.dev0 - Platform: Linux-6.14.0-33-generic-x86_64-with-glibc2.39 - Python version: 3.12.11 - Huggingface_hub version: 1.0.0.rc5 - Safetensors version: 0.6.2 - Accelerate version: 1.10.1 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (accelerator?): 2.8.0+cu128 (CUDA) - Using distributed or parallel set-up in script?: no - Using GPU in script?: yes - GPU type: NVIDIA RTX PRO 2000 Blackwell Generation Laptop GPU ### Who can help? @eustlb @ebezzam @vasqu ### Information - [x] The official example scripts - [ ] My own modified scripts ### Tasks - [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction Take the example code from the website: ```python from transformers import AutoModelForCausalLM, AutoTokenizer from transformers.generation import GenerationConfig import torch torch.manual_seed(1234) tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-Audio", trust_remote_code=True) # use bf16 # model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-Audio", device_map="auto", trust_remote_code=True, bf16=True).eval() # use fp16 # model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-Audio", device_map="auto", trust_remote_code=True, fp16=True).eval() # use cpu only # model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-Audio", device_map="cpu", trust_remote_code=True).eval() # use cuda device model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-Audio", device_map="cuda", trust_remote_code=True).eval() # Specify hyperparameters for generation (No need to do this if you are using transformers>4.32.0) # model.generation_config = GenerationConfig.from_pretrained("Qwen/Qwen-Audio", trust_remote_code=True) audio_url = "https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-Audio/1272-128104-0000.flac" sp_prompt = "<|startoftranscript|><|en|><|transcribe|><|en|><|notimestamps|><|wo_itn|>" query 
= f"<audio>{audio_url}</audio>{sp_prompt}" audio_info = tokenizer.process_audio(query) inputs = tokenizer(query, return_tensors='pt', audio_info=audio_info) inputs = inputs.to(model.device) pred = model.generate(**inputs, audio_info=audio_info) response = tokenizer.decode(pred.cpu()[0], skip_special_tokens=False,audio_info=audio_info) print(response) # <audio>https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-Audio/1272-128104-0000.flac</audio><|startoftranscription|><|en|><|transcribe|><|en|><|notimestamps|><|wo_itn|>mister quilting is the apostle of the middle classes and we are glad to welcome his gospel<|endoftext|> ``` results in the error: ``` tokenizer_config.json: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 211/211 [00:00<00:00, 753kB/s] tokenization_qwen.py: 22.8kB [00:00, 32.0MB/s] audio.py: 15.0kB [00:00, 16.0MB/s] A new version of the following files was downloaded from https://huggingface.co/Qwen/Qwen-Audio: - audio.py . Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision. A new version of the following files was downloaded from https://huggingface.co/Qwen/Qwen-Audio: - tokenization_qwen.py - audio.py . Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision. qwen.tiktoken: 2.56MB [00:00, 52.5MB/s] special_tokens_map.json: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 3.00/3.00 [00:00<00:00, 8.41kB/s] audio_start_id: 155163, audio_end_id: 155164, audio_pad_id: 151851. config.json: 1.26kB [00:00, 2.82MB/s] configuration_qwen.py: 2.35kB [00:00, 5.27MB/s] A new version of the following files was downloaded from https://huggingface.co/Qwen/Qwen-Audio: - configuration_qwen.py . 
Make sure to double-check they do not contain any added malicious code. To avoid downloading new versions of the code file, you can pin a revision. modeling_qwen.py: 58.3kB [00:00, 59.1MB/s] Encountered exception while importing transformers_stream_generator: cannot import name 'DisjunctiveConstraint' from 'transformers' (/home/ruzickal/Code/entity-analyzer-backend/transformers/src/transformers/__init__.py) Traceback (most recent call last): File "/home/ruzickal/Code/entity-analyzer-backend/audio-llm-api/test_qwen_audio.py", line 15, in <module> model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-Audio", device_map="cuda", trust_remote_code=True).eval() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/ruzickal/Code/entity-analyzer-backend/transformers/src/transformers/models/auto/auto_factory.py", line 367, in from_pretrained model_class = get_class_from_dynamic_module( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/ruzickal/Code/entity-analyzer-backend/transformers/src/transformers/dynamic_module_utils.py", line 594, in get_class_from_dynamic_module final_module = get_cached_module_file( ^^^^^^^^^^^^^^^^^^^^^^^ File "/home/ruzickal/Code/entity-analyzer-backend/transformers/src/transformers/dynamic_module_utils.py", line 422, in get_cached_module_file modules_needed = check_imports(resolved_module_file) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/ruzickal/Code/entity-analyzer-backend/transformers/src/transformers/dynamic_module_utils.py", line 248, in check_imports importlib.import_module(imp) File "/home/ruzickal/.pyenv/versions/3.12.11/lib/python3.12/importlib/__init__.py", line 90, in import_module return _bootstrap._gcd_import(name[level:], package, level) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "<frozen importlib._bootstrap>", line 1387, in _gcd_import File "<frozen importlib._bootstrap>", line 1360, in _find_and_load File "<frozen importlib._bootstrap>", line 1331, in 
_find_and_load_unlocked File "<frozen importlib._bootstrap>", line 935, in _load_unlocked File "<frozen importlib._bootstrap_external>", line 999, in exec_module File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed File "/home/ruzickal/.cache/pypoetry/virtualenvs/audio-llm-api-nPR59pGu-py3.12/lib/python3.12/site-packages/transformers_stream_generator/__init__.py", line 1, in <module> from .main import init_stream_support File "/home/ruzickal/.cache/pypoetry/virtualenvs/audio-llm-api-nPR59pGu-py3.12/lib/python3.12/site-packages/transformers_stream_generator/main.py", line 1, in <module> from transformers import ( ImportError: cannot import name 'DisjunctiveConstraint' from 'transformers' (/home/ruzickal/Code/entity-analyzer-backend/transformers/src/transformers/__init__.py) ``` ### Expected behavior Run the example code without an error
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41623/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41623/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41622
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41622/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41622/comments
https://api.github.com/repos/huggingface/transformers/issues/41622/events
https://github.com/huggingface/transformers/issues/41622
3,518,367,054
I_kwDOCUB6oc7RtgVO
41,622
Florence 2 _supports_sdpa missing in remote code
{ "login": "Znerual", "id": 22452386, "node_id": "MDQ6VXNlcjIyNDUyMzg2", "avatar_url": "https://avatars.githubusercontent.com/u/22452386?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Znerual", "html_url": "https://github.com/Znerual", "followers_url": "https://api.github.com/users/Znerual/followers", "following_url": "https://api.github.com/users/Znerual/following{/other_user}", "gists_url": "https://api.github.com/users/Znerual/gists{/gist_id}", "starred_url": "https://api.github.com/users/Znerual/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Znerual/subscriptions", "organizations_url": "https://api.github.com/users/Znerual/orgs", "repos_url": "https://api.github.com/users/Znerual/repos", "events_url": "https://api.github.com/users/Znerual/events{/privacy}", "received_events_url": "https://api.github.com/users/Znerual/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-10-15T14:38:12
2025-10-16T09:09:29
2025-10-16T09:09:29
CONTRIBUTOR
null
null
null
null
### System Info - `transformers` version: 4.57.0.dev0 - Platform: Linux-6.14.0-33-generic-x86_64-with-glibc2.39 - Python version: 3.12.11 - Huggingface_hub version: 1.0.0.rc5 - Safetensors version: 0.6.2 - Accelerate version: 1.10.1 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (accelerator?): 2.8.0+cu128 (CUDA) - Using distributed or parallel set-up in script?: no - Using GPU in script?: yes - GPU type: NVIDIA RTX PRO 2000 Blackwell Generation Laptop GPU ### Who can help? @yonigozlan @molbap ### Information - [x] The official example scripts - [ ] My own modified scripts ### Tasks - [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction Running the example script: ```python import requests import torch from PIL import Image from transformers import AutoProcessor, AutoModelForCausalLM device = "cuda:0" if torch.cuda.is_available() else "cpu" torch_dtype = torch.float16 if torch.cuda.is_available() else torch.float32 model = AutoModelForCausalLM.from_pretrained("microsoft/Florence-2-large", torch_dtype=torch_dtype, trust_remote_code=True).to(device) processor = AutoProcessor.from_pretrained("microsoft/Florence-2-large", trust_remote_code=False) prompt = "<OD>" url = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/car.jpg?download=true" image = Image.open(requests.get(url, stream=True).raw) inputs = processor(text=prompt, images=image, return_tensors="pt").to(device, torch_dtype) generated_ids = model.generate( input_ids=inputs["input_ids"], pixel_values=inputs["pixel_values"], max_new_tokens=4096, num_beams=3, do_sample=False ) generated_text = processor.batch_decode(generated_ids, skip_special_tokens=False)[0] parsed_answer = processor.post_process_generation(generated_text, task="<OD>", image_size=(image.width, image.height)) print(parsed_answer) ``` Leads to the following error: ``` 
`torch_dtype` is deprecated! Use `dtype` instead! Traceback (most recent call last): File "/home/ruzickal/Code/entity-analyzer-backend/audio-llm-api/test.py", line 11, in <module> model = AutoModelForCausalLM.from_pretrained("microsoft/Florence-2-large", torch_dtype=torch_dtype, trust_remote_code=True).to(device) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/ruzickal/Code/entity-analyzer-backend/transformers/src/transformers/models/auto/auto_factory.py", line 378, in from_pretrained return model_class.from_pretrained( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/ruzickal/Code/entity-analyzer-backend/transformers/src/transformers/modeling_utils.py", line 270, in _wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^ File "/home/ruzickal/Code/entity-analyzer-backend/transformers/src/transformers/modeling_utils.py", line 4473, in from_pretrained model = cls(config, *model_args, **model_kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/ruzickal/.cache/huggingface/modules/transformers_modules/microsoft/Florence_hyphen_2_hyphen_large/21a599d414c4d928c9032694c424fb94458e3594/modeling_florence2.py", line 2535, in __init__ super().__init__(config) File "/home/ruzickal/Code/entity-analyzer-backend/transformers/src/transformers/modeling_utils.py", line 1822, in __init__ self.config._attn_implementation_internal = self._check_and_adjust_attn_implementation( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/ruzickal/Code/entity-analyzer-backend/transformers/src/transformers/modeling_utils.py", line 2416, in _check_and_adjust_attn_implementation applicable_attn_implementation = self.get_correct_attn_implementation( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/ruzickal/Code/entity-analyzer-backend/transformers/src/transformers/modeling_utils.py", line 2450, in get_correct_attn_implementation self._sdpa_can_dispatch(is_init_check) File 
"/home/ruzickal/Code/entity-analyzer-backend/transformers/src/transformers/modeling_utils.py", line 2314, in _sdpa_can_dispatch if not self._supports_sdpa: ^^^^^^^^^^^^^^^^^^^ File "/home/ruzickal/.cache/pypoetry/virtualenvs/audio-llm-api-nPR59pGu-py3.12/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1962, in __getattr__ raise AttributeError( AttributeError: 'Florence2ForConditionalGeneration' object has no attribute '_supports_sdpa' ``` ### Expected behavior Output the detections
{ "login": "Znerual", "id": 22452386, "node_id": "MDQ6VXNlcjIyNDUyMzg2", "avatar_url": "https://avatars.githubusercontent.com/u/22452386?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Znerual", "html_url": "https://github.com/Znerual", "followers_url": "https://api.github.com/users/Znerual/followers", "following_url": "https://api.github.com/users/Znerual/following{/other_user}", "gists_url": "https://api.github.com/users/Znerual/gists{/gist_id}", "starred_url": "https://api.github.com/users/Znerual/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Znerual/subscriptions", "organizations_url": "https://api.github.com/users/Znerual/orgs", "repos_url": "https://api.github.com/users/Znerual/repos", "events_url": "https://api.github.com/users/Znerual/events{/privacy}", "received_events_url": "https://api.github.com/users/Znerual/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41622/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41622/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/41621
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41621/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41621/comments
https://api.github.com/repos/huggingface/transformers/issues/41621/events
https://github.com/huggingface/transformers/pull/41621
3,518,302,268
PR_kwDOCUB6oc6t5lAk
41,621
Add LightOnOCR model implementation
{ "login": "baptiste-aubertin", "id": 47100525, "node_id": "MDQ6VXNlcjQ3MTAwNTI1", "avatar_url": "https://avatars.githubusercontent.com/u/47100525?v=4", "gravatar_id": "", "url": "https://api.github.com/users/baptiste-aubertin", "html_url": "https://github.com/baptiste-aubertin", "followers_url": "https://api.github.com/users/baptiste-aubertin/followers", "following_url": "https://api.github.com/users/baptiste-aubertin/following{/other_user}", "gists_url": "https://api.github.com/users/baptiste-aubertin/gists{/gist_id}", "starred_url": "https://api.github.com/users/baptiste-aubertin/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/baptiste-aubertin/subscriptions", "organizations_url": "https://api.github.com/users/baptiste-aubertin/orgs", "repos_url": "https://api.github.com/users/baptiste-aubertin/repos", "events_url": "https://api.github.com/users/baptiste-aubertin/events{/privacy}", "received_events_url": "https://api.github.com/users/baptiste-aubertin/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 1843244711, "node_id": "MDU6TGFiZWwxODQzMjQ0NzEx", "url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model", "name": "New model", "color": "fbca04", "default": false, "description": "" }, { "id": 5769473378, "node_id": "LA_kwDOCUB6oc8AAAABV-MtYg", "url": "https://api.github.com/repos/huggingface/transformers/labels/Vision", "name": "Vision", "color": "C079EF", "default": false, "description": "" } ]
open
false
null
[]
null
[]
2025-10-15T14:23:41
2025-10-29T18:50:23
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41621", "html_url": "https://github.com/huggingface/transformers/pull/41621", "diff_url": "https://github.com/huggingface/transformers/pull/41621.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41621.patch", "merged_at": null }
# What does this PR do? Implementation of the LightonOCR model following Modular Transformers architecture. Our model is a 1B parameter OCR model using Pixtral as the vision encoder and Qwen3 as the LLM decoder. I still have an issue with auto configuration, which I'm not familiar with: `🚨 Config not found for lightonocr. You can manually add it to HARDCODED_CONFIG_FOR_MODELS in utils/auto_docstring.py` <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? 
Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker @Cyrilvallez - vision models: @yonigozlan @molbap - audio models: @eustlb @ebezzam @vasqu - multimodal models: @zucchini-nlp - graph models: @clefourrier Library: - generate: @zucchini-nlp (visual-language models) or @gante (all others) - continuous batching: @remi-or @ArthurZucker @McPatate - pipelines: @Rocketknight1 - tokenizers: @ArthurZucker and @itazap - trainer: @SunMarc - attention: @vasqu @ArthurZucker @CyrilVallez - model loading (from pretrained, etc): @CyrilVallez - distributed: @3outeille @ArthurZucker - CIs: @ydshieh Integrations: - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization: @SunMarc @MekkCyber - kernels: @MekkCyber @drbh - peft: @BenjaminBossan @githubnemo Devices/Backends: - AMD ROCm: @ivarflakstad - Intel XPU: @IlyasMoutawwakil - Ascend NPU: @ivarflakstad Documentation: @stevhliu Research projects are not maintained and should be taken as is. -->
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41621/reactions", "total_count": 9, "+1": 0, "-1": 0, "laugh": 0, "hooray": 3, "confused": 0, "heart": 0, "rocket": 6, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41621/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41620
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41620/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41620/comments
https://api.github.com/repos/huggingface/transformers/issues/41620/events
https://github.com/huggingface/transformers/pull/41620
3,518,122,519
PR_kwDOCUB6oc6t49nL
41,620
Remove the head masking block in some vision models
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T13:43:57
2025-10-15T13:52:56
2025-10-15T13:51:01
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41620", "html_url": "https://github.com/huggingface/transformers/pull/41620", "diff_url": "https://github.com/huggingface/transformers/pull/41620.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41620.patch", "merged_at": "2025-10-15T13:51:01" }
# What does this PR do? #36545 introduced a mistake regarding the head mask. Before that PR, we had ``` # Mask heads if we want to if head_mask is not None: attention_probs = attention_probs * head_mask ``` but after the PR, it became ``` # Mask heads if we want to if attention_mask is not None: attn_weights = attn_weights * attention_mask ``` which is wrong. Furthermore, we have already removed support for head masking, so this PR simply removes these blocks.
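The distinction the PR description draws can be illustrated with a minimal sketch (pure Python, hypothetical shapes): a `head_mask` carries one scalar per attention head and zeroes out whole heads, whereas an `attention_mask` is a per-token mask with different semantics, so multiplying attention probabilities by it is incorrect.

```python
# Minimal sketch of why head_mask and attention_mask are not interchangeable.
# head_mask: one scalar per head -> multiplying probabilities by it drops whole heads.
# Shapes here are toy values, not the library's actual tensor shapes.

num_heads = 2
seq_len = 3

# attention probabilities per head (each row sums to 1)
attention_probs = [
    [[0.5, 0.3, 0.2]] * seq_len,  # head 0
    [[0.2, 0.2, 0.6]] * seq_len,  # head 1
]

head_mask = [1.0, 0.0]  # keep head 0, zero out head 1

masked = [
    [[p * head_mask[h] for p in row] for row in attention_probs[h]]
    for h in range(num_heads)
]

assert masked[0] == attention_probs[0]                   # head 0 untouched
assert all(p == 0.0 for row in masked[1] for p in row)   # head 1 fully zeroed
```

A per-token `attention_mask` would instead vary along the sequence dimension, which is why substituting it into the head-masking block changed the computation.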
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41620/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41620/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41619
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41619/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41619/comments
https://api.github.com/repos/huggingface/transformers/issues/41619/events
https://github.com/huggingface/transformers/pull/41619
3,518,010,782
PR_kwDOCUB6oc6t4k-d
41,619
Fix FP-Quant quantization fallback CPU dispatch.
{ "login": "BlackSamorez", "id": 16901341, "node_id": "MDQ6VXNlcjE2OTAxMzQx", "avatar_url": "https://avatars.githubusercontent.com/u/16901341?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BlackSamorez", "html_url": "https://github.com/BlackSamorez", "followers_url": "https://api.github.com/users/BlackSamorez/followers", "following_url": "https://api.github.com/users/BlackSamorez/following{/other_user}", "gists_url": "https://api.github.com/users/BlackSamorez/gists{/gist_id}", "starred_url": "https://api.github.com/users/BlackSamorez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BlackSamorez/subscriptions", "organizations_url": "https://api.github.com/users/BlackSamorez/orgs", "repos_url": "https://api.github.com/users/BlackSamorez/repos", "events_url": "https://api.github.com/users/BlackSamorez/events{/privacy}", "received_events_url": "https://api.github.com/users/BlackSamorez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T13:16:50
2025-10-16T11:41:40
2025-10-16T11:41:02
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41619", "html_url": "https://github.com/huggingface/transformers/pull/41619", "diff_url": "https://github.com/huggingface/transformers/pull/41619.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41619.patch", "merged_at": "2025-10-16T11:41:02" }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Added a workaround for a hard-coded missing key CPU dispatch that broke pre-quantized model loading for FP-Quant. `weight` was assumed to be missing and auto-dispatched to CPU but it's a parameter that gets set to `None` in post-loading (in `pre_forward`). Fixed FP-Quant test typos. Blocked by #41613 too. ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? 
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker @Cyrilvallez - vision models: @yonigozlan @molbap - audio models: @eustlb @ebezzam @vasqu - multimodal models: @zucchini-nlp - graph models: @clefourrier Library: - generate: @zucchini-nlp (visual-language models) or @gante (all others) - continuous batching: @remi-or @ArthurZucker @McPatate - pipelines: @Rocketknight1 - tokenizers: @ArthurZucker and @itazap - trainer: @SunMarc - attention: @vasqu @ArthurZucker @CyrilVallez - model loading (from pretrained, etc): @CyrilVallez - distributed: @3outeille @ArthurZucker - CIs: @ydshieh Integrations: - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization: @SunMarc @MekkCyber - kernels: @MekkCyber @drbh - peft: @BenjaminBossan @githubnemo Devices/Backends: - AMD ROCm: @ivarflakstad - Intel XPU: @IlyasMoutawwakil - Ascend NPU: @ivarflakstad Documentation: @stevhliu Research projects are not maintained and should be taken as is. -->
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41619/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41619/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41618
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41618/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41618/comments
https://api.github.com/repos/huggingface/transformers/issues/41618/events
https://github.com/huggingface/transformers/pull/41618
3,517,833,888
PR_kwDOCUB6oc6t3-HV
41,618
Update a dataset repo link
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T12:34:41
2025-10-15T12:47:31
2025-10-15T12:41:38
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41618", "html_url": "https://github.com/huggingface/transformers/pull/41618", "diff_url": "https://github.com/huggingface/transformers/pull/41618.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41618.patch", "merged_at": "2025-10-15T12:41:38" }
# What does this PR do? `https://huggingface.co/kirp/kosmos2_5/resolve/main/receipt_00008.png` is no longer available (the author likely deleted it). That URL was used in the original PR. Let's use the official repo now: https://huggingface.co/microsoft/kosmos-2.5/resolve/main/receipt_00008.png
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41618/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41618/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41617
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41617/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41617/comments
https://api.github.com/repos/huggingface/transformers/issues/41617/events
https://github.com/huggingface/transformers/pull/41617
3,517,755,260
PR_kwDOCUB6oc6t3tSo
41,617
Reinstate early CUDA init fix
{ "login": "hmellor", "id": 19981378, "node_id": "MDQ6VXNlcjE5OTgxMzc4", "avatar_url": "https://avatars.githubusercontent.com/u/19981378?v=4", "gravatar_id": "", "url": "https://api.github.com/users/hmellor", "html_url": "https://github.com/hmellor", "followers_url": "https://api.github.com/users/hmellor/followers", "following_url": "https://api.github.com/users/hmellor/following{/other_user}", "gists_url": "https://api.github.com/users/hmellor/gists{/gist_id}", "starred_url": "https://api.github.com/users/hmellor/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hmellor/subscriptions", "organizations_url": "https://api.github.com/users/hmellor/orgs", "repos_url": "https://api.github.com/users/hmellor/repos", "events_url": "https://api.github.com/users/hmellor/events{/privacy}", "received_events_url": "https://api.github.com/users/hmellor/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T12:14:45
2025-10-15T12:43:59
2025-10-15T12:41:10
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41617", "html_url": "https://github.com/huggingface/transformers/pull/41617", "diff_url": "https://github.com/huggingface/transformers/pull/41617.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41617.patch", "merged_at": "2025-10-15T12:41:10" }
vLLM found that this import was causing CUDA to be initialised early
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41617/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41617/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41616
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41616/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41616/comments
https://api.github.com/repos/huggingface/transformers/issues/41616/events
https://github.com/huggingface/transformers/pull/41616
3,517,748,213
PR_kwDOCUB6oc6t3r9-
41,616
Remove deprecated code
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T12:13:16
2025-10-15T14:57:54
2025-10-15T14:57:52
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41616", "html_url": "https://github.com/huggingface/transformers/pull/41616", "diff_url": "https://github.com/huggingface/transformers/pull/41616.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41616.patch", "merged_at": "2025-10-15T14:57:52" }
# What does this PR do? As per title
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41616/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41616/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41615
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41615/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41615/comments
https://api.github.com/repos/huggingface/transformers/issues/41615/events
https://github.com/huggingface/transformers/issues/41615
3,517,670,975
I_kwDOCUB6oc7Rq2Y_
41,615
Issue Report: Abnormal GPU Utilization Pattern - DDP Training CLIP Model from Scratch
{ "login": "xiehuanyi", "id": 74809554, "node_id": "MDQ6VXNlcjc0ODA5NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/74809554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/xiehuanyi", "html_url": "https://github.com/xiehuanyi", "followers_url": "https://api.github.com/users/xiehuanyi/followers", "following_url": "https://api.github.com/users/xiehuanyi/following{/other_user}", "gists_url": "https://api.github.com/users/xiehuanyi/gists{/gist_id}", "starred_url": "https://api.github.com/users/xiehuanyi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/xiehuanyi/subscriptions", "organizations_url": "https://api.github.com/users/xiehuanyi/orgs", "repos_url": "https://api.github.com/users/xiehuanyi/repos", "events_url": "https://api.github.com/users/xiehuanyi/events{/privacy}", "received_events_url": "https://api.github.com/users/xiehuanyi/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-15T11:52:04
2025-10-17T00:50:03
null
NONE
null
null
null
null
## Problem Description Encountering abnormal GPU utilization patterns when training a CLIP model from scratch using DDP (Distributed Data Parallel). The monitoring charts show: - **GPU Memory Allocation**: All 8 GPUs maintain a stable ~10% memory allocation - **GPU Utilization**: Highly irregular fluctuation pattern, oscillating between 0-100% with significant idle periods This pattern suggests potential issues with: 1. GPU synchronization/waiting problems 2. Data loading bottleneck 3. Gradient accumulation/communication issues 4. Torch compile or Flash Attention compatibility problems ## Environment Information ```yaml Hardware: - GPU: 8x A100 - Node: Single node Software: - Framework: PyTorch + Transformers - Distributed: DDP (NCCL backend) - Mixed Precision: BF16 - Optimizer: AdamW Fused - Special Features: - torch_compile=True - flash_attention_2 - gradient_checkpointing=False ``` ## Training Configuration ```python # Key Parameters per_device_batch_size: 128 gradient_accumulation_steps: 1 (calculated) target_global_batch_size: 1024 learning_rate: 1e-3 dataloader_num_workers: 8 dataloader_prefetch_factor: 4 dataloader_persistent_workers: True ``` ## Reproduction Code <details> <summary>Complete Training Script (Click to expand)</summary> ```python # -*- coding: utf-8 -*- import os import torch from datasets import load_dataset from transformers import ( CLIPProcessor, CLIPModel, CLIPConfig, TrainingArguments, Trainer ) from utils import ( count_parameters, create_cc_dataset, parse_args, create_laion_dataset ) def is_main_process(): """Check if current process is main process""" return int(os.environ.get('LOCAL_RANK', 0)) == 0 def collate_fn(batch, processor): if 'text' in batch[0]: texts = [b['text'] for b in batch] elif 'conversations' in batch[0]: texts = [b['conversations'][1]['value'] for b in batch] images = [b['image'] for b in batch] return processor( text=texts, images=images, return_tensors="pt", padding=True, truncation=True, max_length=77 ) def 
collate_fn_laion(batch, processor): texts = [b['txt'] for b in batch] images = [b['jpg'] for b in batch] out = processor( text=texts, images=images, return_tensors="pt", padding="max_length", truncation=True, max_length=77 ) return out def train(args): world_size = int(os.environ.get('WORLD_SIZE', 1)) rank = int(os.environ.get('RANK', 0)) local_rank = int(os.environ.get('LOCAL_RANK', 0)) print(f"[Rank {rank}] World size: {world_size}, Local rank: {local_rank}") print(f"[Rank {rank}] CUDA devices: {torch.cuda.device_count()}") # Set random seed torch.manual_seed(args.seed) if is_main_process(): print("==== Config ====") for k, v in vars(args).items(): print(f"{k}: {v}") # Performance optimization: Enable TF32 torch.backends.cuda.matmul.allow_tf32 = True torch.backends.cudnn.allow_tf32 = True # Model initialization cfg = CLIPConfig.from_pretrained( args.target_model, _attn_implementation='flash_attention_2', torch_dtype=torch.bfloat16 ) model = CLIPModel(config=cfg) nparams = count_parameters(model) if is_main_process(): print(f"Model Parameters: {nparams}") # Initialize processor processor = CLIPProcessor.from_pretrained(args.target_model, use_fast=False) # Load dataset if is_main_process(): print("[info] Loading dataset ...") if "cc12m" in args.data_path: train_dataset = create_cc_dataset( "data/cc8m_metadata.jsonl", os.path.join(args.data_path, 'cc8m'), num_data=args.num_data ) dname = 'cc12m' elif 'mscoco' in args.data_path: train_dataset = load_dataset(args.data_path)['train'] dname = 'mscoco' elif 'laion' in args.data_path.lower(): train_dataset = create_laion_dataset(args.data_path, num_data=args.num_data) dname = 'laion' # Calculate parameters based on configuration # Target: BS=1024, LR=1e-3, Warmup=10K, Samples=3B total_gpus = world_size target_global_bs = 1024 per_device_bs = 128 gradient_accumulation = target_global_bs // (per_device_bs * total_gpus) learning_rate = 1e-3 warmup_steps = 10000 total_samples = 3_000_000_000 total_steps = total_samples // 
target_global_bs if is_main_process(): print(f"[info] Target global batch size: {target_global_bs}") print(f"[info] Per device batch size: {per_device_bs}") print(f"[info] Gradient accumulation steps: {gradient_accumulation}") print(f"[info] Total training steps: {total_steps}") print(f"[info] Warmup steps: {warmup_steps}") print(f"[info] Learning rate: {learning_rate}") # Run name for WandB run_name = f"{dname}-bs{target_global_bs}-lr{learning_rate}-{args.width}-{args.tower}-{args.backbone}" # Training arguments training_args = TrainingArguments( output_dir=args.save_dir, run_name=run_name, # Training configuration per_device_train_batch_size=per_device_bs, gradient_accumulation_steps=gradient_accumulation, max_steps=total_steps, # Optimizer configuration learning_rate=learning_rate, weight_decay=0.01, max_grad_norm=4.0, # Learning rate scheduler optim="adamw_torch_fused", lr_scheduler_type="cosine", warmup_steps=warmup_steps, # Mixed precision - BF16 for A100 fp16=False, bf16=True, bf16_full_eval=True, # Logging and saving logging_steps=100, logging_first_step=True, save_strategy="steps", save_steps=5000, save_total_limit=3, # Evaluation eval_strategy="no", load_best_model_at_end=False, # Data loading - Performance optimization dataloader_num_workers=8, dataloader_pin_memory=True, remove_unused_columns=False, dataloader_drop_last=True, dataloader_prefetch_factor=4, dataloader_persistent_workers=True, # Distributed training - Critical configuration ddp_backend='nccl', ddp_find_unused_parameters=False, ddp_bucket_cap_mb=25, local_rank=local_rank, # Other performance optimizations seed=args.seed, report_to="wandb" if is_main_process() else "none", gradient_checkpointing=False, save_on_each_node=False, torch_compile=True, # Suspicious point 1 tf32=True, ddp_broadcast_buffers=False, ) collate_fn_to_use = collate_fn_laion if dname == 'laion' else collate_fn # Create Trainer trainer = Trainer( model=model, args=training_args, train_dataset=train_dataset, 
data_collator=lambda x: collate_fn_to_use(x, processor=processor) ) model.train() trainer.train(resume_from_checkpoint=args.resume if args.resume else None) if rank == 0: final_path = os.path.join(args.save_dir, args.save_name) trainer.save_model(final_path) print(f"[info] Training finished. Final model saved to: {final_path}") def main(): args = parse_args() train(args) if __name__ == "__main__": main() ``` </details> <details> <summary>Launch Script (Click to expand)</summary> ```bash #!/bin/bash # Find a free port safely find_free_port() { for i in {1..50}; do port=$(( (RANDOM % 40000) + 20000 )) # 20000-59999 if ! ss -ltn | awk '{print $4}' | grep -q ":$port$"; then echo $port return 0 fi done echo "No free port found after 50 tries" >&2 return 1 } # Environment variables export OMP_NUM_THREADS=1 export PYTHONWARNINGS="ignore::FutureWarning" export TORCHDYNAMO_CAPTURE_SCALAR_OUTPUTS=1 echo "Training with learning rate: $lr" MASTER_ADDR=$(hostname) FREE_PORT=$(find_free_port) if [ $? -ne 0 ]; then echo "Error: Could not find a free port. Exiting." exit 1 fi echo "Using rendezvous: ${MASTER_ADDR}:${FREE_PORT}" torchrun \ --nproc_per_node=8 \ --rdzv_backend=c10d \ --rdzv_endpoint=${MASTER_ADDR}:${FREE_PORT} \ clip_hf.py \ --num_workers 32 \ --save_name main_t-cc12m-mup-lr${lr}.pt \ --save_dir $SAVE_DIR \ --data_path $DATA_PATH ``` </details> ## Observed Symptoms ### 1. Abnormally Low GPU Memory Usage - All 8 GPUs show only ~10% memory allocation - This is unusually low for CLIP training with batch_size=128 - May indicate data is not being properly distributed to GPUs ### 2. Severe GPU Utilization Fluctuations - Frequent oscillation between 0-100% - Significant idle cycles present - GPUs don't appear to be working synchronously ### 3. 
Potential Bottleneck Points - **Data Loading**: Despite 8 workers + prefetch, may still have I/O bottleneck - **DDP Synchronization**: Gradient synchronization between GPUs may have waiting issues - **Torch Compile**: Potential compatibility issues with DDP or Flash Attention - **Gradient Accumulation**: Although set to 1, may have abnormal interactions with other components ## Questions Needing Help 1. **What is the root cause of this GPU utilization pattern?** - Is it a data loading bottleneck? - Is it a DDP communication issue? - Is it a torch.compile compatibility issue with DDP/FA2? 2. **How to diagnose the specific bottleneck?** - What profiling code should be added? - What metrics should be monitored? 3. **Suggested optimization directions?** - Should `torch_compile` be disabled? - Should we switch back to standard attention? - Do dataloader parameters need adjustment? ## Expected Behavior - GPU utilization should maintain a stable 80-95% level - Memory usage should be higher (at least 30-50%) - All GPUs should work synchronously without obvious idle cycles ## Additional Information - Dataset: CC12M / LAION - Model: CLIP (training from scratch) - Goal: Reproduce OpenAI CLIP-like training configuration --- **Monitoring Screenshots:** <img width="479" height="653" alt="Image" src="https://github.com/user-attachments/assets/5c9caa60-1bb7-4eac-b321-73659c72e3a9" /> ) - Top chart: GPU Memory Allocation (~10% across all GPUs) - Bottom chart: GPU Utilization (severe fluctuation 0-100%) ## Possible Related Issues - Interaction between `torch.compile` and DDP - Flash Attention 2 compatibility with distributed training - Dataloader performance in multi-GPU setup - NCCL communication overhead Any insights or suggestions would be greatly appreciated!
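One way to answer the "what profiling code should be added?" question above is a minimal, stdlib-only timing harness that separates time spent blocked on the dataloader from time spent in the training step — if data-wait dominates, the GPUs idle between steps. This is a sketch, not the script's actual code: `profile_step_times` is a hypothetical helper, and the simulated loader/step below stand in for a real `DataLoader` and forward/backward pass.

```python
import time

def profile_step_times(batches, train_step, warmup=2):
    """Split each step into data-wait vs. compute time.

    `batches` is any iterable (e.g. a DataLoader); `train_step` consumes
    one batch. Returns (data_seconds, compute_seconds) summed over the
    measured steps, skipping `warmup` initial steps (worker spin-up).
    """
    data_s = compute_s = 0.0
    it = iter(batches)
    step = 0
    while True:
        t0 = time.perf_counter()
        try:
            batch = next(it)      # time spent blocked on the dataloader
        except StopIteration:
            break
        t1 = time.perf_counter()
        train_step(batch)         # time spent in forward/backward
        t2 = time.perf_counter()
        if step >= warmup:
            data_s += t1 - t0
            compute_s += t2 - t1
        step += 1
    return data_s, compute_s

# Simulated run: a "slow" loader paired with a fast compute step.
def slow_batches(n):
    for i in range(n):
        time.sleep(0.01)          # pretend I/O latency per batch
        yield i

data_s, compute_s = profile_step_times(slow_batches(10), lambda b: time.sleep(0.001))
print(f"data wait: {data_s:.3f}s  compute: {compute_s:.3f}s")
```

In a real run you would also call `torch.cuda.synchronize()` before each timestamp so GPU kernels are not counted as "free"; a data-wait share above a few percent per step is consistent with the 0-100% utilization oscillation described above.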
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41615/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41615/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41614
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41614/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41614/comments
https://api.github.com/repos/huggingface/transformers/issues/41614/events
https://github.com/huggingface/transformers/pull/41614
3,517,599,194
PR_kwDOCUB6oc6t3L39
41,614
[`Docs`] Fix changed references
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/followers", "following_url": "https://api.github.com/users/vasqu/following{/other_user}", "gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}", "starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vasqu/subscriptions", "organizations_url": "https://api.github.com/users/vasqu/orgs", "repos_url": "https://api.github.com/users/vasqu/repos", "events_url": "https://api.github.com/users/vasqu/events{/privacy}", "received_events_url": "https://api.github.com/users/vasqu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T11:29:13
2025-10-15T11:59:17
2025-10-15T11:59:13
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41614", "html_url": "https://github.com/huggingface/transformers/pull/41614", "diff_url": "https://github.com/huggingface/transformers/pull/41614.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41614.patch", "merged_at": "2025-10-15T11:59:13" }
#41445 moved `load_sharded_checkpoint` to the trainer utils instead of the modeling utils. This PR changes the remaining references and (hopefully) fixes the CI on main with the doc builder
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/followers", "following_url": "https://api.github.com/users/vasqu/following{/other_user}", "gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}", "starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vasqu/subscriptions", "organizations_url": "https://api.github.com/users/vasqu/orgs", "repos_url": "https://api.github.com/users/vasqu/repos", "events_url": "https://api.github.com/users/vasqu/events{/privacy}", "received_events_url": "https://api.github.com/users/vasqu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41614/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41614/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41613
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41613/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41613/comments
https://api.github.com/repos/huggingface/transformers/issues/41613/events
https://github.com/huggingface/transformers/pull/41613
3,517,586,889
PR_kwDOCUB6oc6t3JLR
41,613
Fix quantization base class
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T11:25:18
2025-10-15T14:58:19
2025-10-15T14:58:17
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41613", "html_url": "https://github.com/huggingface/transformers/pull/41613", "diff_url": "https://github.com/huggingface/transformers/pull/41613.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41613.patch", "merged_at": "2025-10-15T14:58:17" }
# What does this PR do? This PR fixes issues that were created by this [PR](https://github.com/huggingface/transformers/pull/41445) regarding quantization.
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41613/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41613/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41612
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41612/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41612/comments
https://api.github.com/repos/huggingface/transformers/issues/41612/events
https://github.com/huggingface/transformers/pull/41612
3,517,439,279
PR_kwDOCUB6oc6t2ojq
41,612
Fix EncoderDecoder cache
{ "login": "remi-or", "id": 83456801, "node_id": "MDQ6VXNlcjgzNDU2ODAx", "avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remi-or", "html_url": "https://github.com/remi-or", "followers_url": "https://api.github.com/users/remi-or/followers", "following_url": "https://api.github.com/users/remi-or/following{/other_user}", "gists_url": "https://api.github.com/users/remi-or/gists{/gist_id}", "starred_url": "https://api.github.com/users/remi-or/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/remi-or/subscriptions", "organizations_url": "https://api.github.com/users/remi-or/orgs", "repos_url": "https://api.github.com/users/remi-or/repos", "events_url": "https://api.github.com/users/remi-or/events{/privacy}", "received_events_url": "https://api.github.com/users/remi-or/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T10:39:31
2025-10-16T12:55:44
2025-10-16T12:55:42
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41612", "html_url": "https://github.com/huggingface/transformers/pull/41612", "diff_url": "https://github.com/huggingface/transformers/pull/41612.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41612.patch", "merged_at": "2025-10-16T12:55:42" }
In #41569 we restored the `__iter__` method to `DynamicCache`, but I missed the fact that it was also removed from `EncoderDecoderCache`. This PR fixes that and modifies the `__init__` of `EncoderDecoderCache` in the case of DDP, so it is compatible with the new system.
{ "login": "remi-or", "id": 83456801, "node_id": "MDQ6VXNlcjgzNDU2ODAx", "avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remi-or", "html_url": "https://github.com/remi-or", "followers_url": "https://api.github.com/users/remi-or/followers", "following_url": "https://api.github.com/users/remi-or/following{/other_user}", "gists_url": "https://api.github.com/users/remi-or/gists{/gist_id}", "starred_url": "https://api.github.com/users/remi-or/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/remi-or/subscriptions", "organizations_url": "https://api.github.com/users/remi-or/orgs", "repos_url": "https://api.github.com/users/remi-or/repos", "events_url": "https://api.github.com/users/remi-or/events{/privacy}", "received_events_url": "https://api.github.com/users/remi-or/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41612/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41612/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41611
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41611/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41611/comments
https://api.github.com/repos/huggingface/transformers/issues/41611/events
https://github.com/huggingface/transformers/pull/41611
3,517,424,685
PR_kwDOCUB6oc6t2lQC
41,611
Docs add custom loss example
{ "login": "CodersAcademy006", "id": 104912634, "node_id": "U_kgDOBkDW-g", "avatar_url": "https://avatars.githubusercontent.com/u/104912634?v=4", "gravatar_id": "", "url": "https://api.github.com/users/CodersAcademy006", "html_url": "https://github.com/CodersAcademy006", "followers_url": "https://api.github.com/users/CodersAcademy006/followers", "following_url": "https://api.github.com/users/CodersAcademy006/following{/other_user}", "gists_url": "https://api.github.com/users/CodersAcademy006/gists{/gist_id}", "starred_url": "https://api.github.com/users/CodersAcademy006/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/CodersAcademy006/subscriptions", "organizations_url": "https://api.github.com/users/CodersAcademy006/orgs", "repos_url": "https://api.github.com/users/CodersAcademy006/repos", "events_url": "https://api.github.com/users/CodersAcademy006/events{/privacy}", "received_events_url": "https://api.github.com/users/CodersAcademy006/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-15T10:36:09
2025-10-15T16:23:12
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41611", "html_url": "https://github.com/huggingface/transformers/pull/41611", "diff_url": "https://github.com/huggingface/transformers/pull/41611.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41611.patch", "merged_at": null }
This PR adds a new section to the `Trainer` documentation explaining the best practice for implementing a custom loss function. It provides a clear code example that shows how to subclass `Trainer` and override the `compute_loss` method. This addresses a common user question and makes the documentation more complete. Fixes #41247
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41611/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41611/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41610
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41610/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41610/comments
https://api.github.com/repos/huggingface/transformers/issues/41610/events
https://github.com/huggingface/transformers/pull/41610
3,517,414,676
PR_kwDOCUB6oc6t2i_J
41,610
torch 2.9 don't ❤️ torchcodec 💔
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T10:34:01
2025-10-17T08:46:17
2025-10-15T12:34:00
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41610", "html_url": "https://github.com/huggingface/transformers/pull/41610", "diff_url": "https://github.com/huggingface/transformers/pull/41610.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41610.patch", "merged_at": "2025-10-15T12:34:00" }
# What does this PR do? Let's pin torch 2.8 for now. See https://github.com/meta-pytorch/torchcodec/issues/912 It is causing problems on our CircleCI for now.
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41610/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41610/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41609
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41609/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41609/comments
https://api.github.com/repos/huggingface/transformers/issues/41609/events
https://github.com/huggingface/transformers/pull/41609
3,517,379,985
PR_kwDOCUB6oc6t2bTU
41,609
Fix gemma gguf tokenizer
{ "login": "CodersAcademy006", "id": 104912634, "node_id": "U_kgDOBkDW-g", "avatar_url": "https://avatars.githubusercontent.com/u/104912634?v=4", "gravatar_id": "", "url": "https://api.github.com/users/CodersAcademy006", "html_url": "https://github.com/CodersAcademy006", "followers_url": "https://api.github.com/users/CodersAcademy006/followers", "following_url": "https://api.github.com/users/CodersAcademy006/following{/other_user}", "gists_url": "https://api.github.com/users/CodersAcademy006/gists{/gist_id}", "starred_url": "https://api.github.com/users/CodersAcademy006/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/CodersAcademy006/subscriptions", "organizations_url": "https://api.github.com/users/CodersAcademy006/orgs", "repos_url": "https://api.github.com/users/CodersAcademy006/repos", "events_url": "https://api.github.com/users/CodersAcademy006/events{/privacy}", "received_events_url": "https://api.github.com/users/CodersAcademy006/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-15T10:25:44
2025-10-18T09:31:13
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41609", "html_url": "https://github.com/huggingface/transformers/pull/41609", "diff_url": "https://github.com/huggingface/transformers/pull/41609.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41609.patch", "merged_at": null }
This PR fixes an issue where `AutoTokenizer.from_pretrained` would instantiate an incorrect `Unigram` tokenizer when loading from a Gemma GGUF file, instead of the correct `BPE`-based `GemmaTokenizer`. The root cause was that the GGUF loading logic did not have an explicit check to handle the Gemma architecture specifically. This fix introduces a check that: 1. Detects if a `gguf_file` is being loaded. 2. Reads the GGUF metadata to confirm the architecture is `gemma` and the tokenizer model is `bpe`. 3. Overrides the `tokenizer_class` in the loaded configuration to `GemmaTokenizer`, forcing the rest of the function to instantiate the correct class. This ensures that loading a tokenizer from a Gemma GGUF file produces the same tokenization as loading from the original Hugging Face repository. **Fixes #41494**
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41609/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41609/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41608
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41608/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41608/comments
https://api.github.com/repos/huggingface/transformers/issues/41608/events
https://github.com/huggingface/transformers/pull/41608
3,517,318,335
PR_kwDOCUB6oc6t2Nmn
41,608
Import `expand_device_map` instead of redefining it
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T10:10:07
2025-10-15T12:00:11
2025-10-15T12:00:10
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41608", "html_url": "https://github.com/huggingface/transformers/pull/41608", "diff_url": "https://github.com/huggingface/transformers/pull/41608.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41608.patch", "merged_at": "2025-10-15T12:00:10" }
# What does this PR do? This function has been moved to integrations but was not deleted properly
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41608/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41608/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41607
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41607/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41607/comments
https://api.github.com/repos/huggingface/transformers/issues/41607/events
https://github.com/huggingface/transformers/pull/41607
3,517,268,838
PR_kwDOCUB6oc6t2CoO
41,607
[v5] Delete `videos` from image processing classes
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T09:56:46
2025-10-21T10:03:31
2025-10-21T10:03:31
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41607", "html_url": "https://github.com/huggingface/transformers/pull/41607", "diff_url": "https://github.com/huggingface/transformers/pull/41607.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41607.patch", "merged_at": "2025-10-21T10:03:31" }
# What does this PR do? As per title, it was deprecated for v5
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41607/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41607/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41606
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41606/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41606/comments
https://api.github.com/repos/huggingface/transformers/issues/41606/events
https://github.com/huggingface/transformers/pull/41606
3,517,230,458
PR_kwDOCUB6oc6t16MT
41,606
fix(processing): Filter kwargs in ProcessorMixin call to prevent Type…
{ "login": "CodersAcademy006", "id": 104912634, "node_id": "U_kgDOBkDW-g", "avatar_url": "https://avatars.githubusercontent.com/u/104912634?v=4", "gravatar_id": "", "url": "https://api.github.com/users/CodersAcademy006", "html_url": "https://github.com/CodersAcademy006", "followers_url": "https://api.github.com/users/CodersAcademy006/followers", "following_url": "https://api.github.com/users/CodersAcademy006/following{/other_user}", "gists_url": "https://api.github.com/users/CodersAcademy006/gists{/gist_id}", "starred_url": "https://api.github.com/users/CodersAcademy006/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/CodersAcademy006/subscriptions", "organizations_url": "https://api.github.com/users/CodersAcademy006/orgs", "repos_url": "https://api.github.com/users/CodersAcademy006/repos", "events_url": "https://api.github.com/users/CodersAcademy006/events{/privacy}", "received_events_url": "https://api.github.com/users/CodersAcademy006/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-15T09:44:53
2025-10-24T11:25:53
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41606", "html_url": "https://github.com/huggingface/transformers/pull/41606", "diff_url": "https://github.com/huggingface/transformers/pull/41606.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41606.patch", "merged_at": null }
This PR fixes an issue where multimodal processors could raise a `TypeError` when called with arguments specific to one modality. The `ProcessorMixin.__call__` method was passing all keyword arguments to each sub-processor (tokenizer, feature extractor, etc.), causing a crash if a processor received an argument it didn't recognize (e.g., `EncodecFeatureExtractor` receiving `pad_to_multiple_of` from the tokenizer). This fix resolves the problem by filtering the keyword arguments before they are passed to each sub-processor. It uses Python's `inspect` module to get the valid parameters for each processor's `__call__` method and only passes the arguments that match. This prevents `TypeError`s and makes the processor logic more robust. **Fixes #41598**
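The filtering described above can be sketched with `inspect.signature`: keep only the keyword arguments a sub-processor's `__call__` actually accepts, and pass everything through when it takes `**kwargs` itself. This is a hypothetical illustration (`filter_kwargs` and the toy callables are not the PR's actual code):

```python
import inspect

def filter_kwargs(func, kwargs):
    """Keep only the keyword arguments that `func` accepts.

    If `func` declares **kwargs itself, pass everything through.
    """
    params = inspect.signature(func).parameters
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return dict(kwargs)
    return {k: v for k, v in kwargs.items() if k in params}

# A tokenizer-like callable that knows `pad_to_multiple_of`,
# and a feature-extractor-like callable that does not.
def tokenize(text, pad_to_multiple_of=None):
    return {"text": text, "pad": pad_to_multiple_of}

def extract_features(audio, sampling_rate=16000):
    return {"audio": audio, "sr": sampling_rate}

shared = {"pad_to_multiple_of": 8, "sampling_rate": 22050}
tok_out = tokenize("hi", **filter_kwargs(tokenize, shared))
feat_out = extract_features([0.0], **filter_kwargs(extract_features, shared))
# Each callable receives only the kwargs it understands; no TypeError.
```

The trade-off of this design is silence: an argument misspelled or aimed at the wrong modality is dropped rather than raised, so a warning for discarded keys may be worth adding.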
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41606/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41606/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41605
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41605/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41605/comments
https://api.github.com/repos/huggingface/transformers/issues/41605/events
https://github.com/huggingface/transformers/pull/41605
3,517,229,429
PR_kwDOCUB6oc6t15-Q
41,605
Fix fp32_ln for various models
{ "login": "remi-or", "id": 83456801, "node_id": "MDQ6VXNlcjgzNDU2ODAx", "avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remi-or", "html_url": "https://github.com/remi-or", "followers_url": "https://api.github.com/users/remi-or/followers", "following_url": "https://api.github.com/users/remi-or/following{/other_user}", "gists_url": "https://api.github.com/users/remi-or/gists{/gist_id}", "starred_url": "https://api.github.com/users/remi-or/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/remi-or/subscriptions", "organizations_url": "https://api.github.com/users/remi-or/orgs", "repos_url": "https://api.github.com/users/remi-or/repos", "events_url": "https://api.github.com/users/remi-or/events{/privacy}", "received_events_url": "https://api.github.com/users/remi-or/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T09:44:32
2025-10-16T12:18:51
2025-10-16T12:18:50
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41605", "html_url": "https://github.com/huggingface/transformers/pull/41605", "diff_url": "https://github.com/huggingface/transformers/pull/41605.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41605.patch", "merged_at": "2025-10-16T12:18:50" }
This PR fixes the test `test_flash_attn_2_fp32_ln` for several models: - `bark` was failing the test because it calls `_flash_attention_forward` directly without checking the `queries` dtype, so the test could fail if the dtype was `torch.float32`. To fix this we refactored a code block into a function `get_target_dtype` that takes care of inferring whether to cast the fp32 tensor to fp16 or bf16, and added a call to it before the call to FA - same for `stablelm` - `mllama` was failing the test because `MllamaTextSelfAttention` lacks the `is_causal` attribute, which was added and set to True (it's a text attention so it's causal, as discussed in #39182) - same for `kosmos2`, but that test still fails for many other reasons The list of fixed tests is here: ``` FAILED tests/models/bark/test_modeling_bark.py::BarkSemanticModelTest::test_flash_attn_2_fp32_ln - RuntimeError: FlashAttention only support fp16 and bf16 data type FAILED tests/models/bark/test_modeling_bark.py::BarkCoarseModelTest::test_flash_attn_2_fp32_ln - RuntimeError: FlashAttention only support fp16 and bf16 data type FAILED tests/models/mllama/test_modeling_mllama.py::MllamaForCausalLMModelTest::test_flash_attn_2_fp32_ln - AttributeError: 'MllamaTextSelfAttention' object has no attribute 'is_causal' FAILED tests/models/mllama/test_modeling_mllama.py::MllamaForConditionalGenerationModelTest::test_flash_attn_2_fp32_ln - AttributeError: 'MllamaTextSelfAttention' object has no attribute 'is_causal' FAILED tests/models/stablelm/test_modeling_stablelm.py::StableLmModelTest::test_flash_attn_2_fp32_ln - RuntimeError: FlashAttention only support fp16 and bf16 data type ```
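The `get_target_dtype` refactor described above boils down to picking a FlashAttention-compatible dtype when the queries are fp32. A hypothetical pure-Python sketch of that decision (the function signature and the bf16-support check are assumptions for illustration, not the actual code):

```python
def get_target_dtype(query_dtype: str, bf16_supported: bool) -> str:
    """If the tensor is fp32 (unsupported by FlashAttention), pick a
    half-precision dtype to cast to; otherwise keep the dtype unchanged."""
    if query_dtype != "float32":
        return query_dtype
    # FlashAttention only supports fp16 and bf16; prefer bf16 when available.
    return "bfloat16" if bf16_supported else "float16"

print(get_target_dtype("float32", True))   # bfloat16
print(get_target_dtype("float16", False))  # float16
```

Calling this before `_flash_attention_forward` avoids the `RuntimeError: FlashAttention only support fp16 and bf16 data type` seen in the failing tests.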
{ "login": "remi-or", "id": 83456801, "node_id": "MDQ6VXNlcjgzNDU2ODAx", "avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remi-or", "html_url": "https://github.com/remi-or", "followers_url": "https://api.github.com/users/remi-or/followers", "following_url": "https://api.github.com/users/remi-or/following{/other_user}", "gists_url": "https://api.github.com/users/remi-or/gists{/gist_id}", "starred_url": "https://api.github.com/users/remi-or/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/remi-or/subscriptions", "organizations_url": "https://api.github.com/users/remi-or/orgs", "repos_url": "https://api.github.com/users/remi-or/repos", "events_url": "https://api.github.com/users/remi-or/events{/privacy}", "received_events_url": "https://api.github.com/users/remi-or/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41605/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41605/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41604
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41604/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41604/comments
https://api.github.com/repos/huggingface/transformers/issues/41604/events
https://github.com/huggingface/transformers/pull/41604
3,517,214,615
PR_kwDOCUB6oc6t12rw
41,604
Fix TypeError: find_adapter_config_file() got an unexpected keyword argument '_adapter_model_path'
{ "login": "albertvillanova", "id": 8515462, "node_id": "MDQ6VXNlcjg1MTU0NjI=", "avatar_url": "https://avatars.githubusercontent.com/u/8515462?v=4", "gravatar_id": "", "url": "https://api.github.com/users/albertvillanova", "html_url": "https://github.com/albertvillanova", "followers_url": "https://api.github.com/users/albertvillanova/followers", "following_url": "https://api.github.com/users/albertvillanova/following{/other_user}", "gists_url": "https://api.github.com/users/albertvillanova/gists{/gist_id}", "starred_url": "https://api.github.com/users/albertvillanova/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/albertvillanova/subscriptions", "organizations_url": "https://api.github.com/users/albertvillanova/orgs", "repos_url": "https://api.github.com/users/albertvillanova/repos", "events_url": "https://api.github.com/users/albertvillanova/events{/privacy}", "received_events_url": "https://api.github.com/users/albertvillanova/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T09:40:44
2025-10-25T07:26:38
2025-10-24T17:52:15
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41604", "html_url": "https://github.com/huggingface/transformers/pull/41604", "diff_url": "https://github.com/huggingface/transformers/pull/41604.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41604.patch", "merged_at": "2025-10-24T17:52:15" }
Fix TypeError: find_adapter_config_file() got an unexpected keyword argument '_adapter_model_path' - Pass the original dict instead of a copy to `maybe_load_adapters` `adapter_kwargs` is passed to `maybe_load_adapters` as unpacked keyword arguments: Python does not pass the original dictionary but builds a new dictionary inside the function, so the `pop` inside the function does not have the intended effect on the caller's dict. https://github.com/huggingface/transformers/blob/5db730786d87bde8397a207bd84ae0014054fc37/src/transformers/modeling_utils.py#L4383-L4387 ```python def maybe_load_adapters( pretrained_model_name_or_path, download_kwargs: DownloadKwargs, **adapter_kwargs, ): ... _adapter_model_path = adapter_kwargs.pop("_adapter_model_path", None) ``` This issue was introduced by PR: - #41445 Related downstream issue: - https://github.com/huggingface/trl/issues/4273
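The dict-copy behavior at the root of this bug can be demonstrated with a minimal, self-contained sketch (the function names here are illustrative, not the actual `transformers` code):

```python
def consume_unpacked(**kwargs):
    # **kwargs builds a NEW dict inside the function, so this pop
    # does not remove the key from the caller's dictionary.
    return kwargs.pop("_adapter_model_path", None)

def consume_dict(kwargs):
    # Receiving the dict object itself means pop mutates the caller's dict.
    return kwargs.pop("_adapter_model_path", None)

a = {"_adapter_model_path": "path/a", "other": 1}
consume_unpacked(**a)
print("_adapter_model_path" in a)  # True: key still present in caller's dict

b = {"_adapter_model_path": "path/b", "other": 1}
consume_dict(b)
print("_adapter_model_path" in b)  # False: pop removed it from caller's dict
```

This is why the fix passes the original dict rather than unpacking it: the internal `pop` must be visible to the caller, otherwise the stale key later reaches `find_adapter_config_file()` and raises the `TypeError`.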
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41604/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41604/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41603
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41603/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41603/comments
https://api.github.com/repos/huggingface/transformers/issues/41603/events
https://github.com/huggingface/transformers/pull/41603
3,517,096,958
PR_kwDOCUB6oc6t1c_F
41,603
Fix video processing channel format
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T09:08:58
2025-10-15T14:02:31
2025-10-15T13:48:01
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41603", "html_url": "https://github.com/huggingface/transformers/pull/41603", "diff_url": "https://github.com/huggingface/transformers/pull/41603.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41603.patch", "merged_at": "2025-10-15T13:48:01" }
# What does this PR do? Fixes the failing VJEPA test. The test was failing because the video was not permuted to the `(T, C, H, W)` format that torchvision expects
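The channel-order fix amounts to permuting a channels-last video tensor into the channels-first layout torchvision expects. A pure-Python sketch of the index permutation (the actual fix uses a tensor `permute`; the `(T, H, W, C)` input layout here is an assumption for illustration):

```python
def thwc_to_tchw_shape(shape):
    """Map a (T, H, W, C) shape tuple to (T, C, H, W), mirroring what
    tensor.permute(0, 3, 1, 2) does on a channels-last video tensor."""
    t, h, w, c = shape
    return (t, c, h, w)

# A 16-frame 224x224 RGB clip in channels-last layout:
print(thwc_to_tchw_shape((16, 224, 224, 3)))  # (16, 3, 224, 224)
```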
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41603/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41603/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41601
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41601/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41601/comments
https://api.github.com/repos/huggingface/transformers/issues/41601/events
https://github.com/huggingface/transformers/pull/41601
3,516,659,205
PR_kwDOCUB6oc6tz856
41,601
enable sdpa enable gqa logic for Ascend NPU
{ "login": "FightingZhen", "id": 26176607, "node_id": "MDQ6VXNlcjI2MTc2NjA3", "avatar_url": "https://avatars.githubusercontent.com/u/26176607?v=4", "gravatar_id": "", "url": "https://api.github.com/users/FightingZhen", "html_url": "https://github.com/FightingZhen", "followers_url": "https://api.github.com/users/FightingZhen/followers", "following_url": "https://api.github.com/users/FightingZhen/following{/other_user}", "gists_url": "https://api.github.com/users/FightingZhen/gists{/gist_id}", "starred_url": "https://api.github.com/users/FightingZhen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/FightingZhen/subscriptions", "organizations_url": "https://api.github.com/users/FightingZhen/orgs", "repos_url": "https://api.github.com/users/FightingZhen/repos", "events_url": "https://api.github.com/users/FightingZhen/events{/privacy}", "received_events_url": "https://api.github.com/users/FightingZhen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T07:08:01
2025-10-15T13:46:30
2025-10-15T13:45:29
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41601", "html_url": "https://github.com/huggingface/transformers/pull/41601", "diff_url": "https://github.com/huggingface/transformers/pull/41601.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41601.patch", "merged_at": "2025-10-15T13:45:29" }
# What does this PR do? As discussed [(link)](https://github.com/huggingface/transformers/pull/41143#issuecomment-3346569299) in #41143 , we forced `enable_gqa=False` when running on Ascend NPU. Since then, we ran a series of experiments to confirm that the `torch.nn.functional.scaled_dot_product_attention` API works correctly on Ascend NPU with `enable_gqa` set to either `True` or `False`, so this limitation can be removed now :) Note: experiments were based on `torch==2.5.1` and `torch_npu==2.5.1` Fixes # (issue) Not related. ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests?
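The `enable_gqa` flag lets SDPA accept more query heads than key/value heads by sharing each KV head across a contiguous group of query heads. A pure-Python sketch of that head mapping (illustrative only, not NPU-specific code; the helper name is hypothetical):

```python
def kv_head_for_query_head(q_head, num_q_heads, num_kv_heads):
    """Return the KV head index that serves a given query head under GQA.

    Query heads are split into num_kv_heads contiguous groups; this mirrors
    what enable_gqa=True computes implicitly (or what a repeat_interleave of
    the KV heads emulates when the flag is unavailable, as in the workaround
    previously forced on Ascend NPU).
    """
    assert num_q_heads % num_kv_heads == 0, "query heads must split evenly into KV groups"
    group_size = num_q_heads // num_kv_heads
    return q_head // group_size

# 8 query heads sharing 2 KV heads -> groups of 4
mapping = [kv_head_for_query_head(q, 8, 2) for q in range(8)]
print(mapping)  # [0, 0, 0, 0, 1, 1, 1, 1]
```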
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/followers", "following_url": "https://api.github.com/users/vasqu/following{/other_user}", "gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}", "starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vasqu/subscriptions", "organizations_url": "https://api.github.com/users/vasqu/orgs", "repos_url": "https://api.github.com/users/vasqu/repos", "events_url": "https://api.github.com/users/vasqu/events{/privacy}", "received_events_url": "https://api.github.com/users/vasqu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41601/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41601/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41600
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41600/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41600/comments
https://api.github.com/repos/huggingface/transformers/issues/41600/events
https://github.com/huggingface/transformers/issues/41600
3,516,602,859
I_kwDOCUB6oc7Rmxnr
41,600
torch can not be detected by transformers on aarch
{ "login": "HuangChiEn", "id": 52521165, "node_id": "MDQ6VXNlcjUyNTIxMTY1", "avatar_url": "https://avatars.githubusercontent.com/u/52521165?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HuangChiEn", "html_url": "https://github.com/HuangChiEn", "followers_url": "https://api.github.com/users/HuangChiEn/followers", "following_url": "https://api.github.com/users/HuangChiEn/following{/other_user}", "gists_url": "https://api.github.com/users/HuangChiEn/gists{/gist_id}", "starred_url": "https://api.github.com/users/HuangChiEn/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/HuangChiEn/subscriptions", "organizations_url": "https://api.github.com/users/HuangChiEn/orgs", "repos_url": "https://api.github.com/users/HuangChiEn/repos", "events_url": "https://api.github.com/users/HuangChiEn/events{/privacy}", "received_events_url": "https://api.github.com/users/HuangChiEn/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-10-15T06:49:29
2025-10-16T03:20:53
2025-10-16T03:20:53
NONE
null
null
null
null
### System Info transformers 4.55.0 python 3.10.12 platform aarch ### Who can help? @ivarflakstad I know AMD and ARM are different; unfortunately I cannot imagine who else could help with this issue. ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction NGC torch container: `nvcr.io/nvidia/pytorch:24.01-py3` `pip install transformers==4.55.0` `python -c "from transformers import AutoModel"` error: <img width="1282" height="777" alt="Image" src="https://github.com/user-attachments/assets/ff70e166-afe3-49a9-b8b1-bf41123aa496" /> I searched for this error; some said it occurs because torch is not installed. However, I checked that the NGC container does contain torch v2.2.0, as shown below. <img width="457" height="157" alt="Image" src="https://github.com/user-attachments/assets/32f1b3e5-ae57-4af3-9c64-d4134148db88" /> ### Expected behavior Successfully import AutoModel.
{ "login": "HuangChiEn", "id": 52521165, "node_id": "MDQ6VXNlcjUyNTIxMTY1", "avatar_url": "https://avatars.githubusercontent.com/u/52521165?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HuangChiEn", "html_url": "https://github.com/HuangChiEn", "followers_url": "https://api.github.com/users/HuangChiEn/followers", "following_url": "https://api.github.com/users/HuangChiEn/following{/other_user}", "gists_url": "https://api.github.com/users/HuangChiEn/gists{/gist_id}", "starred_url": "https://api.github.com/users/HuangChiEn/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/HuangChiEn/subscriptions", "organizations_url": "https://api.github.com/users/HuangChiEn/orgs", "repos_url": "https://api.github.com/users/HuangChiEn/repos", "events_url": "https://api.github.com/users/HuangChiEn/events{/privacy}", "received_events_url": "https://api.github.com/users/HuangChiEn/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41600/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41600/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/41599
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41599/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41599/comments
https://api.github.com/repos/huggingface/transformers/issues/41599/events
https://github.com/huggingface/transformers/pull/41599
3,516,143,002
PR_kwDOCUB6oc6tyPhY
41,599
More markdown file fixes
{ "login": "cyyever", "id": 17618148, "node_id": "MDQ6VXNlcjE3NjE4MTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cyyever", "html_url": "https://github.com/cyyever", "followers_url": "https://api.github.com/users/cyyever/followers", "following_url": "https://api.github.com/users/cyyever/following{/other_user}", "gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}", "starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cyyever/subscriptions", "organizations_url": "https://api.github.com/users/cyyever/orgs", "repos_url": "https://api.github.com/users/cyyever/repos", "events_url": "https://api.github.com/users/cyyever/events{/privacy}", "received_events_url": "https://api.github.com/users/cyyever/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T02:53:00
2025-10-15T13:59:29
2025-10-15T12:29:28
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41599", "html_url": "https://github.com/huggingface/transformers/pull/41599", "diff_url": "https://github.com/huggingface/transformers/pull/41599.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41599.patch", "merged_at": "2025-10-15T12:29:28" }
# What does this PR do? This PR applies markdownlint-cli2 to format and fix markdown files. <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> ## Before submitting - [X] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker @Cyrilvallez - vision models: @yonigozlan @molbap - audio models: @eustlb @ebezzam @vasqu - multimodal models: @zucchini-nlp - graph models: @clefourrier Library: - generate: @zucchini-nlp (visual-language models) or @gante (all others) - continuous batching: @remi-or @ArthurZucker @McPatate - pipelines: @Rocketknight1 - tokenizers: @ArthurZucker and @itazap - trainer: @zach-huggingface @SunMarc - attention: @vasqu @ArthurZucker @CyrilVallez - model loading (from pretrained, etc): @CyrilVallez - distributed: @3outeille @ArthurZucker @S1ro1 - CIs: @ydshieh Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber - kernels: @MekkCyber @drbh - peft: @BenjaminBossan @githubnemo Devices/Backends: - AMD ROCm: @ivarflakstad - Intel XPU: @IlyasMoutawwakil - Ascend NPU: @ivarflakstad Documentation: @stevhliu Research projects are not maintained and should be taken as is. -->
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41599/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41599/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41598
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41598/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41598/comments
https://api.github.com/repos/huggingface/transformers/issues/41598/events
https://github.com/huggingface/transformers/issues/41598
3,516,099,075
I_kwDOCUB6oc7Rk2oD
41,598
EncodecFeatureExtractor.__call__() got an unexpected keyword argument 'pad_to_multiple_of'
{ "login": "auli-aziz", "id": 109910388, "node_id": "U_kgDOBo0ZdA", "avatar_url": "https://avatars.githubusercontent.com/u/109910388?v=4", "gravatar_id": "", "url": "https://api.github.com/users/auli-aziz", "html_url": "https://github.com/auli-aziz", "followers_url": "https://api.github.com/users/auli-aziz/followers", "following_url": "https://api.github.com/users/auli-aziz/following{/other_user}", "gists_url": "https://api.github.com/users/auli-aziz/gists{/gist_id}", "starred_url": "https://api.github.com/users/auli-aziz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/auli-aziz/subscriptions", "organizations_url": "https://api.github.com/users/auli-aziz/orgs", "repos_url": "https://api.github.com/users/auli-aziz/repos", "events_url": "https://api.github.com/users/auli-aziz/events{/privacy}", "received_events_url": "https://api.github.com/users/auli-aziz/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
open
false
null
[]
null
[]
2025-10-15T02:23:50
2025-10-24T11:09:29
null
NONE
null
null
null
null
### System Info - `transformers` version: 4.57.0 - Platform: Windows-10-10.0.26200-SP0 - Python version: 3.10.18 - Huggingface_hub version: 0.35.3 - Safetensors version: 0.5.3 - Accelerate version: 1.10.1 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (accelerator?): 2.4.0+cpu (NA) - Tensorflow version (GPU?): not installed (NA) - Flax version (CPU?/GPU?/TPU?): not installed (NA) - Jax version: not installed - JaxLib version: not installed - Using distributed or parallel set-up in script?: no ### Who can help? Hello @eustlb @ebezzam @vasqu Just want to report a suspected bug. I keep getting an error when I try to run inference with the CSM-1B model using transformers with context. The error I got is `Error with generated segments: EncodecFeatureExtractor.__call__() got an unexpected keyword argument 'pad_to_multiple_of'`. Perhaps the issue lies in how the audio is stored (no error is raised when I apply_chat_template with a single text input). I tried many ways to store the audio, such as storing it as a torch.Tensor and referencing it from a saved wav file, but it still failed. Here is how I saved the generated_segments: ``` audio_file = f"temp_vad_recording_{int(time.time())}.wav" audio_tensor = torch.tensor(audio_data, dtype=torch.float32).unsqueeze(0) torchaudio.save(audio_file, audio_tensor, sample_rate) user_input = self.transcribe_audio(audio_file) self.generated_segments.append( { "role": "1", "content": [{"type": "text", "text": user_input}, {"type": "audio", "path": audio_file}], } ) ``` Here is how I applied the chat template: ``` inputs = self.tts_processor.apply_chat_template( self.generated_segments, tokenize=True, return_dict=True, ).to(self.tts_generator.device) ``` ### Information - [ ] The official example scripts - [x] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) 
- [x] My own task or dataset (give details below) ### Reproduction 1. Run the code using a VAD and Whisper for transcribing into text: ``` audio_file = f"temp_vad_recording_{int(time.time())}.wav" audio_tensor = torch.tensor(audio_data, dtype=torch.float32).unsqueeze(0) torchaudio.save(audio_file, audio_tensor, sample_rate) user_input = self.transcribe_audio(audio_file) self.generated_segments.append( { "role": "1", "content": [{"type": "text", "text": user_input}, {"type": "audio", "path": audio_file}], } ) ``` ### Expected behavior apply_chat_template should work when referencing local audio files in the path, as mentioned in the error `Incorrect format used for `audio`. Should be an url linking to an audio, a local path, or numpy array.`
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41598/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41598/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41597
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41597/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41597/comments
https://api.github.com/repos/huggingface/transformers/issues/41597/events
https://github.com/huggingface/transformers/pull/41597
3,516,084,194
PR_kwDOCUB6oc6tyDWr
41,597
Standardize RoBERTa model card following issue #36979
{ "login": "MithraVardhan", "id": 156972088, "node_id": "U_kgDOCVs0OA", "avatar_url": "https://avatars.githubusercontent.com/u/156972088?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MithraVardhan", "html_url": "https://github.com/MithraVardhan", "followers_url": "https://api.github.com/users/MithraVardhan/followers", "following_url": "https://api.github.com/users/MithraVardhan/following{/other_user}", "gists_url": "https://api.github.com/users/MithraVardhan/gists{/gist_id}", "starred_url": "https://api.github.com/users/MithraVardhan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MithraVardhan/subscriptions", "organizations_url": "https://api.github.com/users/MithraVardhan/orgs", "repos_url": "https://api.github.com/users/MithraVardhan/repos", "events_url": "https://api.github.com/users/MithraVardhan/events{/privacy}", "received_events_url": "https://api.github.com/users/MithraVardhan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-15T02:12:11
2025-10-15T03:34:14
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41597", "html_url": "https://github.com/huggingface/transformers/pull/41597", "diff_url": "https://github.com/huggingface/transformers/pull/41597.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41597.patch", "merged_at": null }
**What does this PR do?**

This PR standardizes the RoBERTa model card following the format established in issue #36979, making it more accessible and user-friendly.

**Changes made:**

✅ **Enhanced Badge Support**
- Added TensorFlow support (orange badge)
- Added Flax support (yellow badge)
- Maintained PyTorch and SDPA badges

✅ **Conversational Description**
- Rewrote in beginner-friendly tone: "RoBERTa is like BERT's smarter cousin"
- Explained key differences in simple terms
- Highlighted practical benefits for sentiment analysis and text classification

✅ **Practical Usage Examples**
- Added sentiment analysis examples with `cardiffnlp/twitter-roberta-base-sentiment-latest`
- Complete AutoModel workflow with confidence scores
- CLI example for command-line usage
- All examples are functional and tested

✅ **Contributor Attribution**
- Added tip box crediting original contributor [Joao Gante](https://huggingface.co/joaogante)

✅ **Comprehensive Resources Section**
- Original paper link (1907.11692)
- Official Facebook AI implementation
- Hugging Face blog posts and guides
- Training documentation links

✅ **Enhanced Notes Section**
- RoBERTa-specific technical details
- Dynamic masking explanation
- Byte-level BPE tokenizer benefits

**Before submitting:**
- [x] I have read the [contributing guidelines](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md)
- [x] I have tested the code examples for syntax errors
- [x] I have verified all links are valid
- [x] I have maintained all existing AutoClass documentation
- [x] I have followed the conversational tone guidelines

**References:**
- Follows the same pattern as PR #37261 (T5), #37585 (SigLIP), #37063 (ELECTRA)
- Addresses issue #36979

@stevhliu for review
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41597/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41597/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41596
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41596/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41596/comments
https://api.github.com/repos/huggingface/transformers/issues/41596/events
https://github.com/huggingface/transformers/pull/41596
3,516,076,631
PR_kwDOCUB6oc6tyB0f
41,596
Standardize roberta model card
{ "login": "MithraVardhan", "id": 156972088, "node_id": "U_kgDOCVs0OA", "avatar_url": "https://avatars.githubusercontent.com/u/156972088?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MithraVardhan", "html_url": "https://github.com/MithraVardhan", "followers_url": "https://api.github.com/users/MithraVardhan/followers", "following_url": "https://api.github.com/users/MithraVardhan/following{/other_user}", "gists_url": "https://api.github.com/users/MithraVardhan/gists{/gist_id}", "starred_url": "https://api.github.com/users/MithraVardhan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MithraVardhan/subscriptions", "organizations_url": "https://api.github.com/users/MithraVardhan/orgs", "repos_url": "https://api.github.com/users/MithraVardhan/repos", "events_url": "https://api.github.com/users/MithraVardhan/events{/privacy}", "received_events_url": "https://api.github.com/users/MithraVardhan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T02:06:29
2025-10-15T02:06:51
2025-10-15T02:06:51
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41596", "html_url": "https://github.com/huggingface/transformers/pull/41596", "diff_url": "https://github.com/huggingface/transformers/pull/41596.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41596.patch", "merged_at": null }
# What does this PR do?

<!--
Congratulations! You've made it this far! You're not quite done yet though.

Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.

Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.

Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->

<!-- Remove if not applicable -->

Fixes # (issue)

## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?

## Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.

<!--
Your PR will be replied to more quickly if you can figure out the right person to tag with @

If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.

Models:
- text models: @ArthurZucker @Cyrilvallez
- vision models: @yonigozlan @molbap
- audio models: @eustlb @ebezzam @vasqu
- multimodal models: @zucchini-nlp
- graph models: @clefourrier

Library:
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- continuous batching: @remi-or @ArthurZucker @McPatate
- pipelines: @Rocketknight1
- tokenizers: @ArthurZucker and @itazap
- trainer: @zach-huggingface @SunMarc
- attention: @vasqu @ArthurZucker @CyrilVallez
- model loading (from pretrained, etc): @CyrilVallez
- distributed: @3outeille @ArthurZucker @S1ro1
- CIs: @ydshieh

Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
- kernels: @MekkCyber @drbh
- peft: @BenjaminBossan @githubnemo

Devices/Backends:
- AMD ROCm: @ivarflakstad
- Intel XPU: @IlyasMoutawwakil
- Ascend NPU: @ivarflakstad

Documentation: @stevhliu

Research projects are not maintained and should be taken as is.
-->
{ "login": "MithraVardhan", "id": 156972088, "node_id": "U_kgDOCVs0OA", "avatar_url": "https://avatars.githubusercontent.com/u/156972088?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MithraVardhan", "html_url": "https://github.com/MithraVardhan", "followers_url": "https://api.github.com/users/MithraVardhan/followers", "following_url": "https://api.github.com/users/MithraVardhan/following{/other_user}", "gists_url": "https://api.github.com/users/MithraVardhan/gists{/gist_id}", "starred_url": "https://api.github.com/users/MithraVardhan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MithraVardhan/subscriptions", "organizations_url": "https://api.github.com/users/MithraVardhan/orgs", "repos_url": "https://api.github.com/users/MithraVardhan/repos", "events_url": "https://api.github.com/users/MithraVardhan/events{/privacy}", "received_events_url": "https://api.github.com/users/MithraVardhan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41596/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41596/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41595
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41595/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41595/comments
https://api.github.com/repos/huggingface/transformers/issues/41595/events
https://github.com/huggingface/transformers/pull/41595
3,516,065,887
PR_kwDOCUB6oc6tx_nk
41,595
Standardize roberta model card
{ "login": "MithraVardhan", "id": 156972088, "node_id": "U_kgDOCVs0OA", "avatar_url": "https://avatars.githubusercontent.com/u/156972088?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MithraVardhan", "html_url": "https://github.com/MithraVardhan", "followers_url": "https://api.github.com/users/MithraVardhan/followers", "following_url": "https://api.github.com/users/MithraVardhan/following{/other_user}", "gists_url": "https://api.github.com/users/MithraVardhan/gists{/gist_id}", "starred_url": "https://api.github.com/users/MithraVardhan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MithraVardhan/subscriptions", "organizations_url": "https://api.github.com/users/MithraVardhan/orgs", "repos_url": "https://api.github.com/users/MithraVardhan/repos", "events_url": "https://api.github.com/users/MithraVardhan/events{/privacy}", "received_events_url": "https://api.github.com/users/MithraVardhan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-15T01:58:47
2025-10-15T01:59:57
2025-10-15T01:59:57
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41595", "html_url": "https://github.com/huggingface/transformers/pull/41595", "diff_url": "https://github.com/huggingface/transformers/pull/41595.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41595.patch", "merged_at": null }
# What does this PR do?

<!--
Congratulations! You've made it this far! You're not quite done yet though.

Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.

Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.

Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->

<!-- Remove if not applicable -->

Fixes # (issue)

## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?

## Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.

<!--
Your PR will be replied to more quickly if you can figure out the right person to tag with @

If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.

Models:
- text models: @ArthurZucker @Cyrilvallez
- vision models: @yonigozlan @molbap
- audio models: @eustlb @ebezzam @vasqu
- multimodal models: @zucchini-nlp
- graph models: @clefourrier

Library:
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- continuous batching: @remi-or @ArthurZucker @McPatate
- pipelines: @Rocketknight1
- tokenizers: @ArthurZucker and @itazap
- trainer: @zach-huggingface @SunMarc
- attention: @vasqu @ArthurZucker @CyrilVallez
- model loading (from pretrained, etc): @CyrilVallez
- distributed: @3outeille @ArthurZucker @S1ro1
- CIs: @ydshieh

Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
- kernels: @MekkCyber @drbh
- peft: @BenjaminBossan @githubnemo

Devices/Backends:
- AMD ROCm: @ivarflakstad
- Intel XPU: @IlyasMoutawwakil
- Ascend NPU: @ivarflakstad

Documentation: @stevhliu

Research projects are not maintained and should be taken as is.
-->
{ "login": "MithraVardhan", "id": 156972088, "node_id": "U_kgDOCVs0OA", "avatar_url": "https://avatars.githubusercontent.com/u/156972088?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MithraVardhan", "html_url": "https://github.com/MithraVardhan", "followers_url": "https://api.github.com/users/MithraVardhan/followers", "following_url": "https://api.github.com/users/MithraVardhan/following{/other_user}", "gists_url": "https://api.github.com/users/MithraVardhan/gists{/gist_id}", "starred_url": "https://api.github.com/users/MithraVardhan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MithraVardhan/subscriptions", "organizations_url": "https://api.github.com/users/MithraVardhan/orgs", "repos_url": "https://api.github.com/users/MithraVardhan/repos", "events_url": "https://api.github.com/users/MithraVardhan/events{/privacy}", "received_events_url": "https://api.github.com/users/MithraVardhan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41595/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41595/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41594
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41594/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41594/comments
https://api.github.com/repos/huggingface/transformers/issues/41594/events
https://github.com/huggingface/transformers/pull/41594
3,515,715,374
PR_kwDOCUB6oc6tw1PT
41,594
Add beginner-friendly sentiment analysis example
{ "login": "PrajwalDambalkar", "id": 48172011, "node_id": "MDQ6VXNlcjQ4MTcyMDEx", "avatar_url": "https://avatars.githubusercontent.com/u/48172011?v=4", "gravatar_id": "", "url": "https://api.github.com/users/PrajwalDambalkar", "html_url": "https://github.com/PrajwalDambalkar", "followers_url": "https://api.github.com/users/PrajwalDambalkar/followers", "following_url": "https://api.github.com/users/PrajwalDambalkar/following{/other_user}", "gists_url": "https://api.github.com/users/PrajwalDambalkar/gists{/gist_id}", "starred_url": "https://api.github.com/users/PrajwalDambalkar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/PrajwalDambalkar/subscriptions", "organizations_url": "https://api.github.com/users/PrajwalDambalkar/orgs", "repos_url": "https://api.github.com/users/PrajwalDambalkar/repos", "events_url": "https://api.github.com/users/PrajwalDambalkar/events{/privacy}", "received_events_url": "https://api.github.com/users/PrajwalDambalkar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-14T22:31:06
2025-10-14T22:31:06
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41594", "html_url": "https://github.com/huggingface/transformers/pull/41594", "diff_url": "https://github.com/huggingface/transformers/pull/41594.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41594.patch", "merged_at": null }
## What does this PR do?

This PR adds a beginner-friendly sentiment analysis example to help newcomers learn the transformers library. The new example provides a simpler, more educational alternative to the existing `run_glue.py` and `run_classification.py` scripts.

## Motivation and Context

The current text-classification examples (`run_glue.py`, `run_classification.py`) are comprehensive but can be overwhelming for beginners. This educational example fills the gap by:
- **Simplifying the learning curve**: Clear step-by-step workflow with emoji markers
- **Educational focus**: Extensive comments and docstrings explaining each step
- **Quick experimentation**: Options to limit dataset size for faster testing
- **Complete workflow**: From data loading to predictions in one simple script

## Changes Made

### New Files
- **`run_simple_sentiment.py`**: Educational sentiment analysis example using DistilBERT on IMDB dataset
  - Command-line arguments for easy customization
  - Comprehensive logging at each step
  - Example predictions with confidence scores
  - ~300 lines with extensive documentation
- **`test_simple_sentiment.py`**: Unit tests covering:
  - Tokenizer and model loading
  - Preprocessing function
  - Model inference
  - Metrics computation
  - All 5 tests pass

### Modified Files
- **`README.md`**: Added "Simple Sentiment Analysis (Beginner-Friendly)" section at the top with:
  - Quick start examples
  - Usage instructions
  - Rationale for the example

## Testing

```bash
# All unit tests pass
python examples/pytorch/text-classification/test_simple_sentiment.py
# Result: 5/5 tests passed

# Tested with small dataset
python run_simple_sentiment.py --max_train_samples 500 --max_eval_samples 100 --num_train_epochs 1
# Result: Training and evaluation completed successfully
```

## Target Audience

This example is designed for:
- Data scientists new to transformers
- Students learning NLP with deep learning
- Developers wanting a quick introduction before diving into production scripts

## Example Usage

```bash
# Basic usage
python run_simple_sentiment.py

# Quick demo (faster)
python run_simple_sentiment.py --max_train_samples 1000 --max_eval_samples 200

# Custom model
python run_simple_sentiment.py --model_name_or_path bert-base-uncased
```

## Checklist

- [x] Added new beginner-friendly example
- [x] Comprehensive unit tests (5/5 passing)
- [x] Updated README.md with documentation
- [x] Followed transformers coding style
- [x] Clear comments and docstrings
- [x] Tested successfully with small dataset
- [x] Educational value for newcomers

## Additional Notes

This contribution was created as part of an open-source contribution exercise, focusing on adding educational value to the transformers library for newcomers.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41594/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41594/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41593
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41593/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41593/comments
https://api.github.com/repos/huggingface/transformers/issues/41593/events
https://github.com/huggingface/transformers/pull/41593
3,515,674,328
PR_kwDOCUB6oc6twsUj
41,593
examples: add multi-label text classification (BCEWithLogitsLoss, met…
{ "login": "aryaMehta26", "id": 91967116, "node_id": "U_kgDOBXtOjA", "avatar_url": "https://avatars.githubusercontent.com/u/91967116?v=4", "gravatar_id": "", "url": "https://api.github.com/users/aryaMehta26", "html_url": "https://github.com/aryaMehta26", "followers_url": "https://api.github.com/users/aryaMehta26/followers", "following_url": "https://api.github.com/users/aryaMehta26/following{/other_user}", "gists_url": "https://api.github.com/users/aryaMehta26/gists{/gist_id}", "starred_url": "https://api.github.com/users/aryaMehta26/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/aryaMehta26/subscriptions", "organizations_url": "https://api.github.com/users/aryaMehta26/orgs", "repos_url": "https://api.github.com/users/aryaMehta26/repos", "events_url": "https://api.github.com/users/aryaMehta26/events{/privacy}", "received_events_url": "https://api.github.com/users/aryaMehta26/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-14T22:12:00
2025-10-15T14:04:17
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41593", "html_url": "https://github.com/huggingface/transformers/pull/41593", "diff_url": "https://github.com/huggingface/transformers/pull/41593.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41593.patch", "merged_at": null }
…rics, threshold tuning) + README usage

# What does this PR do?

<!--
Congratulations! You've made it this far! You're not quite done yet though.

Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.

Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.

Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->

<!-- Remove if not applicable -->

Fixes # (issue)

## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?

## Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.

<!--
Your PR will be replied to more quickly if you can figure out the right person to tag with @

If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.

Models:
- text models: @ArthurZucker @Cyrilvallez
- vision models: @yonigozlan @molbap
- audio models: @eustlb @ebezzam @vasqu
- multimodal models: @zucchini-nlp
- graph models: @clefourrier

Library:
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- continuous batching: @remi-or @ArthurZucker @McPatate
- pipelines: @Rocketknight1
- tokenizers: @ArthurZucker and @itazap
- trainer: @zach-huggingface @SunMarc
- attention: @vasqu @ArthurZucker @CyrilVallez
- model loading (from pretrained, etc): @CyrilVallez
- distributed: @3outeille @ArthurZucker @S1ro1
- CIs: @ydshieh

Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
- kernels: @MekkCyber @drbh
- peft: @BenjaminBossan @githubnemo

Devices/Backends:
- AMD ROCm: @ivarflakstad
- Intel XPU: @IlyasMoutawwakil
- Ascend NPU: @ivarflakstad

Documentation: @stevhliu

Research projects are not maintained and should be taken as is.
-->
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41593/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41593/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41592
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41592/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41592/comments
https://api.github.com/repos/huggingface/transformers/issues/41592/events
https://github.com/huggingface/transformers/pull/41592
3,515,372,709
PR_kwDOCUB6oc6tvqBi
41,592
Improve AutoTokenizer error message for Voxtral models missing mistral-common
{ "login": "Khansa435", "id": 194109523, "node_id": "U_kgDOC5HgUw", "avatar_url": "https://avatars.githubusercontent.com/u/194109523?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Khansa435", "html_url": "https://github.com/Khansa435", "followers_url": "https://api.github.com/users/Khansa435/followers", "following_url": "https://api.github.com/users/Khansa435/following{/other_user}", "gists_url": "https://api.github.com/users/Khansa435/gists{/gist_id}", "starred_url": "https://api.github.com/users/Khansa435/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Khansa435/subscriptions", "organizations_url": "https://api.github.com/users/Khansa435/orgs", "repos_url": "https://api.github.com/users/Khansa435/repos", "events_url": "https://api.github.com/users/Khansa435/events{/privacy}", "received_events_url": "https://api.github.com/users/Khansa435/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-14T20:23:12
2025-10-29T20:08:20
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41592", "html_url": "https://github.com/huggingface/transformers/pull/41592", "diff_url": "https://github.com/huggingface/transformers/pull/41592.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41592.patch", "merged_at": null }
## What does this PR do?

Adds a clearer ImportError message when users try to load a Voxtral tokenizer without having `mistral-common` installed.

## Issue

Fixes #41553

Fixes misleading `TypeError: not a string` when loading Voxtral models.

## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?

## Who can review?

@ArthurZucker @Rocketknight1

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41592/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41592/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41591
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41591/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41591/comments
https://api.github.com/repos/huggingface/transformers/issues/41591/events
https://github.com/huggingface/transformers/pull/41591
3,515,045,590
PR_kwDOCUB6oc6tujCj
41,591
[docs] Duplicate entry
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T18:34:50
2025-10-15T15:02:39
2025-10-15T15:02:37
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41591", "html_url": "https://github.com/huggingface/transformers/pull/41591", "diff_url": "https://github.com/huggingface/transformers/pull/41591.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41591.patch", "merged_at": "2025-10-15T15:02:37" }
Fixes a duplicate entry in the `toctree` that may be breaking the `custom-doc-build` job for translations (https://github.com/huggingface/transformers/actions/runs/18381182160/job/52725830409)
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41591/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41591/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41590
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41590/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41590/comments
https://api.github.com/repos/huggingface/transformers/issues/41590/events
https://github.com/huggingface/transformers/pull/41590
3,514,717,374
PR_kwDOCUB6oc6ttcUb
41,590
Add Backbone API fine-tuning tutorial
{ "login": "merveenoyan", "id": 53175384, "node_id": "MDQ6VXNlcjUzMTc1Mzg0", "avatar_url": "https://avatars.githubusercontent.com/u/53175384?v=4", "gravatar_id": "", "url": "https://api.github.com/users/merveenoyan", "html_url": "https://github.com/merveenoyan", "followers_url": "https://api.github.com/users/merveenoyan/followers", "following_url": "https://api.github.com/users/merveenoyan/following{/other_user}", "gists_url": "https://api.github.com/users/merveenoyan/gists{/gist_id}", "starred_url": "https://api.github.com/users/merveenoyan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/merveenoyan/subscriptions", "organizations_url": "https://api.github.com/users/merveenoyan/orgs", "repos_url": "https://api.github.com/users/merveenoyan/repos", "events_url": "https://api.github.com/users/merveenoyan/events{/privacy}", "received_events_url": "https://api.github.com/users/merveenoyan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T16:42:34
2025-10-15T16:42:35
2025-10-15T16:42:33
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41590", "html_url": "https://github.com/huggingface/transformers/pull/41590", "diff_url": "https://github.com/huggingface/transformers/pull/41590.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41590.patch", "merged_at": "2025-10-15T16:42:33" }
We have recently merged this PR to revive the backbone API, using DINOv3 as an opportunity to do so https://github.com/huggingface/transformers/pull/40651 So I have written documentation. I need to add the resulting image before merging. For this I want to train the model a bit more since bboxes are slightly shifted and it annoyed me 😄 the model can be better. In the meantime @stevhliu can you please review wording etc? 💗
{ "login": "merveenoyan", "id": 53175384, "node_id": "MDQ6VXNlcjUzMTc1Mzg0", "avatar_url": "https://avatars.githubusercontent.com/u/53175384?v=4", "gravatar_id": "", "url": "https://api.github.com/users/merveenoyan", "html_url": "https://github.com/merveenoyan", "followers_url": "https://api.github.com/users/merveenoyan/followers", "following_url": "https://api.github.com/users/merveenoyan/following{/other_user}", "gists_url": "https://api.github.com/users/merveenoyan/gists{/gist_id}", "starred_url": "https://api.github.com/users/merveenoyan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/merveenoyan/subscriptions", "organizations_url": "https://api.github.com/users/merveenoyan/orgs", "repos_url": "https://api.github.com/users/merveenoyan/repos", "events_url": "https://api.github.com/users/merveenoyan/events{/privacy}", "received_events_url": "https://api.github.com/users/merveenoyan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41590/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41590/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41589
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41589/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41589/comments
https://api.github.com/repos/huggingface/transformers/issues/41589/events
https://github.com/huggingface/transformers/pull/41589
3,514,654,873
PR_kwDOCUB6oc6ttOyb
41,589
Allow VLMs to have a correct `base_model`
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-14T16:24:25
2025-10-20T09:31:14
null
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41589", "html_url": "https://github.com/huggingface/transformers/pull/41589", "diff_url": "https://github.com/huggingface/transformers/pull/41589.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41589.patch", "merged_at": null }
# What does this PR do? As per title, being able to simply call `model.base_model` is a useful feat for models which we didn't support in VLMs
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41589/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41589/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41588
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41588/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41588/comments
https://api.github.com/repos/huggingface/transformers/issues/41588/events
https://github.com/huggingface/transformers/pull/41588
3,514,648,522
PR_kwDOCUB6oc6ttNZ6
41,588
Remove `use_auth_token`
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T16:22:41
2025-10-17T13:03:00
2025-10-17T13:03:00
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41588", "html_url": "https://github.com/huggingface/transformers/pull/41588", "diff_url": "https://github.com/huggingface/transformers/pull/41588.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41588.patch", "merged_at": null }
# What does this PR do? This PR removes `use_auth_token` for v5
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41588/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41588/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41587
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41587/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41587/comments
https://api.github.com/repos/huggingface/transformers/issues/41587/events
https://github.com/huggingface/transformers/pull/41587
3,514,571,397
PR_kwDOCUB6oc6ts8Sv
41,587
remove ray_scope and check_quantized_param
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T16:02:30
2025-10-15T11:10:38
2025-10-15T11:10:35
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41587", "html_url": "https://github.com/huggingface/transformers/pull/41587", "diff_url": "https://github.com/huggingface/transformers/pull/41587.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41587.patch", "merged_at": "2025-10-15T11:10:35" }
# What does this PR do? This PR removes a few things that we don't need in v5.
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41587/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41587/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41586
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41586/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41586/comments
https://api.github.com/repos/huggingface/transformers/issues/41586/events
https://github.com/huggingface/transformers/pull/41586
3,514,561,936
PR_kwDOCUB6oc6ts6Mv
41,586
Add fast path for bidirectional mask creation to fix regression
{ "login": "i3hz", "id": 144821361, "node_id": "U_kgDOCKHMcQ", "avatar_url": "https://avatars.githubusercontent.com/u/144821361?v=4", "gravatar_id": "", "url": "https://api.github.com/users/i3hz", "html_url": "https://github.com/i3hz", "followers_url": "https://api.github.com/users/i3hz/followers", "following_url": "https://api.github.com/users/i3hz/following{/other_user}", "gists_url": "https://api.github.com/users/i3hz/gists{/gist_id}", "starred_url": "https://api.github.com/users/i3hz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/i3hz/subscriptions", "organizations_url": "https://api.github.com/users/i3hz/orgs", "repos_url": "https://api.github.com/users/i3hz/repos", "events_url": "https://api.github.com/users/i3hz/events{/privacy}", "received_events_url": "https://api.github.com/users/i3hz/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T16:00:10
2025-10-15T14:14:19
2025-10-15T13:30:39
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41586", "html_url": "https://github.com/huggingface/transformers/pull/41586", "diff_url": "https://github.com/huggingface/transformers/pull/41586.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41586.patch", "merged_at": "2025-10-15T13:30:39" }
# What does this PR do? This PR fixes the performance regression due to bidirectional masks. Fixes # (issue): 41566 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? @vasqu @Cyrilvallez
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41586/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41586/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41585
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41585/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41585/comments
https://api.github.com/repos/huggingface/transformers/issues/41585/events
https://github.com/huggingface/transformers/pull/41585
3,514,408,526
PR_kwDOCUB6oc6tsZA3
41,585
[Trainer] [Breaking change] `use_cache` default to `False`
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T15:19:16
2025-10-16T16:51:38
2025-10-16T16:51:37
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41585", "html_url": "https://github.com/huggingface/transformers/pull/41585", "diff_url": "https://github.com/huggingface/transformers/pull/41585.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41585.patch", "merged_at": "2025-10-16T16:51:36" }
# What does this PR do? This PR adds a new arg that allows users to change the value of `use_cache` in the model config. For training, usually we don't need to use a cache and this created quite a lot of issues in the past (incompatibility, regression, lower perfs). Now that v5 is coming, let's do this change. cc @gante
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41585/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 2, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41585/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41584
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41584/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41584/comments
https://api.github.com/repos/huggingface/transformers/issues/41584/events
https://github.com/huggingface/transformers/pull/41584
3,514,265,577
PR_kwDOCUB6oc6tr5uh
41,584
Add clear error message for missing SentencePiece model in `get_spm_processor` (fix #41553)
{ "login": "AvinashDwivedi", "id": 86379589, "node_id": "MDQ6VXNlcjg2Mzc5NTg5", "avatar_url": "https://avatars.githubusercontent.com/u/86379589?v=4", "gravatar_id": "", "url": "https://api.github.com/users/AvinashDwivedi", "html_url": "https://github.com/AvinashDwivedi", "followers_url": "https://api.github.com/users/AvinashDwivedi/followers", "following_url": "https://api.github.com/users/AvinashDwivedi/following{/other_user}", "gists_url": "https://api.github.com/users/AvinashDwivedi/gists{/gist_id}", "starred_url": "https://api.github.com/users/AvinashDwivedi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/AvinashDwivedi/subscriptions", "organizations_url": "https://api.github.com/users/AvinashDwivedi/orgs", "repos_url": "https://api.github.com/users/AvinashDwivedi/repos", "events_url": "https://api.github.com/users/AvinashDwivedi/events{/privacy}", "received_events_url": "https://api.github.com/users/AvinashDwivedi/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-14T14:44:34
2025-10-16T13:14:52
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41584", "html_url": "https://github.com/huggingface/transformers/pull/41584", "diff_url": "https://github.com/huggingface/transformers/pull/41584.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41584.patch", "merged_at": null }
Fix: clearer error when SentencePiece model file is missing for Llama tokenizer Fix: Improved error handling in get_spm_processor to provide a clear, actionable message when the required SentencePiece model file is missing. Before: The tokenizer raised an unhelpful low-level ```python TypeError: not a string ``` from the SentencePiece library when vocab_file was None. After: It now raises a concise, human-readable error: ```python ValueError: Missing SentencePiece model file: 'None'. This tokenizer requires a .model file provided by the 'mistral-common' package. Install it with: pip install mistral-common ``` Fixes: #41553 <img width="2000" height="1346" alt="Screenshot 2025-10-14 191309" src="https://github.com/user-attachments/assets/cb23a855-40d9-4d1b-8997-1138936c3eb9" />
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41584/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41584/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41583
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41583/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41583/comments
https://api.github.com/repos/huggingface/transformers/issues/41583/events
https://github.com/huggingface/transformers/pull/41583
3,514,192,027
PR_kwDOCUB6oc6trp3l
41,583
[kernels] Fix XPU layernorm kernel
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-14T14:25:18
2025-10-24T09:53:34
null
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41583", "html_url": "https://github.com/huggingface/transformers/pull/41583", "diff_url": "https://github.com/huggingface/transformers/pull/41583.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41583.patch", "merged_at": null }
# What does this PR do? Adds the xpu layernorm kernel after pinning the kernel version, and making sure the user has the latest version before registering the mapping that includes the xpu device.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41583/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41583/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41582
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41582/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41582/comments
https://api.github.com/repos/huggingface/transformers/issues/41582/events
https://github.com/huggingface/transformers/pull/41582
3,514,173,063
PR_kwDOCUB6oc6trlxR
41,582
Update executorch.md
{ "login": "jackzhxng", "id": 32371937, "node_id": "MDQ6VXNlcjMyMzcxOTM3", "avatar_url": "https://avatars.githubusercontent.com/u/32371937?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jackzhxng", "html_url": "https://github.com/jackzhxng", "followers_url": "https://api.github.com/users/jackzhxng/followers", "following_url": "https://api.github.com/users/jackzhxng/following{/other_user}", "gists_url": "https://api.github.com/users/jackzhxng/gists{/gist_id}", "starred_url": "https://api.github.com/users/jackzhxng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jackzhxng/subscriptions", "organizations_url": "https://api.github.com/users/jackzhxng/orgs", "repos_url": "https://api.github.com/users/jackzhxng/repos", "events_url": "https://api.github.com/users/jackzhxng/events{/privacy}", "received_events_url": "https://api.github.com/users/jackzhxng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T14:20:21
2025-10-15T16:01:47
2025-10-15T16:01:46
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41582", "html_url": "https://github.com/huggingface/transformers/pull/41582", "diff_url": "https://github.com/huggingface/transformers/pull/41582.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41582.patch", "merged_at": "2025-10-15T16:01:46" }
# What does this PR do? Update ExecuTorch docs, keep things simple and just redirect to Optimum repo. ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? @ArthurZucker @stevhliu
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41582/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41582/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41581
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41581/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41581/comments
https://api.github.com/repos/huggingface/transformers/issues/41581/events
https://github.com/huggingface/transformers/pull/41581
3,514,065,999
PR_kwDOCUB6oc6trOWv
41,581
Revert some breaking changes bnb
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T13:54:11
2025-10-14T14:28:18
2025-10-14T14:28:16
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41581", "html_url": "https://github.com/huggingface/transformers/pull/41581", "diff_url": "https://github.com/huggingface/transformers/pull/41581.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41581.patch", "merged_at": "2025-10-14T14:28:16" }
# What does this PR do? This PR fixes issues related to bnb during a recent refactor. I didn't push some modifications
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41581/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41581/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41580
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41580/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41580/comments
https://api.github.com/repos/huggingface/transformers/issues/41580/events
https://github.com/huggingface/transformers/pull/41580
3,513,997,457
PR_kwDOCUB6oc6tq_K9
41,580
Refactor weight loading
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-14T13:38:31
2025-10-30T07:55:46
null
COLLABORATOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41580", "html_url": "https://github.com/huggingface/transformers/pull/41580", "diff_url": "https://github.com/huggingface/transformers/pull/41580.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41580.patch", "merged_at": null }
# CORE REFACTORING, loading, converting, logging More helpful debugging report when loading weights <img width="2614" height="538" alt="image" src="https://github.com/user-attachments/assets/02df278f-10e0-4ca1-a35c-b09ee3fc67f9" /> ## Enable MoE quantization for FP8 This script does not work on main ```python import torch from transformers import MixtralForCausalLM, AutoTokenizer, FineGrainedFP8Config import time quantization_config = FineGrainedFP8Config(modules_to_not_convert=["model.layers.*.mlp.gate"]) model = MixtralForCausalLM.from_pretrained("mistralai/Mixtral-8x7B-v0.1", quantization_config=quantization_config, tp_plan="auto") ``` ## Enable TP + MoE without OOM This script does not work on main ```python model = MixtralForCausalLM.from_pretrained("mistralai/Mixtral-8x7B-v0.1", tp_plan="auto") ``` ## Enable `device_map="auto"` + MoE + FP8 This script does not work on main ```python quantization_config = FineGrainedFP8Config(modules_to_not_convert=["model.layers.*.mlp.gate"]) model = MixtralForCausalLM.from_pretrained("mistralai/Mixtral-8x7B-v0.1", quantization_config=quantization_config, device_map="auto") ``` ## Refactor the way we load weights, faster, flexible and better overall ### Uses staging buffers per conversion op - [x] 4x speedup with `device_map="auto"` - [x] Full `MoE` quantization with FP8 TODOS: - [x] Test with TP / EP - [x] Add TQDM! 
- [ ] Test with deepspeed - [ ] Test with loras and peft - [ ] Test with vllm backend - [ ] Test with fsdp - [ ] Add saving Script: ```python import torch from torch import nn from transformers import MixtralForCausalLM, AutoTokenizer import time start = time.time() model = MixtralForCausalLM.from_pretrained("mistralai/Mixtral-8x7B-v0.1", device_map="auto") end = time.time() print("loading took ", end-start) tokenizer = AutoTokenizer.from_pretrained("mistralai/Mixtral-8x7B-v0.1") inputs = tokenizer("hey how are you?", return_tensors="pt").to(model.device) out = model.generate(**inputs, max_new_tokens=16) print(tokenizer.batch_decode(out)) ``` ```bash loading took 14.271092891693115 ['<s> hey how are you?\n\nI am a 20 year old male and I have been having'] ``` ⬆️ is with: merge modulelist, concat gate_up ⬇️ is naive loading. ```bash loading took 54.271092891693115 ['<s> hey how are you?\n\nI am a 20 year old male and I have been having'] ```
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41580/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 2 }
https://api.github.com/repos/huggingface/transformers/issues/41580/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41579
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41579/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41579/comments
https://api.github.com/repos/huggingface/transformers/issues/41579/events
https://github.com/huggingface/transformers/pull/41579
3,513,988,710
PR_kwDOCUB6oc6tq9O7
41,579
Revert "add rmsnorm kernels support for Intel XPU"
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T13:36:42
2025-10-14T14:01:12
2025-10-14T13:49:33
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41579", "html_url": "https://github.com/huggingface/transformers/pull/41579", "diff_url": "https://github.com/huggingface/transformers/pull/41579.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41579.patch", "merged_at": "2025-10-14T13:49:33" }
Reverts huggingface/transformers#41563 Reverting to check what's happening
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41579/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41579/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41578
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41578/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41578/comments
https://api.github.com/repos/huggingface/transformers/issues/41578/events
https://github.com/huggingface/transformers/pull/41578
3,513,925,441
PR_kwDOCUB6oc6tqvNM
41,578
Fix import in quantization
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T13:22:01
2025-10-14T13:25:12
2025-10-14T13:25:12
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41578", "html_url": "https://github.com/huggingface/transformers/pull/41578", "diff_url": "https://github.com/huggingface/transformers/pull/41578.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41578.patch", "merged_at": null }
# What does this PR do? This PR fixes a small import error that was introduced in https://github.com/huggingface/transformers/issues/41445.
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41578/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41578/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41577
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41577/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41577/comments
https://api.github.com/repos/huggingface/transformers/issues/41577/events
https://github.com/huggingface/transformers/pull/41577
3,513,822,939
PR_kwDOCUB6oc6tqYmX
41,577
[kernels] refactor function kernel calling
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T12:58:37
2025-10-16T13:43:05
2025-10-16T13:43:03
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41577", "html_url": "https://github.com/huggingface/transformers/pull/41577", "diff_url": "https://github.com/huggingface/transformers/pull/41577.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41577.patch", "merged_at": "2025-10-16T13:43:03" }
# What does this PR do? This should simplify lazy kernel loading in Transformers. We simply define a mapping between each kernel name and the repository it should be pulled from, then load it using the `lazy_load_kernel` function. This function adds the kernel to a global cache shared across all models. If the kernel isn’t available, we check whether it’s installed as a module for backward compatibility; otherwise, we return `None`.
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41577/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41577/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41576
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41576/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41576/comments
https://api.github.com/repos/huggingface/transformers/issues/41576/events
https://github.com/huggingface/transformers/pull/41576
3,513,802,563
PR_kwDOCUB6oc6tqULG
41,576
Add missing dates to docs
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T12:53:10
2025-10-16T16:02:15
2025-10-16T09:32:28
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41576", "html_url": "https://github.com/huggingface/transformers/pull/41576", "diff_url": "https://github.com/huggingface/transformers/pull/41576.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41576.patch", "merged_at": "2025-10-16T09:32:28" }
Not sure how these slipped through; this should have broken repo-consistency. I'll investigate.
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41576/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41576/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41575
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41575/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41575/comments
https://api.github.com/repos/huggingface/transformers/issues/41575/events
https://github.com/huggingface/transformers/pull/41575
3,513,651,490
PR_kwDOCUB6oc6tpzCS
41,575
Fixed : Bad error message for AutoTokenizer loading Voxtral #41553
{ "login": "i3hz", "id": 144821361, "node_id": "U_kgDOCKHMcQ", "avatar_url": "https://avatars.githubusercontent.com/u/144821361?v=4", "gravatar_id": "", "url": "https://api.github.com/users/i3hz", "html_url": "https://github.com/i3hz", "followers_url": "https://api.github.com/users/i3hz/followers", "following_url": "https://api.github.com/users/i3hz/following{/other_user}", "gists_url": "https://api.github.com/users/i3hz/gists{/gist_id}", "starred_url": "https://api.github.com/users/i3hz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/i3hz/subscriptions", "organizations_url": "https://api.github.com/users/i3hz/orgs", "repos_url": "https://api.github.com/users/i3hz/repos", "events_url": "https://api.github.com/users/i3hz/events{/privacy}", "received_events_url": "https://api.github.com/users/i3hz/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T12:12:47
2025-10-15T14:12:34
2025-10-15T14:12:33
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41575", "html_url": "https://github.com/huggingface/transformers/pull/41575", "diff_url": "https://github.com/huggingface/transformers/pull/41575.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41575.patch", "merged_at": null }
Fixes #41553 Added a small check in both src/transformers/models/llama/tokenization_llama_fast.py and src/transformers/models/llama/tokenization_llama.py. Who can review? @ArthurZucker Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
{ "login": "i3hz", "id": 144821361, "node_id": "U_kgDOCKHMcQ", "avatar_url": "https://avatars.githubusercontent.com/u/144821361?v=4", "gravatar_id": "", "url": "https://api.github.com/users/i3hz", "html_url": "https://github.com/i3hz", "followers_url": "https://api.github.com/users/i3hz/followers", "following_url": "https://api.github.com/users/i3hz/following{/other_user}", "gists_url": "https://api.github.com/users/i3hz/gists{/gist_id}", "starred_url": "https://api.github.com/users/i3hz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/i3hz/subscriptions", "organizations_url": "https://api.github.com/users/i3hz/orgs", "repos_url": "https://api.github.com/users/i3hz/repos", "events_url": "https://api.github.com/users/i3hz/events{/privacy}", "received_events_url": "https://api.github.com/users/i3hz/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41575/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41575/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41574
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41574/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41574/comments
https://api.github.com/repos/huggingface/transformers/issues/41574/events
https://github.com/huggingface/transformers/pull/41574
3,513,538,899
PR_kwDOCUB6oc6tpaCf
41,574
Fixed : Bad error message for AutoTokenizer loading Voxtral #41553
{ "login": "i3hz", "id": 144821361, "node_id": "U_kgDOCKHMcQ", "avatar_url": "https://avatars.githubusercontent.com/u/144821361?v=4", "gravatar_id": "", "url": "https://api.github.com/users/i3hz", "html_url": "https://github.com/i3hz", "followers_url": "https://api.github.com/users/i3hz/followers", "following_url": "https://api.github.com/users/i3hz/following{/other_user}", "gists_url": "https://api.github.com/users/i3hz/gists{/gist_id}", "starred_url": "https://api.github.com/users/i3hz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/i3hz/subscriptions", "organizations_url": "https://api.github.com/users/i3hz/orgs", "repos_url": "https://api.github.com/users/i3hz/repos", "events_url": "https://api.github.com/users/i3hz/events{/privacy}", "received_events_url": "https://api.github.com/users/i3hz/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T11:42:23
2025-10-14T12:12:27
2025-10-14T12:08:58
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41574", "html_url": "https://github.com/huggingface/transformers/pull/41574", "diff_url": "https://github.com/huggingface/transformers/pull/41574.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41574.patch", "merged_at": null }
# What does this PR do? Fixes #41553 Added a small check in both src/transformers/models/llama/tokenization_llama_fast.py and src/transformers/models/llama/tokenization_llama.py. ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? @ArthurZucker Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
{ "login": "i3hz", "id": 144821361, "node_id": "U_kgDOCKHMcQ", "avatar_url": "https://avatars.githubusercontent.com/u/144821361?v=4", "gravatar_id": "", "url": "https://api.github.com/users/i3hz", "html_url": "https://github.com/i3hz", "followers_url": "https://api.github.com/users/i3hz/followers", "following_url": "https://api.github.com/users/i3hz/following{/other_user}", "gists_url": "https://api.github.com/users/i3hz/gists{/gist_id}", "starred_url": "https://api.github.com/users/i3hz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/i3hz/subscriptions", "organizations_url": "https://api.github.com/users/i3hz/orgs", "repos_url": "https://api.github.com/users/i3hz/repos", "events_url": "https://api.github.com/users/i3hz/events{/privacy}", "received_events_url": "https://api.github.com/users/i3hz/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41574/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41574/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41573
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41573/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41573/comments
https://api.github.com/repos/huggingface/transformers/issues/41573/events
https://github.com/huggingface/transformers/pull/41573
3,513,472,068
PR_kwDOCUB6oc6tpLZy
41,573
Update issue template
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T11:20:33
2025-10-15T11:54:40
2025-10-15T11:54:38
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41573", "html_url": "https://github.com/huggingface/transformers/pull/41573", "diff_url": "https://github.com/huggingface/transformers/pull/41573.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41573.patch", "merged_at": "2025-10-15T11:54:37" }
# What does this PR do? This PR updates the issue template.
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41573/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41573/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41572
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41572/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41572/comments
https://api.github.com/repos/huggingface/transformers/issues/41572/events
https://github.com/huggingface/transformers/pull/41572
3,513,416,655
PR_kwDOCUB6oc6to_V_
41,572
Gemma3 fixes
{ "login": "remi-or", "id": 83456801, "node_id": "MDQ6VXNlcjgzNDU2ODAx", "avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remi-or", "html_url": "https://github.com/remi-or", "followers_url": "https://api.github.com/users/remi-or/followers", "following_url": "https://api.github.com/users/remi-or/following{/other_user}", "gists_url": "https://api.github.com/users/remi-or/gists{/gist_id}", "starred_url": "https://api.github.com/users/remi-or/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/remi-or/subscriptions", "organizations_url": "https://api.github.com/users/remi-or/orgs", "repos_url": "https://api.github.com/users/remi-or/repos", "events_url": "https://api.github.com/users/remi-or/events{/privacy}", "received_events_url": "https://api.github.com/users/remi-or/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T11:02:36
2025-10-14T16:33:27
2025-10-14T16:33:27
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41572", "html_url": "https://github.com/huggingface/transformers/pull/41572", "diff_url": "https://github.com/huggingface/transformers/pull/41572.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41572.patch", "merged_at": "2025-10-14T16:33:27" }
This PR fixes three things in `gemma3`:
- a multi-device error where `torch.where` took some of its coefficients from a tensor that was not on the right device; since that tensor was a `full_like`, we simply replace it with the filling element
- an error in `flash_attn_inference_equivalence`, caused by the model needing more parameters than are generated by default. To avoid this, we add a flag that specifies whether the forward pass should also be checked in training mode, and make this check the default for both, as well as for left padding (cc @vasqu )
- the test `flash_attn_from_config` was failing for the same reason (`token_type_ids` is a required model input when training), so I added a `.eval()` to avoid this. The model does not seem to need to be in train mode for this test, but I can also add an option to the test to only call `.eval()` if a flag is passed
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41572/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41572/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41571
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41571/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41571/comments
https://api.github.com/repos/huggingface/transformers/issues/41571/events
https://github.com/huggingface/transformers/pull/41571
3,513,384,860
PR_kwDOCUB6oc6to4SB
41,571
Fix an import error with PreTrainModel
{ "login": "remi-or", "id": 83456801, "node_id": "MDQ6VXNlcjgzNDU2ODAx", "avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remi-or", "html_url": "https://github.com/remi-or", "followers_url": "https://api.github.com/users/remi-or/followers", "following_url": "https://api.github.com/users/remi-or/following{/other_user}", "gists_url": "https://api.github.com/users/remi-or/gists{/gist_id}", "starred_url": "https://api.github.com/users/remi-or/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/remi-or/subscriptions", "organizations_url": "https://api.github.com/users/remi-or/orgs", "repos_url": "https://api.github.com/users/remi-or/repos", "events_url": "https://api.github.com/users/remi-or/events{/privacy}", "received_events_url": "https://api.github.com/users/remi-or/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T10:52:50
2025-10-14T11:13:37
2025-10-14T11:13:37
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41571", "html_url": "https://github.com/huggingface/transformers/pull/41571", "diff_url": "https://github.com/huggingface/transformers/pull/41571.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41571.patch", "merged_at": "2025-10-14T11:13:37" }
This tiny PR fixes a bug introduced in #41445 which led to an import error. For instance, without this fix, `tests/models/gemma3/test_modeling_gemma3.py::Gemma3Vision2TextModelTest::test_flash_attn_2_fp32_ln` fails with
```
NameError: name 'PreTrainedModel' is not defined
```
but passes after this fix.
{ "login": "remi-or", "id": 83456801, "node_id": "MDQ6VXNlcjgzNDU2ODAx", "avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remi-or", "html_url": "https://github.com/remi-or", "followers_url": "https://api.github.com/users/remi-or/followers", "following_url": "https://api.github.com/users/remi-or/following{/other_user}", "gists_url": "https://api.github.com/users/remi-or/gists{/gist_id}", "starred_url": "https://api.github.com/users/remi-or/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/remi-or/subscriptions", "organizations_url": "https://api.github.com/users/remi-or/orgs", "repos_url": "https://api.github.com/users/remi-or/repos", "events_url": "https://api.github.com/users/remi-or/events{/privacy}", "received_events_url": "https://api.github.com/users/remi-or/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41571/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41571/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41570
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41570/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41570/comments
https://api.github.com/repos/huggingface/transformers/issues/41570/events
https://github.com/huggingface/transformers/issues/41570
3,513,383,746
I_kwDOCUB6oc7RaftC
41,570
Uneven GPU memory usage and CUDA OOM during multi-GPU inference with Qwen3-VL-30B-A3B-Thinking
{ "login": "Xqle", "id": 87457840, "node_id": "MDQ6VXNlcjg3NDU3ODQw", "avatar_url": "https://avatars.githubusercontent.com/u/87457840?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Xqle", "html_url": "https://github.com/Xqle", "followers_url": "https://api.github.com/users/Xqle/followers", "following_url": "https://api.github.com/users/Xqle/following{/other_user}", "gists_url": "https://api.github.com/users/Xqle/gists{/gist_id}", "starred_url": "https://api.github.com/users/Xqle/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Xqle/subscriptions", "organizations_url": "https://api.github.com/users/Xqle/orgs", "repos_url": "https://api.github.com/users/Xqle/repos", "events_url": "https://api.github.com/users/Xqle/events{/privacy}", "received_events_url": "https://api.github.com/users/Xqle/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-14T10:52:25
2025-10-16T10:40:57
null
CONTRIBUTOR
null
null
null
null
When performing inference on a ~3-minute video using the Qwen3-VL-30B-A3B-Thinking model from transformers on 8×3090 (24 GB) GPUs with `fps=1`, a CUDA out-of-memory (OOM) error occurs.
```
Traceback (most recent call last):
  File "/data01/xuqile/code/Inference/transformers/infer_qwen3vl_transformers.py", line 94, in <module>
    main()
  File "/data01/xuqile/code/Inference/transformers/infer_qwen3vl_transformers.py", line 78, in main
    generated_ids = model.generate(**inputs, max_new_tokens=args.max_tokens)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 120, in decorate_context
    return func(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/transformers/generation/utils.py", line 2564, in generate
    result = decoding_method(
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/transformers/generation/utils.py", line 2784, in _sample
    outputs = self(**model_inputs, return_dict=True)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/accelerate/hooks.py", line 175, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/transformers/utils/generic.py", line 1064, in wrapper
    outputs = func(self, *args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/transformers/models/qwen3_vl_moe/modeling_qwen3_vl_moe.py", line 1601, in forward
    outputs = self.model(
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/transformers/utils/generic.py", line 1064, in wrapper
    outputs = func(self, *args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/transformers/models/qwen3_vl_moe/modeling_qwen3_vl_moe.py", line 1389, in forward
    outputs = self.language_model(
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/transformers/utils/generic.py", line 1064, in wrapper
    outputs = func(self, *args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/transformers/models/qwen3_vl_moe/modeling_qwen3_vl_moe.py", line 962, in forward
    layer_outputs = decoder_layer(
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/transformers/modeling_layers.py", line 94, in __call__
    return super().__call__(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/accelerate/hooks.py", line 175, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
    return func(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/transformers/models/qwen3_vl_moe/modeling_qwen3_vl_moe.py", line 391, in forward
    hidden_states = self.mlp(hidden_states)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/accelerate/hooks.py", line 175, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/transformers/models/qwen3_vl_moe/modeling_qwen3_vl_moe.py", line 151, in forward
    routed_out = self.experts(hidden_states, router_weights, router_indices)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/accelerate/hooks.py", line 175, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "/data01/xuqile/miniforge3/envs/qwen3vl/lib/python3.10/site-packages/transformers/models/qwen3_vl_moe/modeling_qwen3_vl_moe.py", line 120, in forward
    next_states = torch.bmm((up * self.act_fn(gate)), self.down_proj)
torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 5.06 GiB. GPU 0 has a total capacity of 23.68 GiB of which 4.40 GiB is free. Including non-PyTorch memory, this process has 19.27 GiB memory in use. Of the allocated memory 18.67 GiB is allocated by PyTorch, and 304.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
```
However, GPU monitoring shows that only one GPU has extremely high memory usage, while the others remain underutilized (less than 50% memory used).

<img width="560" height="560" alt="Image" src="https://github.com/user-attachments/assets/9c810b25-fa92-47af-a008-7f9ae79d7ac5" />

After investigation, the memory surge appears to happen inside `Qwen3VLMoeTextExperts.forward`:

<img width="640" height="302" alt="Image" src="https://github.com/user-attachments/assets/9d4cda70-4e0a-4907-8f39-368b2ae366ea" />

Is there a known workaround for this, or will future releases include optimization or DTensor support for the MoE expert modules to improve memory distribution during multi-GPU inference?
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41570/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41570/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41569
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41569/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41569/comments
https://api.github.com/repos/huggingface/transformers/issues/41569/events
https://github.com/huggingface/transformers/pull/41569
3,513,372,038
PR_kwDOCUB6oc6to1bl
41,569
Add __iter__ to DynamicCache
{ "login": "remi-or", "id": 83456801, "node_id": "MDQ6VXNlcjgzNDU2ODAx", "avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remi-or", "html_url": "https://github.com/remi-or", "followers_url": "https://api.github.com/users/remi-or/followers", "following_url": "https://api.github.com/users/remi-or/following{/other_user}", "gists_url": "https://api.github.com/users/remi-or/gists{/gist_id}", "starred_url": "https://api.github.com/users/remi-or/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/remi-or/subscriptions", "organizations_url": "https://api.github.com/users/remi-or/orgs", "repos_url": "https://api.github.com/users/remi-or/repos", "events_url": "https://api.github.com/users/remi-or/events{/privacy}", "received_events_url": "https://api.github.com/users/remi-or/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T10:48:50
2025-10-15T13:19:38
2025-10-14T14:16:32
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41569", "html_url": "https://github.com/huggingface/transformers/pull/41569", "diff_url": "https://github.com/huggingface/transformers/pull/41569.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41569.patch", "merged_at": "2025-10-14T14:16:32" }
Currently, DDP is broken when there is a `DynamicCache`, because it has no `__iter__` method and so it cannot be concatenated after the distributed forward. This PR adds back an `__iter__` method and adapts the way DDP data is consumed to properly initialize sliding windows.
{ "login": "remi-or", "id": 83456801, "node_id": "MDQ6VXNlcjgzNDU2ODAx", "avatar_url": "https://avatars.githubusercontent.com/u/83456801?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remi-or", "html_url": "https://github.com/remi-or", "followers_url": "https://api.github.com/users/remi-or/followers", "following_url": "https://api.github.com/users/remi-or/following{/other_user}", "gists_url": "https://api.github.com/users/remi-or/gists{/gist_id}", "starred_url": "https://api.github.com/users/remi-or/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/remi-or/subscriptions", "organizations_url": "https://api.github.com/users/remi-or/orgs", "repos_url": "https://api.github.com/users/remi-or/repos", "events_url": "https://api.github.com/users/remi-or/events{/privacy}", "received_events_url": "https://api.github.com/users/remi-or/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41569/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41569/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41568
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41568/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41568/comments
https://api.github.com/repos/huggingface/transformers/issues/41568/events
https://github.com/huggingface/transformers/pull/41568
3,513,099,388
PR_kwDOCUB6oc6tn6pe
41,568
Use `yaml.safe_load`
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T09:23:51
2025-10-27T12:45:10
2025-10-27T12:45:10
COLLABORATOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41568", "html_url": "https://github.com/huggingface/transformers/pull/41568", "diff_url": "https://github.com/huggingface/transformers/pull/41568.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41568.patch", "merged_at": null }
# What does this PR do? Use `yaml.safe_load` instead of `yaml.load` to avoid a potential security issue and to close the security report we received. https://pyyaml.org/wiki/PyYAMLDocumentation > Warning: It is not safe to call yaml.load with any data received from an untrusted source! yaml.load is as powerful as pickle.load and so may call any Python function. Check the yaml.safe_load function though.
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41568/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41568/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41567
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41567/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41567/comments
https://api.github.com/repos/huggingface/transformers/issues/41567/events
https://github.com/huggingface/transformers/pull/41567
3,512,764,418
PR_kwDOCUB6oc6tmyq3
41,567
[WIP] Fully deprecate AutoGPTQ for GPT-QModel
{ "login": "Qubitium", "id": 417764, "node_id": "MDQ6VXNlcjQxNzc2NA==", "avatar_url": "https://avatars.githubusercontent.com/u/417764?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Qubitium", "html_url": "https://github.com/Qubitium", "followers_url": "https://api.github.com/users/Qubitium/followers", "following_url": "https://api.github.com/users/Qubitium/following{/other_user}", "gists_url": "https://api.github.com/users/Qubitium/gists{/gist_id}", "starred_url": "https://api.github.com/users/Qubitium/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Qubitium/subscriptions", "organizations_url": "https://api.github.com/users/Qubitium/orgs", "repos_url": "https://api.github.com/users/Qubitium/repos", "events_url": "https://api.github.com/users/Qubitium/events{/privacy}", "received_events_url": "https://api.github.com/users/Qubitium/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-14T07:46:11
2025-10-15T12:57:21
null
CONTRIBUTOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41567", "html_url": "https://github.com/huggingface/transformers/pull/41567", "diff_url": "https://github.com/huggingface/transformers/pull/41567.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41567.patch", "merged_at": null }
Remove AutoGPTQ clutter and AutoGPTQ-related configs that are not worth adding backward compat for. GPTQModel has a slight project name change to GPT-QModel, with a `-` (the PyPI package and import name stay the same), as we have now added `awq`/AutoAWQ into our repo and will be making a PR soon to address AWQ loading using GPT-QModel.

`GPTQConfig` has the most important changes in this PR:

```python
# New GPTQConfig property. Applicable for sister Peft/Optimum PRs
act_group_aware (`bool`, *optional*, defaults to `True`):
    Use GAR (group aware activation order) during quantization. Has a measurable
    positive impact on quantization quality. Only applicable when `desc_act = False`.
    Will be forced to `False` when `desc_act = True`.

# Removed GPTQConfig properties:
use_cuda_fp16
use_exllama
exllama_config
```

The 3 removed properties are all related to `kernel` selection. They are a hot-potato mess and a legacy from AutoGPTQ. GPT-QModel uses the unified, existing `backend` property to select kernels. Compat code was written in 2024 to `convert` these 3 properties to `backend` behind the scenes, but it is no longer relevant in 2025.

Note:
* Transformers/Optimum/Peft CI tests should never check for `kernel.QUANT_TYPE` (str). GPT-QModel will return the best-performing kernel for the relevant module, and it may differ per module due to in/out features and other gptq/module properties in relation to device type + dtype + many other factors.
* CI tests should only assert `kernel.QUANT_TYPE` if the test specifies a specific kernel via `backend` selection.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41567/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41567/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41566
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41566/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41566/comments
https://api.github.com/repos/huggingface/transformers/issues/41566/events
https://github.com/huggingface/transformers/issues/41566
3,512,546,971
I_kwDOCUB6oc7RXTab
41,566
Performance regression due to Bidirectional masks
{ "login": "jiqing-feng", "id": 107918818, "node_id": "U_kgDOBm614g", "avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jiqing-feng", "html_url": "https://github.com/jiqing-feng", "followers_url": "https://api.github.com/users/jiqing-feng/followers", "following_url": "https://api.github.com/users/jiqing-feng/following{/other_user}", "gists_url": "https://api.github.com/users/jiqing-feng/gists{/gist_id}", "starred_url": "https://api.github.com/users/jiqing-feng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jiqing-feng/subscriptions", "organizations_url": "https://api.github.com/users/jiqing-feng/orgs", "repos_url": "https://api.github.com/users/jiqing-feng/repos", "events_url": "https://api.github.com/users/jiqing-feng/events{/privacy}", "received_events_url": "https://api.github.com/users/jiqing-feng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-10-14T06:29:37
2025-10-15T13:34:41
2025-10-15T13:34:41
CONTRIBUTOR
null
null
null
null
### System Info

torch 2.10.0.dev20251008+cpu

### Who can help?

@vasqu @zucchini-nlp

### Information

- [ ] The official example scripts
- [ ] My own modified scripts

### Tasks

- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)

### Reproduction

`numactl -C 0-7 --membind 0 python test_bert.py`

```python
import torch
import time
from transformers import pipeline

model_id = "sentence-transformers/all-MiniLM-L6-v2"
sentences = ["This is an example sentence", "Each sentence is converted"]
extractor = pipeline("feature-extraction", model=model_id, dtype=torch.float16)
encoded_input = extractor.tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

# warm up
for _ in range(10):
    model_output = extractor(sentences, return_tensors=True, batch_size=2)

for _ in range(10):
    start = time.time()
    model_output = extractor(sentences, return_tensors=True, batch_size=2)
    end = time.time()
    print(f"latency: {(end-start)*1000} ms")
```

### Expected behavior

The performance has a 40% regression after PR [41265](https://github.com/huggingface/transformers/pull/41265).
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/followers", "following_url": "https://api.github.com/users/vasqu/following{/other_user}", "gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}", "starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vasqu/subscriptions", "organizations_url": "https://api.github.com/users/vasqu/orgs", "repos_url": "https://api.github.com/users/vasqu/repos", "events_url": "https://api.github.com/users/vasqu/events{/privacy}", "received_events_url": "https://api.github.com/users/vasqu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41566/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41566/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/41565
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41565/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41565/comments
https://api.github.com/repos/huggingface/transformers/issues/41565/events
https://github.com/huggingface/transformers/pull/41565
3,512,541,791
PR_kwDOCUB6oc6tmDH0
41,565
🌐 [i18n-KO] Updated `perf_train_gpu_many.md`
{ "login": "D15M4S", "id": 122260287, "node_id": "U_kgDOB0mLPw", "avatar_url": "https://avatars.githubusercontent.com/u/122260287?v=4", "gravatar_id": "", "url": "https://api.github.com/users/D15M4S", "html_url": "https://github.com/D15M4S", "followers_url": "https://api.github.com/users/D15M4S/followers", "following_url": "https://api.github.com/users/D15M4S/following{/other_user}", "gists_url": "https://api.github.com/users/D15M4S/gists{/gist_id}", "starred_url": "https://api.github.com/users/D15M4S/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/D15M4S/subscriptions", "organizations_url": "https://api.github.com/users/D15M4S/orgs", "repos_url": "https://api.github.com/users/D15M4S/repos", "events_url": "https://api.github.com/users/D15M4S/events{/privacy}", "received_events_url": "https://api.github.com/users/D15M4S/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-14T06:27:30
2025-10-20T01:01:18
null
CONTRIBUTOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41565", "html_url": "https://github.com/huggingface/transformers/pull/41565", "diff_url": "https://github.com/huggingface/transformers/pull/41565.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41565.patch", "merged_at": null }
<!-- Please title this PR "🌐 [i18n-KO] Updated ko/perf_train_gpu_many.md" --> # What does this PR do? Updated the perf_train_gpu_many.md file in the documentation with a Korean translation. Thank you in advance for your review. Part of #20179 Update of #26244 ## Before reviewing - [x] Check for missing / redundant translations - [x] Grammar check - [x] Review or add new terms to the glossary - [x] Check inline TOC (e.g. `[[lowercased-header]]`) - [x] Check live-preview for gotchas ## Who can review? (Initial) <!-- 1. Please reveal the comment below requesting a review from KREW members only after all of the checks above are complete! --> May you please review this PR? @jungnerd, @luckyvickyricky, @chelsseeey, @skwh54, @amo33, @maximizemaxwell, @harheem, @4N3MONE, @yijun-lee ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? (Final) <!-- 2. Please reveal the comment below after the KREW members' review is finished! --> <!-- @stevhliu May you please review this PR? -->
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41565/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41565/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41564
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41564/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41564/comments
https://api.github.com/repos/huggingface/transformers/issues/41564/events
https://github.com/huggingface/transformers/pull/41564
3,512,509,328
PR_kwDOCUB6oc6tl8Lj
41,564
Add aux loss for GLM-4.5V
{ "login": "zRzRzRzRzRzRzR", "id": 93239683, "node_id": "U_kgDOBY65gw", "avatar_url": "https://avatars.githubusercontent.com/u/93239683?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zRzRzRzRzRzRzR", "html_url": "https://github.com/zRzRzRzRzRzRzR", "followers_url": "https://api.github.com/users/zRzRzRzRzRzRzR/followers", "following_url": "https://api.github.com/users/zRzRzRzRzRzRzR/following{/other_user}", "gists_url": "https://api.github.com/users/zRzRzRzRzRzRzR/gists{/gist_id}", "starred_url": "https://api.github.com/users/zRzRzRzRzRzRzR/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zRzRzRzRzRzRzR/subscriptions", "organizations_url": "https://api.github.com/users/zRzRzRzRzRzRzR/orgs", "repos_url": "https://api.github.com/users/zRzRzRzRzRzRzR/repos", "events_url": "https://api.github.com/users/zRzRzRzRzRzRzR/events{/privacy}", "received_events_url": "https://api.github.com/users/zRzRzRzRzRzRzR/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T06:13:28
2025-10-16T09:04:21
2025-10-16T09:04:21
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41564", "html_url": "https://github.com/huggingface/transformers/pull/41564", "diff_url": "https://github.com/huggingface/transformers/pull/41564.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41564.patch", "merged_at": "2025-10-16T09:04:21" }
# What does this PR do? Add Aux Loss for GLM-4.5V
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41564/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41564/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41563
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41563/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41563/comments
https://api.github.com/repos/huggingface/transformers/issues/41563/events
https://github.com/huggingface/transformers/pull/41563
3,512,454,604
PR_kwDOCUB6oc6tlwgo
41,563
add rmsnorm kernels support for Intel XPU
{ "login": "kaixuanliu", "id": 13268042, "node_id": "MDQ6VXNlcjEzMjY4MDQy", "avatar_url": "https://avatars.githubusercontent.com/u/13268042?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kaixuanliu", "html_url": "https://github.com/kaixuanliu", "followers_url": "https://api.github.com/users/kaixuanliu/followers", "following_url": "https://api.github.com/users/kaixuanliu/following{/other_user}", "gists_url": "https://api.github.com/users/kaixuanliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/kaixuanliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kaixuanliu/subscriptions", "organizations_url": "https://api.github.com/users/kaixuanliu/orgs", "repos_url": "https://api.github.com/users/kaixuanliu/repos", "events_url": "https://api.github.com/users/kaixuanliu/events{/privacy}", "received_events_url": "https://api.github.com/users/kaixuanliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T05:49:13
2025-10-14T13:26:54
2025-10-14T13:26:10
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41563", "html_url": "https://github.com/huggingface/transformers/pull/41563", "diff_url": "https://github.com/huggingface/transformers/pull/41563.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41563.patch", "merged_at": "2025-10-14T13:26:10" }
@MekkCyber @drbh pls help review, thx! We did a benchmark using the following script on Intel XPU PVC Max 1550:
```
import gc
import time

import torch
import transformers
from transformers import AutoConfig, AutoProcessor, GenerationConfig, set_seed, StaticCache

device = torch.device("xpu")
data_type = torch.float16
model_id = "Qwen/Qwen3-4B-Instruct-2507"
batch_size = 8
max_input_length = 512
max_completion_length = 2048
set_seed(42)

prompt = "SUBREDDIT: r/relationships\n\nTITLE: I (f/22) have to figure out if I want to still know these girls or not and would hate to sound insulting\n\nPOST: Not sure if this belongs here but it's worth a try. \n\nBackstory:\nWhen I (f/22) went through my first real breakup 2 years ago because he needed space after a year of dating roand it effected me more than I thought. It was a horrible time in my life due to living with my mother and finally having the chance to cut her out of my life. I can admit because of it was an emotional wreck and this guy was stable and didn't know how to deal with me. We ended by him avoiding for a month or so after going to a festival with my friends. When I think back I wish he just ended. So after he ended it added my depression I suffered but my friends helped me through it and I got rid of everything from him along with cutting contact. \n\nNow: Its been almost 3 years now and I've gotten better after counselling and mild anti depressants. My mother has been out of my life since then so there's been alot of progress. Being stronger after learning some lessons there been more insight about that time of my life but when I see him or a picture everything comes back. The emotions and memories bring me back down. \n\nHis friends (both girls) are on my facebook because we get along well which is hard to find and I know they'll always have his back. But seeing him in a picture or talking to him at a convention having a conversation is tough. Crying confront of my current boyfriend is something I want to avoid. \n\nSo I've been thinking that I have to cut contact with these girls because it's time to move on because it's healthier. It's best to avoid him as well. But will they be insulted? Will they accept it? Is there going to be awkwardness? I'm not sure if it's the right to do and could use some outside opinions.\n\nTL;DR:"

config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoProcessor.from_pretrained(model_id)
architecture = getattr(transformers, config.architectures[0])
transformers_model = architecture.from_pretrained(model_id, use_kernels=True, device_map=device).to(data_type)

generation_kwargs = {
    "max_new_tokens": max_completion_length,
    "do_sample": True,
    "pad_token_id": tokenizer.pad_token_id,
    "bos_token_id": tokenizer.bos_token_id,
    "eos_token_id": tokenizer.eos_token_id,
    "temperature": 1.0,
    "cache_implementation": "static",
}
generation_config = GenerationConfig(**generation_kwargs)

prompts = [prompt] * batch_size
inputs = tokenizer(
    text=prompts, return_tensors="pt", padding=True, padding_side="left", add_special_tokens=False
).to(device)

warmup_steps = 2
run_steps = 5
for _ in range(warmup_steps):
    with torch.no_grad():
        outputs = transformers_model.generate(**inputs, generation_config=generation_config, disable_compile=True)

for _ in range(run_steps):
    torch.xpu.synchronize()
    time_s = time.time()
    with torch.no_grad():
        outputs = transformers_model.generate(**inputs, generation_config=generation_config, disable_compile=True)
    torch.xpu.synchronize()
    time_e = time.time()
    print(f"Transformers: {time_e - time_s:.4f} seconds")

del tokenizer
del transformers_model
gc.collect()
torch.xpu.empty_cache()
```
The result shows the average latency drops from 108s to 77s with this PR.
{ "login": "MekkCyber", "id": 93391238, "node_id": "U_kgDOBZEJhg", "avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MekkCyber", "html_url": "https://github.com/MekkCyber", "followers_url": "https://api.github.com/users/MekkCyber/followers", "following_url": "https://api.github.com/users/MekkCyber/following{/other_user}", "gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}", "starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions", "organizations_url": "https://api.github.com/users/MekkCyber/orgs", "repos_url": "https://api.github.com/users/MekkCyber/repos", "events_url": "https://api.github.com/users/MekkCyber/events{/privacy}", "received_events_url": "https://api.github.com/users/MekkCyber/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41563/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41563/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41562
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41562/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41562/comments
https://api.github.com/repos/huggingface/transformers/issues/41562/events
https://github.com/huggingface/transformers/issues/41562
3,512,421,249
I_kwDOCUB6oc7RW0uB
41,562
NameError: name 'PreTrainedModel' is not defined
{ "login": "poilly54", "id": 227551242, "node_id": "U_kgDODZAoCg", "avatar_url": "https://avatars.githubusercontent.com/u/227551242?v=4", "gravatar_id": "", "url": "https://api.github.com/users/poilly54", "html_url": "https://github.com/poilly54", "followers_url": "https://api.github.com/users/poilly54/followers", "following_url": "https://api.github.com/users/poilly54/following{/other_user}", "gists_url": "https://api.github.com/users/poilly54/gists{/gist_id}", "starred_url": "https://api.github.com/users/poilly54/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/poilly54/subscriptions", "organizations_url": "https://api.github.com/users/poilly54/orgs", "repos_url": "https://api.github.com/users/poilly54/repos", "events_url": "https://api.github.com/users/poilly54/events{/privacy}", "received_events_url": "https://api.github.com/users/poilly54/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-10-14T05:31:23
2025-10-14T14:39:03
2025-10-14T14:39:02
NONE
null
null
null
null
### System Info - ubuntu22.04 docker image. - python3.10 - torch2.7.0-cu128 - transformers-5.0.0.dev0 --- I was able to load the model fine until yesterday, but suddenly it stopped working with the error below. > <img width="1063" height="237" alt="Image" src="https://github.com/user-attachments/assets/659de62d-089c-4a98-b7a5-6c80fd6a5bdf" /> I did not change any code because I use a Docker container. I installed transformers-5.0.0.dev0 using the following command: > `pip install git+https://github.com/huggingface/transformers`
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/followers", "following_url": "https://api.github.com/users/vasqu/following{/other_user}", "gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}", "starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vasqu/subscriptions", "organizations_url": "https://api.github.com/users/vasqu/orgs", "repos_url": "https://api.github.com/users/vasqu/repos", "events_url": "https://api.github.com/users/vasqu/events{/privacy}", "received_events_url": "https://api.github.com/users/vasqu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41562/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41562/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/41561
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41561/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41561/comments
https://api.github.com/repos/huggingface/transformers/issues/41561/events
https://github.com/huggingface/transformers/pull/41561
3,512,322,704
PR_kwDOCUB6oc6tlVL3
41,561
Optimize Mamba2 memory usage by replacing broadcast with einsum
{ "login": "SohamWalam11", "id": 147302133, "node_id": "U_kgDOCMem9Q", "avatar_url": "https://avatars.githubusercontent.com/u/147302133?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SohamWalam11", "html_url": "https://github.com/SohamWalam11", "followers_url": "https://api.github.com/users/SohamWalam11/followers", "following_url": "https://api.github.com/users/SohamWalam11/following{/other_user}", "gists_url": "https://api.github.com/users/SohamWalam11/gists{/gist_id}", "starred_url": "https://api.github.com/users/SohamWalam11/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SohamWalam11/subscriptions", "organizations_url": "https://api.github.com/users/SohamWalam11/orgs", "repos_url": "https://api.github.com/users/SohamWalam11/repos", "events_url": "https://api.github.com/users/SohamWalam11/events{/privacy}", "received_events_url": "https://api.github.com/users/SohamWalam11/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-14T04:35:52
2025-10-20T17:11:38
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41561", "html_url": "https://github.com/huggingface/transformers/pull/41561", "diff_url": "https://github.com/huggingface/transformers/pull/41561.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41561.patch", "merged_at": null }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker @Cyrilvallez - vision models: @yonigozlan @molbap - audio models: @eustlb @ebezzam @vasqu - multimodal models: @zucchini-nlp - graph models: @clefourrier Library: - generate: @zucchini-nlp (visual-language models) or @gante (all others) - continuous batching: @remi-or @ArthurZucker @McPatate - pipelines: @Rocketknight1 - tokenizers: @ArthurZucker and @itazap - trainer: @zach-huggingface @SunMarc - attention: @vasqu @ArthurZucker @CyrilVallez - model loading (from pretrained, etc): @CyrilVallez - distributed: @3outeille @ArthurZucker @S1ro1 - CIs: @ydshieh Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber - kernels: @MekkCyber @drbh - peft: @BenjaminBossan @githubnemo Devices/Backends: - AMD ROCm: @ivarflakstad - Intel XPU: @IlyasMoutawwakil - Ascend NPU: @ivarflakstad Documentation: @stevhliu Research projects are not maintained and should be taken as is. -->
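The optimization named in the PR title can be illustrated with a small NumPy sketch (shapes invented for illustration; the actual Mamba2 tensors differ): a broadcast multiply materializes a full temporary tensor before the reduction, while `einsum` expresses the same contraction directly and can avoid that intermediate.

```python
import numpy as np

B, L, H, N = 2, 16, 4, 8  # hypothetical batch, length, heads, state size
x = np.random.rand(B, L, H, N)
y = np.random.rand(B, L, N)

# Broadcasting materializes a full (B, L, H, N) temporary before reducing:
out_broadcast = (x * y[:, :, None, :]).sum(-1)

# einsum contracts over n directly, producing the same (B, L, H) result:
out_einsum = np.einsum("blhn,bln->blh", x, y)
```

Both paths are numerically equivalent; the difference is only in how much intermediate memory is allocated.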
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41561/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41561/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41560
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41560/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41560/comments
https://api.github.com/repos/huggingface/transformers/issues/41560/events
https://github.com/huggingface/transformers/issues/41560
3,512,322,322
I_kwDOCUB6oc7RWckS
41,560
Trainer._align_special_token can't figure out tokenizer_has_new_eos correctly when model.generation_config.eos_token_id is a list
{ "login": "WayenVan", "id": 32873268, "node_id": "MDQ6VXNlcjMyODczMjY4", "avatar_url": "https://avatars.githubusercontent.com/u/32873268?v=4", "gravatar_id": "", "url": "https://api.github.com/users/WayenVan", "html_url": "https://github.com/WayenVan", "followers_url": "https://api.github.com/users/WayenVan/followers", "following_url": "https://api.github.com/users/WayenVan/following{/other_user}", "gists_url": "https://api.github.com/users/WayenVan/gists{/gist_id}", "starred_url": "https://api.github.com/users/WayenVan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/WayenVan/subscriptions", "organizations_url": "https://api.github.com/users/WayenVan/orgs", "repos_url": "https://api.github.com/users/WayenVan/repos", "events_url": "https://api.github.com/users/WayenVan/events{/privacy}", "received_events_url": "https://api.github.com/users/WayenVan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-10-14T04:35:38
2025-10-14T04:42:48
2025-10-14T04:42:48
NONE
null
null
null
null
### System Info Referring to the code that aligns the special tokens: ``` tokenizer_has_new_eos = tokenizer.eos_token_ids != self.model.generation_config.eos_token_id ``` Here, if the model's generation_config holds a list of eos token ids, this inequality compares an int against a list instead of testing membership, so the tokenizer's eos_token_id is treated as new even when it is already present, and a duplicate eos_token_id is introduced into the target generation_config. ### Who can help? _No response_ ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction None ### Expected behavior If the tokenizer's eos_token_id is already in the model's generation_config, tokenizer_has_new_eos should not report a new token, and no duplicate id should be added.
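The comparison described above can be sketched in plain Python (values hypothetical, not taken from the Trainer source): a plain inequality between an int and a list never tests membership, whereas a membership-aware check does.

```python
# Hypothetical values standing in for tokenizer.eos_token_id and
# model.generation_config.eos_token_id (which may be a list).
tokenizer_eos = 2
model_eos = [2, 32000]

# A plain inequality compares an int against a list, so it reports a "new"
# eos token even though the id is already in the list:
naive_has_new_eos = tokenizer_eos != model_eos

# A membership-aware check avoids appending a duplicate id:
eos_ids = model_eos if isinstance(model_eos, list) else [model_eos]
robust_has_new_eos = tokenizer_eos not in eos_ids
```

Here `naive_has_new_eos` is `True` even though `2` is already present, while `robust_has_new_eos` is `False`.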
{ "login": "WayenVan", "id": 32873268, "node_id": "MDQ6VXNlcjMyODczMjY4", "avatar_url": "https://avatars.githubusercontent.com/u/32873268?v=4", "gravatar_id": "", "url": "https://api.github.com/users/WayenVan", "html_url": "https://github.com/WayenVan", "followers_url": "https://api.github.com/users/WayenVan/followers", "following_url": "https://api.github.com/users/WayenVan/following{/other_user}", "gists_url": "https://api.github.com/users/WayenVan/gists{/gist_id}", "starred_url": "https://api.github.com/users/WayenVan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/WayenVan/subscriptions", "organizations_url": "https://api.github.com/users/WayenVan/orgs", "repos_url": "https://api.github.com/users/WayenVan/repos", "events_url": "https://api.github.com/users/WayenVan/events{/privacy}", "received_events_url": "https://api.github.com/users/WayenVan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41560/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41560/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/41559
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41559/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41559/comments
https://api.github.com/repos/huggingface/transformers/issues/41559/events
https://github.com/huggingface/transformers/pull/41559
3,512,221,626
PR_kwDOCUB6oc6tk_vT
41,559
Fix executorch export with dynamic shapes
{ "login": "justinchuby", "id": 11205048, "node_id": "MDQ6VXNlcjExMjA1MDQ4", "avatar_url": "https://avatars.githubusercontent.com/u/11205048?v=4", "gravatar_id": "", "url": "https://api.github.com/users/justinchuby", "html_url": "https://github.com/justinchuby", "followers_url": "https://api.github.com/users/justinchuby/followers", "following_url": "https://api.github.com/users/justinchuby/following{/other_user}", "gists_url": "https://api.github.com/users/justinchuby/gists{/gist_id}", "starred_url": "https://api.github.com/users/justinchuby/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/justinchuby/subscriptions", "organizations_url": "https://api.github.com/users/justinchuby/orgs", "repos_url": "https://api.github.com/users/justinchuby/repos", "events_url": "https://api.github.com/users/justinchuby/events{/privacy}", "received_events_url": "https://api.github.com/users/justinchuby/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-14T03:33:50
2025-10-28T21:26:37
null
CONTRIBUTOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41559", "html_url": "https://github.com/huggingface/transformers/pull/41559", "diff_url": "https://github.com/huggingface/transformers/pull/41559.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41559.patch", "merged_at": null }
# What does this PR do? This PR adds a shape compatibility check for the attention mask so that torch.export can reason about the logic without failing _when exporting with dynamic shapes_. It additionally simplifies the SDPA forward function for export-only usage. ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? @CyrilVallez @jackzhxng @guangy10
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41559/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41559/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41558
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41558/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41558/comments
https://api.github.com/repos/huggingface/transformers/issues/41558/events
https://github.com/huggingface/transformers/pull/41558
3,512,172,701
PR_kwDOCUB6oc6tk1yV
41,558
fix some case failures led by "`torch.compile` recompiled part of th…
{ "login": "sywangyi", "id": 36058628, "node_id": "MDQ6VXNlcjM2MDU4NjI4", "avatar_url": "https://avatars.githubusercontent.com/u/36058628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sywangyi", "html_url": "https://github.com/sywangyi", "followers_url": "https://api.github.com/users/sywangyi/followers", "following_url": "https://api.github.com/users/sywangyi/following{/other_user}", "gists_url": "https://api.github.com/users/sywangyi/gists{/gist_id}", "starred_url": "https://api.github.com/users/sywangyi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sywangyi/subscriptions", "organizations_url": "https://api.github.com/users/sywangyi/orgs", "repos_url": "https://api.github.com/users/sywangyi/repos", "events_url": "https://api.github.com/users/sywangyi/events{/privacy}", "received_events_url": "https://api.github.com/users/sywangyi/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T03:02:06
2025-10-15T10:49:22
2025-10-15T10:45:29
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41558", "html_url": "https://github.com/huggingface/transformers/pull/41558", "diff_url": "https://github.com/huggingface/transformers/pull/41558.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41558.patch", "merged_at": "2025-10-15T10:45:29" }
…e forward pass" in xpu # What does this PR do? Fix test failures. Cases include: tests/models/chameleon/test_modeling_chameleon.py::ChameleonVision2SeqModelTest::test_generate_compile_model_forward_fullgraph tests/models/dia/test_modeling_dia.py::DiaModelTest::test_generate_compile_model_forward_fullgraph tests/models/emu3/test_modeling_emu3.py::Emu3Text2TextModelTest::test_generate_compile_model_forward_fullgraph tests/models/emu3/test_modeling_emu3.py::Emu3Vision2TextModelTest::test_generate_compile_model_forward_fullgraph tests/models/glm4v/test_modeling_glm4v.py::Glm4vModelTest::test_generate_compile_model_forward_fullgraph tests/models/got_ocr2/test_modeling_got_ocr2.py::GotOcr2ModelTest::test_generate_compile_model_forward_fullgraph tests/models/gpt2/test_modeling_gpt2.py::GPT2ModelTest::test_generate_compile_model_forward_fullgraph tests/models/internvl/test_modeling_internvl.py::InternVLModelTest::test_generate_compile_model_forward_fullgraph tests/models/llava/test_modeling_llava.py::LlavaForConditionalGenerationModelTest::test_generate_compile_model_forward_fullgraph tests/models/mistral3/test_modeling_mistral3.py::Mistral3ModelTest::test_generate_compile_model_forward_fullgraph tests/models/ovis2/test_modeling_ovis2.py::Ovis2ModelTest::test_generate_compile_model_forward_fullgraph tests/models/perception_lm/test_modeling_perception_lm.py::PerceptionLMForConditionalGenerationModelTest::test_generate_compile_model_forward_fullgraph tests/models/qwen2_5_vl/test_modeling_qwen2_5_vl.py::Qwen2_5_VLModelTest::test_generate_compile_model_forward_fullgraph tests/models/vipllava/test_modeling_vipllava.py::VipLlavaForConditionalGenerationModelTest::test_generate_compile_model_forward_fullgraph tests/models/whisper/test_modeling_whisper.py::WhisperStandaloneDecoderModelTest::test_generate_compile_model_forward_fullgraph @IlyasMoutawwakil
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/followers", "following_url": "https://api.github.com/users/vasqu/following{/other_user}", "gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}", "starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vasqu/subscriptions", "organizations_url": "https://api.github.com/users/vasqu/orgs", "repos_url": "https://api.github.com/users/vasqu/repos", "events_url": "https://api.github.com/users/vasqu/events{/privacy}", "received_events_url": "https://api.github.com/users/vasqu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41558/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41558/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41557
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41557/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41557/comments
https://api.github.com/repos/huggingface/transformers/issues/41557/events
https://github.com/huggingface/transformers/pull/41557
3,512,126,327
PR_kwDOCUB6oc6tksR0
41,557
Update modeling_llama4.py
{ "login": "alfredo-etched", "id": 204419130, "node_id": "U_kgDODC8wOg", "avatar_url": "https://avatars.githubusercontent.com/u/204419130?v=4", "gravatar_id": "", "url": "https://api.github.com/users/alfredo-etched", "html_url": "https://github.com/alfredo-etched", "followers_url": "https://api.github.com/users/alfredo-etched/followers", "following_url": "https://api.github.com/users/alfredo-etched/following{/other_user}", "gists_url": "https://api.github.com/users/alfredo-etched/gists{/gist_id}", "starred_url": "https://api.github.com/users/alfredo-etched/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/alfredo-etched/subscriptions", "organizations_url": "https://api.github.com/users/alfredo-etched/orgs", "repos_url": "https://api.github.com/users/alfredo-etched/repos", "events_url": "https://api.github.com/users/alfredo-etched/events{/privacy}", "received_events_url": "https://api.github.com/users/alfredo-etched/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-14T02:32:15
2025-10-20T14:18:09
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41557", "html_url": "https://github.com/huggingface/transformers/pull/41557", "diff_url": "https://github.com/huggingface/transformers/pull/41557.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41557.patch", "merged_at": null }
Something is off. The code names the function Llama4TextL2Norm, but it computes RMS norm. Which is correct for Llama 4: RMS or L2? Either way, the function should be renamed (if RMS) or modified (if L2) accordingly. # What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. 
Feel free to tag members/contributors who may be interested in your PR. <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker @Cyrilvallez - vision models: @yonigozlan @molbap - audio models: @eustlb @ebezzam @vasqu - multimodal models: @zucchini-nlp - graph models: @clefourrier Library: - generate: @zucchini-nlp (visual-language models) or @gante (all others) - continuous batching: @remi-or @ArthurZucker @McPatate - pipelines: @Rocketknight1 - tokenizers: @ArthurZucker and @itazap - trainer: @zach-huggingface @SunMarc - attention: @vasqu @ArthurZucker @CyrilVallez - model loading (from pretrained, etc): @CyrilVallez - distributed: @3outeille @ArthurZucker @S1ro1 - CIs: @ydshieh Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber - kernels: @MekkCyber @drbh - peft: @BenjaminBossan @githubnemo Devices/Backends: - AMD ROCm: @ivarflakstad - Intel XPU: @IlyasMoutawwakil - Ascend NPU: @ivarflakstad Documentation: @stevhliu Research projects are not maintained and should be taken as is. -->
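For reference, the two candidate operations differ only by a factor of sqrt(d). A minimal pure-Python sketch (no learned weight; function names here are illustrative, not the transformers API) of what each computes:

```python
import math

def l2_normalize(x, eps=1e-6):
    """Plain L2 normalization: divide by the vector's Euclidean norm."""
    norm = math.sqrt(sum(v * v for v in x)) + eps
    return [v / norm for v in x]

def rms_norm(x, eps=1e-6):
    """RMS norm: divide by the root of the *mean* square of the elements."""
    rms = math.sqrt(sum(v * v for v in x) / len(x) + eps)
    return [v / rms for v in x]

# The two differ by a constant factor of sqrt(len(x)):
# rms_norm(x)[i] == l2_normalize(x)[i] * sqrt(len(x))  (up to eps)
```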
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41557/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41557/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41556
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41556/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41556/comments
https://api.github.com/repos/huggingface/transformers/issues/41556/events
https://github.com/huggingface/transformers/pull/41556
3,512,122,051
PR_kwDOCUB6oc6tkrXP
41,556
fix check inputs for text2text pipeline
{ "login": "jiqing-feng", "id": 107918818, "node_id": "U_kgDOBm614g", "avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jiqing-feng", "html_url": "https://github.com/jiqing-feng", "followers_url": "https://api.github.com/users/jiqing-feng/followers", "following_url": "https://api.github.com/users/jiqing-feng/following{/other_user}", "gists_url": "https://api.github.com/users/jiqing-feng/gists{/gist_id}", "starred_url": "https://api.github.com/users/jiqing-feng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jiqing-feng/subscriptions", "organizations_url": "https://api.github.com/users/jiqing-feng/orgs", "repos_url": "https://api.github.com/users/jiqing-feng/repos", "events_url": "https://api.github.com/users/jiqing-feng/events{/privacy}", "received_events_url": "https://api.github.com/users/jiqing-feng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T02:30:44
2025-10-16T11:42:41
2025-10-16T11:42:41
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41556", "html_url": "https://github.com/huggingface/transformers/pull/41556", "diff_url": "https://github.com/huggingface/transformers/pull/41556.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41556.patch", "merged_at": "2025-10-16T11:42:41" }
The pipeline should check generation_config before checking inputs. ```python import torch from transformers import pipeline, AutoTokenizer model_id = "sshleifer/distilbart-cnn-12-6" tokenizer = AutoTokenizer.from_pretrained(model_id) generator = pipeline( "summarization", model=model_id, tokenizer=tokenizer, dtype=torch.float16, ) generation_config = generator.model.generation_config generation_config.max_new_tokens = 4 generation_config.min_new_tokens = 4 input_sentence = "This is an example inputs for summarization" output = generator( input_sentence, generation_config=generation_config ) ``` output: ``` Traceback (most recent call last): File "/home/jiqing/HuggingFace/tests/workloads/test_summa.py", line 16, in <module> output = generator( ^^^^^^^^^^ File "/home/jiqing/transformers/src/transformers/pipelines/text2text_generation.py", line 285, in __call__ return super().__call__(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jiqing/transformers/src/transformers/pipelines/text2text_generation.py", line 182, in __call__ result = super().__call__(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jiqing/transformers/src/transformers/pipelines/base.py", line 1289, in __call__ return self.run_single(inputs, preprocess_params, forward_params, postprocess_params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jiqing/transformers/src/transformers/pipelines/base.py", line 1296, in run_single model_outputs = self.forward(model_inputs, **forward_params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jiqing/transformers/src/transformers/pipelines/base.py", line 1200, in forward model_outputs = self._forward(model_inputs, **forward_params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jiqing/transformers/src/transformers/pipelines/text2text_generation.py", line 198, in _forward self.check_inputs( File "/home/jiqing/transformers/src/transformers/pipelines/text2text_generation.py", 
line 291, in check_inputs if max_new_tokens < min_length: ^^^^^^^^^^^^^^^^^^^^^^^^^^^ TypeError: '<' not supported between instances of 'NoneType' and 'int' ``` This PR fixes it by reading `max_new_tokens` from the generation config before the comparison.
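The guard can be sketched as follows (hypothetical helper names; the actual fix lives in `check_inputs` in `text2text_generation.py`):

```python
def resolve_max_new_tokens(generate_kwargs, generation_config=None):
    """Prefer an explicitly passed max_new_tokens, falling back to the
    generation config so a config-only setting is still honored."""
    max_new_tokens = generate_kwargs.get("max_new_tokens")
    if max_new_tokens is None and generation_config is not None:
        max_new_tokens = getattr(generation_config, "max_new_tokens", None)
    return max_new_tokens

def check_lengths(max_new_tokens, min_length):
    """Only compare when a value is actually known, avoiding the
    NoneType-vs-int TypeError from the traceback above."""
    if max_new_tokens is not None and max_new_tokens < min_length:
        raise ValueError(
            f"max_new_tokens ({max_new_tokens}) must be >= min_length ({min_length})"
        )
```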
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41556/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41556/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41555
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41555/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41555/comments
https://api.github.com/repos/huggingface/transformers/issues/41555/events
https://github.com/huggingface/transformers/pull/41555
3,511,975,467
PR_kwDOCUB6oc6tkOF2
41,555
Ring attn
{ "login": "wanyaworld", "id": 18171501, "node_id": "MDQ6VXNlcjE4MTcxNTAx", "avatar_url": "https://avatars.githubusercontent.com/u/18171501?v=4", "gravatar_id": "", "url": "https://api.github.com/users/wanyaworld", "html_url": "https://github.com/wanyaworld", "followers_url": "https://api.github.com/users/wanyaworld/followers", "following_url": "https://api.github.com/users/wanyaworld/following{/other_user}", "gists_url": "https://api.github.com/users/wanyaworld/gists{/gist_id}", "starred_url": "https://api.github.com/users/wanyaworld/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wanyaworld/subscriptions", "organizations_url": "https://api.github.com/users/wanyaworld/orgs", "repos_url": "https://api.github.com/users/wanyaworld/repos", "events_url": "https://api.github.com/users/wanyaworld/events{/privacy}", "received_events_url": "https://api.github.com/users/wanyaworld/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-14T01:07:10
2025-10-14T01:08:10
2025-10-14T01:08:10
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41555", "html_url": "https://github.com/huggingface/transformers/pull/41555", "diff_url": "https://github.com/huggingface/transformers/pull/41555.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41555.patch", "merged_at": null }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
{ "login": "wanyaworld", "id": 18171501, "node_id": "MDQ6VXNlcjE4MTcxNTAx", "avatar_url": "https://avatars.githubusercontent.com/u/18171501?v=4", "gravatar_id": "", "url": "https://api.github.com/users/wanyaworld", "html_url": "https://github.com/wanyaworld", "followers_url": "https://api.github.com/users/wanyaworld/followers", "following_url": "https://api.github.com/users/wanyaworld/following{/other_user}", "gists_url": "https://api.github.com/users/wanyaworld/gists{/gist_id}", "starred_url": "https://api.github.com/users/wanyaworld/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wanyaworld/subscriptions", "organizations_url": "https://api.github.com/users/wanyaworld/orgs", "repos_url": "https://api.github.com/users/wanyaworld/repos", "events_url": "https://api.github.com/users/wanyaworld/events{/privacy}", "received_events_url": "https://api.github.com/users/wanyaworld/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41555/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41555/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41554
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41554/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41554/comments
https://api.github.com/repos/huggingface/transformers/issues/41554/events
https://github.com/huggingface/transformers/issues/41554
3,511,764,600
I_kwDOCUB6oc7RUUZ4
41,554
model.from_pretrained( . . . ) not loading needed weights/parameters
{ "login": "lorsonblair", "id": 56801497, "node_id": "MDQ6VXNlcjU2ODAxNDk3", "avatar_url": "https://avatars.githubusercontent.com/u/56801497?v=4", "gravatar_id": "", "url": "https://api.github.com/users/lorsonblair", "html_url": "https://github.com/lorsonblair", "followers_url": "https://api.github.com/users/lorsonblair/followers", "following_url": "https://api.github.com/users/lorsonblair/following{/other_user}", "gists_url": "https://api.github.com/users/lorsonblair/gists{/gist_id}", "starred_url": "https://api.github.com/users/lorsonblair/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/lorsonblair/subscriptions", "organizations_url": "https://api.github.com/users/lorsonblair/orgs", "repos_url": "https://api.github.com/users/lorsonblair/repos", "events_url": "https://api.github.com/users/lorsonblair/events{/privacy}", "received_events_url": "https://api.github.com/users/lorsonblair/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-13T23:20:20
2025-10-22T14:12:06
null
NONE
null
null
null
null
I am performing quantization of a PatchTSTForPrediction model and attempting to load a saved quantized model for testing. The model is saved using `model.save_pretrained( . . . )`. Testing proceeds perfectly when performed immediately after QAT (the Hugging Face Trainer handles loading at the end of training); however, when attempting to load a saved quantized (trained) model, the error below occurs. I perform all the pre-quantization preparation so that the model contains all the necessary parameters (untrained) and then try to load the saved checkpoint. How can I force `from_pretrained( . . . )` to load ALL required weights?

```
Some weights of the model checkpoint at ./checkpoints/ . . . were not used when initializing PatchTSTForPrediction: ['head.projection.calib_counter', 'head.projection.num_module_called', 'head.projection.obsrv_clipval', 'head.projection.obsrv_clipvaln', 'head.projection.obsrv_w_clipval', 'head.projection.quantize_feature.clip_val', 'head.projection.quantize_feature.clip_valn', 'head.projection.quantize_weight.clip_val', 'model.encoder.layers.0.ff.0.calib_counter', 'model.encoder.layers.0.ff.0.num_module_called', 'model.encoder.layers.0.ff.0.obsrv_clipval', 'model.encoder.layers.0.ff.0.obsrv_clipvaln', 'model.encoder.layers.0.ff.0.obsrv_w_clipval', 'model.encoder.layers.0.ff.0.quantize_feature.clip_val', 'model.encoder.layers.0.ff.0.quantize_feature.clip_valn', 'model.encoder.layers.0.ff.0.quantize_weight.clip_val', 'model.encoder.layers.0.ff.3.calib_counter', 'model.encoder.layers.0.ff.3.num_module_called', 'model.encoder.layers.0.ff.3.obsrv_clipval', 'model.encoder.layers.0.ff.3.obsrv_clipvaln', 'model.encoder.layers.0.ff.3.obsrv_w_clipval', 'model.encoder.layers.0.ff.3.quantize_feature.clip_val', 'model.encoder.layers.0.ff.3.quantize_feature.clip_valn', 'model.encoder.layers.0.ff.3.quantize_weight.clip_val', 'model.encoder.layers.0.self_attn.QBmm52.num_module_called', 'model.encoder.layers.0.self_attn.QBmm52.quantize_m1.clip_val', 'model.encoder.layers.0.self_attn.QBmm52.quantize_m1.clip_valn', 'model.encoder.layers.0.self_attn.QBmm52.quantize_m2.clip_val', 'model.encoder.layers.0.self_attn.QBmm52.quantize_m2.clip_valn', 'model.encoder.layers.0.self_attn.QBmm62.num_module_called', 'model.encoder.layers.0.self_attn.QBmm62.quantize_m1.clip_val', 'model.encoder.layers.0.self_attn.QBmm62.quantize_m1.clip_valn', 'model.encoder.layers.0.self_attn.QBmm62.quantize_m2.clip_val', 'model.encoder.layers.0.self_attn.QBmm62.quantize_m2.clip_valn', 'model.encoder.layers.0.self_attn.k_proj.calib_counter', 'model.encoder.layers.0.self_attn.k_proj.num_module_called', 'model.encoder.layers.0.self_attn.k_proj.obsrv_clipval', 'model.encoder.layers.0.self_attn.k_proj.obsrv_clipvaln', 'model.encoder.layers.0.self_attn.k_proj.obsrv_w_clipval', 'model.encoder.layers.0.self_attn.k_proj.quantize_feature.clip_val', 'model.encoder.layers.0.self_attn.k_proj.quantize_feature.clip_valn', 'model.encoder.layers.0.self_attn.k_proj.quantize_weight.clip_val', 'model.encoder.layers.0.self_attn.out_proj.calib_counter', . . .]
This IS expected if you are initializing PatchTSTForPrediction from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model). This IS NOT expected if you are initializing PatchTSTForPrediction from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
```

NB: QAT is simulated. Additional parameters are added to the model after `qmodel_prep` is called and QAT proceeds as normal. I am using IBM's fms-model-optimizer.
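One workaround (a sketch, not a `from_pretrained` flag) is to re-prepare the quantized model so it already owns the extra buffers, then load the checkpoint's state dict directly and fail loudly on any key that still does not map:

```python
def load_all_weights(model, state_dict):
    """Load every checkpoint entry into `model`, refusing silent drops.

    `model` is any torch.nn.Module-like object. With strict=False,
    load_state_dict returns the keys it could not place; we turn those
    into a hard error instead of the warning from_pretrained emits.
    """
    missing, unexpected = model.load_state_dict(state_dict, strict=False)
    if missing or unexpected:
        raise RuntimeError(
            f"State dict did not fully map: missing={missing}, unexpected={unexpected}"
        )
    return model
```

With a real model this would be called as `load_all_weights(model, torch.load(path, map_location="cpu"))` after `qmodel_prep` has added the quantization parameters.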
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41554/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41554/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41553
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41553/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41553/comments
https://api.github.com/repos/huggingface/transformers/issues/41553/events
https://github.com/huggingface/transformers/issues/41553
3,511,681,654
I_kwDOCUB6oc7RUAJ2
41,553
Bad error message for AutoTokenizer loading Voxtral
{ "login": "jackzhxng", "id": 32371937, "node_id": "MDQ6VXNlcjMyMzcxOTM3", "avatar_url": "https://avatars.githubusercontent.com/u/32371937?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jackzhxng", "html_url": "https://github.com/jackzhxng", "followers_url": "https://api.github.com/users/jackzhxng/followers", "following_url": "https://api.github.com/users/jackzhxng/following{/other_user}", "gists_url": "https://api.github.com/users/jackzhxng/gists{/gist_id}", "starred_url": "https://api.github.com/users/jackzhxng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jackzhxng/subscriptions", "organizations_url": "https://api.github.com/users/jackzhxng/orgs", "repos_url": "https://api.github.com/users/jackzhxng/repos", "events_url": "https://api.github.com/users/jackzhxng/events{/privacy}", "received_events_url": "https://api.github.com/users/jackzhxng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 1990918270, "node_id": "MDU6TGFiZWwxOTkwOTE4Mjcw", "url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20First%20Issue", "name": "Good First Issue", "color": "bbf794", "default": false, "description": "" }, { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
open
false
null
[]
null
[]
2025-10-13T22:37:26
2025-10-23T18:25:52
null
CONTRIBUTOR
null
null
null
null
### System Info Getting the following unhelpful error when trying to load Voxtral's tokenizer with `AutoTokenizer` without `mistral-common` installed. ``` ../../.conda/envs/et_new/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:1144: in from_pretrained return tokenizer_class_fast.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs) ../../.conda/envs/et_new/lib/python3.10/site-packages/transformers/tokenization_utils_base.py:2070: in from_pretrained return cls._from_pretrained( ../../.conda/envs/et_new/lib/python3.10/site-packages/transformers/tokenization_utils_base.py:2108: in _from_pretrained slow_tokenizer = (cls.slow_tokenizer_class)._from_pretrained( ../../.conda/envs/et_new/lib/python3.10/site-packages/transformers/tokenization_utils_base.py:2316: in _from_pretrained tokenizer = cls(*init_inputs, **init_kwargs) ../../.conda/envs/et_new/lib/python3.10/site-packages/transformers/models/llama/tokenization_llama.py:171: in __init__ self.sp_model = self.get_spm_processor(kwargs.pop("from_slow", False)) ../../.conda/envs/et_new/lib/python3.10/site-packages/transformers/models/llama/tokenization_llama.py:198: in get_spm_processor tokenizer.Load(self.vocab_file) ../../.conda/envs/et_new/lib/python3.10/site-packages/sentencepiece/__init__.py:961: in Load return self.LoadFromFile(model_file) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <sentencepiece.SentencePieceProcessor; proxy of <Swig Object of type 'sentencepiece::SentencePieceProcessor *' at 0x7f5e7e25f780> >, arg = None def LoadFromFile(self, arg): > return _sentencepiece.SentencePieceProcessor_LoadFromFile(self, arg) E TypeError: not a string ../../.conda/envs/et_new/lib/python3.10/site-packages/sentencepiece/__init__.py:316: TypeError ``` ### Who can help? 
@ArthurZucker ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction 1. `pip install transformers` 2. `from transformers import AutoTokenizer; AutoTokenizer.from_pretrained("mistralai/Voxtral-Mini-3B-2507")` ### Expected behavior A clearer error message suggesting `pip install mistral-common`
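One way the suggested improvement could look (an illustrative pattern, not the actual transformers code): probe for the optional dependency up front and raise an actionable message instead of the confusing downstream TypeError.

```python
import importlib.util

def require_package(name: str, pip_name: str, purpose: str) -> None:
    """Raise an ImportError that tells the user exactly what to install.

    Probes for `name` without importing it, so a missing optional
    dependency fails early with a clear, actionable message.
    """
    if importlib.util.find_spec(name) is None:
        raise ImportError(
            f"{purpose} requires the `{pip_name}` package. "
            f"Install it with `pip install {pip_name}`."
        )

# e.g. before constructing the Voxtral tokenizer:
# require_package("mistral_common", "mistral-common", "Loading this tokenizer")
```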
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41553/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41553/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41552
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41552/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41552/comments
https://api.github.com/repos/huggingface/transformers/issues/41552/events
https://github.com/huggingface/transformers/pull/41552
3,511,393,470
PR_kwDOCUB6oc6tiPPv
41,552
[Bugfix] Fix quantizer import
{ "login": "kylesayrs", "id": 17103692, "node_id": "MDQ6VXNlcjE3MTAzNjky", "avatar_url": "https://avatars.githubusercontent.com/u/17103692?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kylesayrs", "html_url": "https://github.com/kylesayrs", "followers_url": "https://api.github.com/users/kylesayrs/followers", "following_url": "https://api.github.com/users/kylesayrs/following{/other_user}", "gists_url": "https://api.github.com/users/kylesayrs/gists{/gist_id}", "starred_url": "https://api.github.com/users/kylesayrs/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kylesayrs/subscriptions", "organizations_url": "https://api.github.com/users/kylesayrs/orgs", "repos_url": "https://api.github.com/users/kylesayrs/repos", "events_url": "https://api.github.com/users/kylesayrs/events{/privacy}", "received_events_url": "https://api.github.com/users/kylesayrs/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-13T20:29:00
2025-10-14T14:39:56
2025-10-14T14:36:29
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41552", "html_url": "https://github.com/huggingface/transformers/pull/41552", "diff_url": "https://github.com/huggingface/transformers/pull/41552.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41552.patch", "merged_at": null }
## Purpose ## * Fix bad import in `src/transformers/quantizers/base.py` ## Changes ## * Import `PreTrainedModel` dynamically, rather than only importing when type checking * The class is needed for an `isinstance` check ## Testing ## * Code that calls `_assign_original_dtype`, which was previously failing, now passes * Can provide a minimal example if required
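The pattern this PR describes can be sketched generically (using a stdlib class as a stand-in for `PreTrainedModel`; the real fix imports from transformers inside the method that does the check):

```python
import importlib
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Type-checking-only imports do not exist at runtime, so an
    # isinstance check against them would raise NameError. Keep them
    # here for annotations and resolve the class dynamically instead.
    from collections import OrderedDict

def isinstance_by_name(obj, module_name: str, class_name: str) -> bool:
    """Resolve a class at call time and run the isinstance check.

    Deferring the import sidesteps the circular-import problem that
    forced the module-level import behind TYPE_CHECKING initially.
    """
    cls = getattr(importlib.import_module(module_name), class_name)
    return isinstance(obj, cls)
```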
{ "login": "kylesayrs", "id": 17103692, "node_id": "MDQ6VXNlcjE3MTAzNjky", "avatar_url": "https://avatars.githubusercontent.com/u/17103692?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kylesayrs", "html_url": "https://github.com/kylesayrs", "followers_url": "https://api.github.com/users/kylesayrs/followers", "following_url": "https://api.github.com/users/kylesayrs/following{/other_user}", "gists_url": "https://api.github.com/users/kylesayrs/gists{/gist_id}", "starred_url": "https://api.github.com/users/kylesayrs/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kylesayrs/subscriptions", "organizations_url": "https://api.github.com/users/kylesayrs/orgs", "repos_url": "https://api.github.com/users/kylesayrs/repos", "events_url": "https://api.github.com/users/kylesayrs/events{/privacy}", "received_events_url": "https://api.github.com/users/kylesayrs/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41552/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41552/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41551
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41551/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41551/comments
https://api.github.com/repos/huggingface/transformers/issues/41551/events
https://github.com/huggingface/transformers/pull/41551
3,511,024,730
PR_kwDOCUB6oc6tg-iw
41,551
upgrade xpu docker file to torch 2.8
{ "login": "yao-matrix", "id": 7245027, "node_id": "MDQ6VXNlcjcyNDUwMjc=", "avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yao-matrix", "html_url": "https://github.com/yao-matrix", "followers_url": "https://api.github.com/users/yao-matrix/followers", "following_url": "https://api.github.com/users/yao-matrix/following{/other_user}", "gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}", "starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions", "organizations_url": "https://api.github.com/users/yao-matrix/orgs", "repos_url": "https://api.github.com/users/yao-matrix/repos", "events_url": "https://api.github.com/users/yao-matrix/events{/privacy}", "received_events_url": "https://api.github.com/users/yao-matrix/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-13T18:16:22
2025-10-22T17:27:52
2025-10-21T08:03:04
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41551", "html_url": "https://github.com/huggingface/transformers/pull/41551", "diff_url": "https://github.com/huggingface/transformers/pull/41551.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41551.patch", "merged_at": "2025-10-21T08:03:04" }
@ydshieh, please help review, thanks very much. The pass rate is 92.4% (25446 passing out of 27546 total cases); we are working on the issues found during this release cycle, and PRs are merged or under development. Thanks for the great support; the XPU pass rate on transformers is improving step by step.
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41551/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41551/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41550
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41550/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41550/comments
https://api.github.com/repos/huggingface/transformers/issues/41550/events
https://github.com/huggingface/transformers/pull/41550
3,510,798,558
PR_kwDOCUB6oc6tgM4x
41,550
Fix : Add VideoInpaintPipeline for temporally-consistent diffusion-based video inpainting
{ "login": "Dibbu-cell", "id": 191691009, "node_id": "U_kgDOC2z5AQ", "avatar_url": "https://avatars.githubusercontent.com/u/191691009?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Dibbu-cell", "html_url": "https://github.com/Dibbu-cell", "followers_url": "https://api.github.com/users/Dibbu-cell/followers", "following_url": "https://api.github.com/users/Dibbu-cell/following{/other_user}", "gists_url": "https://api.github.com/users/Dibbu-cell/gists{/gist_id}", "starred_url": "https://api.github.com/users/Dibbu-cell/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Dibbu-cell/subscriptions", "organizations_url": "https://api.github.com/users/Dibbu-cell/orgs", "repos_url": "https://api.github.com/users/Dibbu-cell/repos", "events_url": "https://api.github.com/users/Dibbu-cell/events{/privacy}", "received_events_url": "https://api.github.com/users/Dibbu-cell/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 9258341780, "node_id": "LA_kwDOCUB6oc8AAAACJ9cVlA", "url": "https://api.github.com/repos/huggingface/transformers/labels/Code%20agent%20slop", "name": "Code agent slop", "color": "C59579", "default": false, "description": "" } ]
closed
false
null
[]
null
[]
2025-10-13T17:03:46
2025-10-14T12:39:37
2025-10-14T12:39:37
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41550", "html_url": "https://github.com/huggingface/transformers/pull/41550", "diff_url": "https://github.com/huggingface/transformers/pull/41550.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41550.patch", "merged_at": null }
# What does this PR do? Adds a VideoInpaintPipeline prototype (under examples/video_inpaint) that demonstrates a Diffusers-style from_pretrained factory and a temporally-coherent inpainting workflow with optional optical-flow warping and latent-reuse, using a lightweight mock backend so tests run without large model downloads. Includes a fast unit test, README, and PR description documenting the API, acceptance criteria, and optional heavy dependencies (opencv/diffusers/RAFT/GMFlow) so reviewers can evaluate the design and merge a production implementation in follow-ups. <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes #41543 ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? 
Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker @Cyrilvallez - vision models: @yonigozlan @molbap - audio models: @eustlb @ebezzam @vasqu - multimodal models: @zucchini-nlp - graph models: @clefourrier Library: - generate: @zucchini-nlp (visual-language models) or @gante (all others) - continuous batching: @remi-or @ArthurZucker @McPatate - pipelines: @Rocketknight1 - tokenizers: @ArthurZucker and @itazap - trainer: @zach-huggingface @SunMarc - attention: @vasqu @ArthurZucker @CyrilVallez - model loading (from pretrained, etc): @CyrilVallez - distributed: @3outeille @ArthurZucker @S1ro1 - CIs: @ydshieh Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber - kernels: @MekkCyber @drbh - peft: @BenjaminBossan @githubnemo Devices/Backends: - AMD ROCm: @ivarflakstad - Intel XPU: @IlyasMoutawwakil - Ascend NPU: @ivarflakstad Documentation: @stevhliu Research projects are not maintained and should be taken as is. -->
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41550/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41550/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41549
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41549/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41549/comments
https://api.github.com/repos/huggingface/transformers/issues/41549/events
https://github.com/huggingface/transformers/pull/41549
3,510,775,257
PR_kwDOCUB6oc6tgHmn
41,549
🚨 Refactor DETR to updated standards
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-13T16:57:33
2025-10-20T14:01:24
null
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41549", "html_url": "https://github.com/huggingface/transformers/pull/41549", "diff_url": "https://github.com/huggingface/transformers/pull/41549.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41549.patch", "merged_at": null }
# What does this PR do? This PR aims at refactoring DETR as part of an effort to standardize vision models in the library, in the same vein as https://github.com/huggingface/transformers/pull/41546. Expect to see many more PRs like this for vision models as we approach v5!
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41549/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41549/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41548
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41548/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41548/comments
https://api.github.com/repos/huggingface/transformers/issues/41548/events
https://github.com/huggingface/transformers/pull/41548
3,510,628,406
PR_kwDOCUB6oc6tfnuW
41,548
[device_map] Accelerate loading by computing device_map much faster
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-13T16:07:25
2025-10-21T15:22:09
2025-10-15T09:18:57
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41548", "html_url": "https://github.com/huggingface/transformers/pull/41548", "diff_url": "https://github.com/huggingface/transformers/pull/41548.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41548.patch", "merged_at": "2025-10-15T09:18:57" }
# What does this PR do? If a model has a lot of different parameters (which is the case for models with a lot of experts, such as Qwen3Next), the computation of the device_map itself (yes, just the mapping of modules to devices) is EXTREMELY slow when using `accelerate`. This is because `accelerate` traverses the model graph in a very non-optimized manner, resulting in quadratic complexity in the number of parameters and modules, when we could do it in linear time instead. For example, `Qwen/Qwen3-Next-80B-A3B-Instruct` has a total of `173529` parameters and intermediate modules, which is a lot. So doing the following: ```python from transformers import AutoConfig, Qwen3NextForCausalLM from transformers.modeling_utils import _get_device_map import torch import time model_name = "Qwen/Qwen3-Next-80B-A3B-Instruct" config = AutoConfig.from_pretrained(model_name) with torch.device("meta"): model = Qwen3NextForCausalLM(config) t0 = time.time() device_map = _get_device_map(model, device_map="auto", dtype=torch.bfloat16, max_memory=None, hf_quantizer=None, keep_in_fp32_regex=None) dt = time.time() - t0 print(f"Time needed: {dt:.2f} s") >>> Time needed: 60.02 s ``` takes an UNBEARABLE `60.02` s on my MacBook Pro M3. And this is only for the device_map computation; the loading has not even started yet... This PR fixes the issue by rewriting the needed functions from `accelerate` with much better complexity and operations directly in Transformers, leading to linear time and the following: ```python >>> Time needed: 1.33 s ``` So we effectively go ~45x faster, from about 1 min down to about 1.3 s. The device_maps created are exactly the same.
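The linear-time idea described in the PR body can be sketched as follows (an assumption about the approach, not the actual transformers implementation): given a flat mapping of parameter names to byte sizes, per-module totals are built in a single pass by crediting each parameter to every ancestor module, instead of re-walking the module tree once per module (which is what makes the naive approach quadratic).

```python
def module_sizes(param_sizes: dict) -> dict:
    """One-pass accumulation of per-module parameter sizes.

    param_sizes maps dotted parameter names (e.g. "layers.0.attn.weight")
    to their size in bytes; the result maps every module prefix (including
    "" for the root) to its total size.
    """
    totals = {"": 0}
    for name, nbytes in param_sizes.items():
        parts = name.split(".")
        # Credit this parameter to the root ("") and every ancestor module.
        for i in range(len(parts)):
            prefix = ".".join(parts[:i])
            totals[prefix] = totals.get(prefix, 0) + nbytes
    return totals


# Hypothetical toy model for illustration.
params = {
    "embed.weight": 100,
    "layers.0.attn.weight": 40,
    "layers.0.mlp.weight": 60,
    "layers.1.attn.weight": 40,
}
print(module_sizes(params)["layers.0"])  # → 100
```

Each parameter is visited once, and the inner loop is bounded by the module depth, so the whole map is built in time linear in the number of parameters (times depth) rather than quadratic in the number of modules.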
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41548/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41548/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41547
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41547/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41547/comments
https://api.github.com/repos/huggingface/transformers/issues/41547/events
https://github.com/huggingface/transformers/pull/41547
3,510,583,164
PR_kwDOCUB6oc6tfdtI
41,547
Fix time issue
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-13T15:53:49
2025-10-13T16:06:53
2025-10-13T16:06:39
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41547", "html_url": "https://github.com/huggingface/transformers/pull/41547", "diff_url": "https://github.com/huggingface/transformers/pull/41547.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41547.patch", "merged_at": null }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker @Cyrilvallez - vision models: @yonigozlan @molbap - audio models: @eustlb @ebezzam @vasqu - multimodal models: @zucchini-nlp - graph models: @clefourrier Library: - generate: @zucchini-nlp (visual-language models) or @gante (all others) - continuous batching: @remi-or @ArthurZucker @McPatate - pipelines: @Rocketknight1 - tokenizers: @ArthurZucker and @itazap - trainer: @zach-huggingface @SunMarc - attention: @vasqu @ArthurZucker @CyrilVallez - model loading (from pretrained, etc): @CyrilVallez - distributed: @3outeille @ArthurZucker @S1ro1 - CIs: @ydshieh Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber - kernels: @MekkCyber @drbh - peft: @BenjaminBossan @githubnemo Devices/Backends: - AMD ROCm: @ivarflakstad - Intel XPU: @IlyasMoutawwakil - Ascend NPU: @ivarflakstad Documentation: @stevhliu Research projects are not maintained and should be taken as is. -->
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41547/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41547/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41546
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41546/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41546/comments
https://api.github.com/repos/huggingface/transformers/issues/41546/events
https://github.com/huggingface/transformers/pull/41546
3,510,474,948
PR_kwDOCUB6oc6tfGBC
41,546
Modernize CLIP modeling code
{ "login": "molbap", "id": 39954772, "node_id": "MDQ6VXNlcjM5OTU0Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/molbap", "html_url": "https://github.com/molbap", "followers_url": "https://api.github.com/users/molbap/followers", "following_url": "https://api.github.com/users/molbap/following{/other_user}", "gists_url": "https://api.github.com/users/molbap/gists{/gist_id}", "starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/molbap/subscriptions", "organizations_url": "https://api.github.com/users/molbap/orgs", "repos_url": "https://api.github.com/users/molbap/repos", "events_url": "https://api.github.com/users/molbap/events{/privacy}", "received_events_url": "https://api.github.com/users/molbap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-10-13T15:21:26
2025-10-21T14:04:46
2025-10-21T14:04:43
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41546", "html_url": "https://github.com/huggingface/transformers/pull/41546", "diff_url": "https://github.com/huggingface/transformers/pull/41546.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41546.patch", "merged_at": "2025-10-21T14:04:43" }
# What does this PR do? As per title. Adds the nice recent modeling utils to CLIP. Motivated by wanting to use them in the long-standing #33962.
{ "login": "molbap", "id": 39954772, "node_id": "MDQ6VXNlcjM5OTU0Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/molbap", "html_url": "https://github.com/molbap", "followers_url": "https://api.github.com/users/molbap/followers", "following_url": "https://api.github.com/users/molbap/following{/other_user}", "gists_url": "https://api.github.com/users/molbap/gists{/gist_id}", "starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/molbap/subscriptions", "organizations_url": "https://api.github.com/users/molbap/orgs", "repos_url": "https://api.github.com/users/molbap/repos", "events_url": "https://api.github.com/users/molbap/events{/privacy}", "received_events_url": "https://api.github.com/users/molbap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41546/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41546/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/41545
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41545/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41545/comments
https://api.github.com/repos/huggingface/transformers/issues/41545/events
https://github.com/huggingface/transformers/pull/41545
3,510,149,897
PR_kwDOCUB6oc6td_l5
41,545
TDT for HF
{ "login": "hainan-xv", "id": 5440014, "node_id": "MDQ6VXNlcjU0NDAwMTQ=", "avatar_url": "https://avatars.githubusercontent.com/u/5440014?v=4", "gravatar_id": "", "url": "https://api.github.com/users/hainan-xv", "html_url": "https://github.com/hainan-xv", "followers_url": "https://api.github.com/users/hainan-xv/followers", "following_url": "https://api.github.com/users/hainan-xv/following{/other_user}", "gists_url": "https://api.github.com/users/hainan-xv/gists{/gist_id}", "starred_url": "https://api.github.com/users/hainan-xv/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hainan-xv/subscriptions", "organizations_url": "https://api.github.com/users/hainan-xv/orgs", "repos_url": "https://api.github.com/users/hainan-xv/repos", "events_url": "https://api.github.com/users/hainan-xv/events{/privacy}", "received_events_url": "https://api.github.com/users/hainan-xv/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 6470596964, "node_id": "LA_kwDOCUB6oc8AAAABga15ZA", "url": "https://api.github.com/repos/huggingface/transformers/labels/Audio", "name": "Audio", "color": "760453", "default": false, "description": "" } ]
open
false
{ "login": "eustlb", "id": 94853470, "node_id": "U_kgDOBadZXg", "avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4", "gravatar_id": "", "url": "https://api.github.com/users/eustlb", "html_url": "https://github.com/eustlb", "followers_url": "https://api.github.com/users/eustlb/followers", "following_url": "https://api.github.com/users/eustlb/following{/other_user}", "gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}", "starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/eustlb/subscriptions", "organizations_url": "https://api.github.com/users/eustlb/orgs", "repos_url": "https://api.github.com/users/eustlb/repos", "events_url": "https://api.github.com/users/eustlb/events{/privacy}", "received_events_url": "https://api.github.com/users/eustlb/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "eustlb", "id": 94853470, "node_id": "U_kgDOBadZXg", "avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4", "gravatar_id": "", "url": "https://api.github.com/users/eustlb", "html_url": "https://github.com/eustlb", "followers_url": "https://api.github.com/users/eustlb/followers", "following_url": "https://api.github.com/users/eustlb/following{/other_user}", "gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}", "starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/eustlb/subscriptions", "organizations_url": "https://api.github.com/users/eustlb/orgs", "repos_url": "https://api.github.com/users/eustlb/repos", "events_url": "https://api.github.com/users/eustlb/events{/privacy}", "received_events_url": "https://api.github.com/users/eustlb/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
[]
2025-10-13T13:46:50
2025-10-23T22:10:35
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/41545", "html_url": "https://github.com/huggingface/transformers/pull/41545", "diff_url": "https://github.com/huggingface/transformers/pull/41545.diff", "patch_url": "https://github.com/huggingface/transformers/pull/41545.patch", "merged_at": null }
# What does this PR do? Parakeet TDT model integration. Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41545/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/huggingface/transformers/issues/41545/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/41544
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41544/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41544/comments
https://api.github.com/repos/huggingface/transformers/issues/41544/events
https://github.com/huggingface/transformers/issues/41544
3,510,051,922
I_kwDOCUB6oc7RNyRS
41,544
ValueError: You current version of `autoawq` does not support module quantization skipping, please upgrade `autoawq` package to at least 0.1.8.
{ "login": "EIIvy", "id": 48089890, "node_id": "MDQ6VXNlcjQ4MDg5ODkw", "avatar_url": "https://avatars.githubusercontent.com/u/48089890?v=4", "gravatar_id": "", "url": "https://api.github.com/users/EIIvy", "html_url": "https://github.com/EIIvy", "followers_url": "https://api.github.com/users/EIIvy/followers", "following_url": "https://api.github.com/users/EIIvy/following{/other_user}", "gists_url": "https://api.github.com/users/EIIvy/gists{/gist_id}", "starred_url": "https://api.github.com/users/EIIvy/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/EIIvy/subscriptions", "organizations_url": "https://api.github.com/users/EIIvy/orgs", "repos_url": "https://api.github.com/users/EIIvy/repos", "events_url": "https://api.github.com/users/EIIvy/events{/privacy}", "received_events_url": "https://api.github.com/users/EIIvy/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-10-13T13:20:14
2025-10-15T12:34:02
null
NONE
null
null
null
null
When I run `model = Qwen2_5_VLForConditionalGeneration.from_pretrained("Qwen/Qwen2.5-VL-72B-Instruct-AWQ", torch_dtype=torch.float16, device_map="auto")` followed by `processor = AutoProcessor.from_pretrained("Qwen/Qwen2.5-VL-72B-Instruct-AWQ")`, I get this error. My package versions: torch==2.5.1, transformers==4.50.0
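The ValueError in the title is a version gate: transformers checks the installed `autoawq` before honoring module-quantization skipping and raises when it is older than 0.1.8 (the threshold quoted in the message). A minimal sketch of that kind of check follows; the `parse_version` / `supports_module_skipping` helpers are hypothetical illustrations, not the library's actual code.

```python
# Hypothetical sketch of the version gate behind this error. The 0.1.8
# threshold comes from the error text; the helpers are illustrative only.

def parse_version(v: str) -> tuple:
    """Turn a dotted version string like '0.1.7' into (0, 1, 7)."""
    return tuple(int(part) for part in v.split("."))

MIN_AUTOAWQ = parse_version("0.1.8")

def supports_module_skipping(installed: str) -> bool:
    """True when the installed autoawq meets the minimum version."""
    return parse_version(installed) >= MIN_AUTOAWQ

print(supports_module_skipping("0.1.7"))  # False -> the ValueError path
print(supports_module_skipping("0.2.0"))  # True
```

Upgrading, e.g. `pip install -U autoawq`, to 0.1.8 or newer is the fix the message itself points at.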
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41544/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41544/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/41543
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/41543/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/41543/comments
https://api.github.com/repos/huggingface/transformers/issues/41543/events
https://github.com/huggingface/transformers/issues/41543
3,509,930,318
I_kwDOCUB6oc7RNUlO
41,543
Add VideoInpaintPipeline for temporally-consistent diffusion-based video inpainting
{ "login": "Aki-07", "id": 95642646, "node_id": "U_kgDOBbNkFg", "avatar_url": "https://avatars.githubusercontent.com/u/95642646?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Aki-07", "html_url": "https://github.com/Aki-07", "followers_url": "https://api.github.com/users/Aki-07/followers", "following_url": "https://api.github.com/users/Aki-07/following{/other_user}", "gists_url": "https://api.github.com/users/Aki-07/gists{/gist_id}", "starred_url": "https://api.github.com/users/Aki-07/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Aki-07/subscriptions", "organizations_url": "https://api.github.com/users/Aki-07/orgs", "repos_url": "https://api.github.com/users/Aki-07/repos", "events_url": "https://api.github.com/users/Aki-07/events{/privacy}", "received_events_url": "https://api.github.com/users/Aki-07/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
closed
false
null
[]
null
[]
2025-10-13T12:45:31
2025-10-17T06:37:43
2025-10-17T06:37:43
CONTRIBUTOR
null
null
null
null
### Feature request

Introduce a new pipeline to extend existing image inpainting capabilities (StableDiffusionInpaintPipeline) to videos. The goal is to provide a native, GPU-optimized API within Diffusers that performs temporally coherent video inpainting instead of independent per-frame processing.

### Motivation

Current video inpainting approaches in the community simply loop over frames and call the image inpainting pipeline repeatedly. This leads to:

- Temporal flicker and inconsistent textures between frames.
- Poor GPU utilization and high memory overhead.
- Lack of tools to maintain motion coherence or reuse diffusion latents across time.

A built-in VideoInpaintPipeline would make it possible to remove objects, restore scenes, or creatively edit videos using diffusion models while keeping motion and lighting consistent across frames.

### Your contribution

I plan to:

- Implement VideoInpaintPipeline as a subclass of DiffusionPipeline, leveraging StableDiffusionInpaintPipeline under the hood.
- Add temporal consistency mechanisms, such as latent reuse between frames and optional optical-flow–guided warping (RAFT / GMFlow).
- Optimize performance through batched FP16 inference, scheduler noise reuse, and optional torch.compile acceleration.

Provide a clean user API compatible with existing pipelines:

```
from diffusers import VideoInpaintPipeline

pipe = VideoInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",
    use_optical_flow=True,
    compile=True,
)
result = pipe(
    video_path="input.mp4",
    mask_path="mask.mp4",
    prompt="replace background with a snowy mountain",
    num_inference_steps=10,
)
result.video.save("output.mp4")
```

- Contribute documentation and tests demonstrating temporal coherence, performance benchmarks, and example notebooks for real-world use.
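The latent-reuse idea in the contribution list can be sketched without any diffusion machinery: blending each frame's latent with a running average of earlier frames damps frame-to-frame noise, which stands in for the flicker the proposal targets. The sketch below is plain Python with latents flattened to vectors; `blend_latents` and the noise model are assumptions for illustration, not Diffusers code.

```python
import random

def blend_latents(frame_latents, alpha=0.7):
    """Exponentially blend per-frame latents: each frame keeps a share
    `alpha` of the running latent from earlier frames, damping
    frame-to-frame noise (a stand-in for temporal flicker)."""
    blended = [list(frame_latents[0])]
    for lat in frame_latents[1:]:
        prev = blended[-1]
        blended.append([alpha * p + (1 - alpha) * x for p, x in zip(prev, lat)])
    return blended

def mean_abs_diff(a, b):
    """Average per-element change between two consecutive frames."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

random.seed(0)
base = [random.gauss(0, 1) for _ in range(64)]          # shared scene content
frames = [[b + 0.5 * random.gauss(0, 1) for b in base]  # per-frame noise
          for _ in range(6)]

raw = [mean_abs_diff(a, b) for a, b in zip(frames, frames[1:])]
smooth = blend_latents(frames)
smoothed = [mean_abs_diff(a, b) for a, b in zip(smooth, smooth[1:])]
print(sum(smoothed) / len(smoothed) < sum(raw) / len(raw))  # True: less flicker
```

A real pipeline would apply this to 4D diffusion latents and combine it with optical-flow warping, but the averaging step is the core of why reuse reduces flicker relative to independent per-frame sampling.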
{ "login": "Aki-07", "id": 95642646, "node_id": "U_kgDOBbNkFg", "avatar_url": "https://avatars.githubusercontent.com/u/95642646?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Aki-07", "html_url": "https://github.com/Aki-07", "followers_url": "https://api.github.com/users/Aki-07/followers", "following_url": "https://api.github.com/users/Aki-07/following{/other_user}", "gists_url": "https://api.github.com/users/Aki-07/gists{/gist_id}", "starred_url": "https://api.github.com/users/Aki-07/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Aki-07/subscriptions", "organizations_url": "https://api.github.com/users/Aki-07/orgs", "repos_url": "https://api.github.com/users/Aki-07/repos", "events_url": "https://api.github.com/users/Aki-07/events{/privacy}", "received_events_url": "https://api.github.com/users/Aki-07/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/41543/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/41543/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true