url string | repository_url string | labels_url string | comments_url string | events_url string | html_url string | id int64 | node_id string | number int64 | title string | user dict | labels list | state string | locked bool | assignee dict | assignees list | milestone null | comments list | created_at timestamp[ms] | updated_at timestamp[ms] | closed_at timestamp[ms] | author_association string | type dict | active_lock_reason null | draft bool | pull_request dict | body string | closed_by dict | reactions dict | timeline_url string | performed_via_github_app null | state_reason string | sub_issues_summary dict | issue_dependencies_summary dict | is_pull_request bool | is_closed bool |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/huggingface/transformers/issues/39937 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39937/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39937/comments | https://api.github.com/repos/huggingface/transformers/issues/39937/events | https://github.com/huggingface/transformers/pull/39937 | 3,294,444,934 | PR_kwDOCUB6oc6iRMgM | 39,937 | Add Exception when flash_attn is not found and ModernBert is used | {
"login": "Darejkal",
"id": 55143337,
"node_id": "MDQ6VXNlcjU1MTQzMzM3",
"avatar_url": "https://avatars.githubusercontent.com/u/55143337?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Darejkal",
"html_url": "https://github.com/Darejkal",
"followers_url": "https://api.github.com/users/Darejkal/followers",
"following_url": "https://api.github.com/users/Darejkal/following{/other_user}",
"gists_url": "https://api.github.com/users/Darejkal/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Darejkal/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Darejkal/subscriptions",
"organizations_url": "https://api.github.com/users/Darejkal/orgs",
"repos_url": "https://api.github.com/users/Darejkal/repos",
"events_url": "https://api.github.com/users/Darejkal/events{/privacy}",
"received_events_url": "https://api.github.com/users/Darejkal/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T21:05:53 | 2025-08-07T19:07:18 | 2025-08-07T19:07:18 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39937",
"html_url": "https://github.com/huggingface/transformers/pull/39937",
"diff_url": "https://github.com/huggingface/transformers/pull/39937.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39937.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes #39934
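In spirit, the change adds a fail-fast guard along these lines (a hedged sketch only, not the actual diff; the helper name `require_flash_attn` and the message wording are invented here):

```python
import importlib.util

def require_flash_attn(attn_implementation: str) -> None:
    # Illustrative guard (not the real PR code): surface a clear ImportError up
    # front instead of a confusing failure inside ModernBert's attention setup.
    if attn_implementation == "flash_attention_2" and importlib.util.find_spec("flash_attn") is None:
        raise ImportError(
            "attn_implementation='flash_attention_2' requires the flash_attn "
            "package; install it or fall back to 'sdpa' or 'eager'."
        )

require_flash_attn("sdpa")  # no-op: flash attention was not requested
```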
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Darejkal",
"id": 55143337,
"node_id": "MDQ6VXNlcjU1MTQzMzM3",
"avatar_url": "https://avatars.githubusercontent.com/u/55143337?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Darejkal",
"html_url": "https://github.com/Darejkal",
"followers_url": "https://api.github.com/users/Darejkal/followers",
"following_url": "https://api.github.com/users/Darejkal/following{/other_user}",
"gists_url": "https://api.github.com/users/Darejkal/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Darejkal/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Darejkal/subscriptions",
"organizations_url": "https://api.github.com/users/Darejkal/orgs",
"repos_url": "https://api.github.com/users/Darejkal/repos",
"events_url": "https://api.github.com/users/Darejkal/events{/privacy}",
"received_events_url": "https://api.github.com/users/Darejkal/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39937/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39937/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39936 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39936/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39936/comments | https://api.github.com/repos/huggingface/transformers/issues/39936/events | https://github.com/huggingface/transformers/pull/39936 | 3,294,437,407 | PR_kwDOCUB6oc6iRK4- | 39,936 | fix typo | {
"login": "Tialo",
"id": 65392801,
"node_id": "MDQ6VXNlcjY1MzkyODAx",
"avatar_url": "https://avatars.githubusercontent.com/u/65392801?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Tialo",
"html_url": "https://github.com/Tialo",
"followers_url": "https://api.github.com/users/Tialo/followers",
"following_url": "https://api.github.com/users/Tialo/following{/other_user}",
"gists_url": "https://api.github.com/users/Tialo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Tialo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Tialo/subscriptions",
"organizations_url": "https://api.github.com/users/Tialo/orgs",
"repos_url": "https://api.github.com/users/Tialo/repos",
"events_url": "https://api.github.com/users/Tialo/events{/privacy}",
"received_events_url": "https://api.github.com/users/Tialo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T21:02:19 | 2025-08-06T16:21:57 | 2025-08-06T16:21:25 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39936",
"html_url": "https://github.com/huggingface/transformers/pull/39936",
"diff_url": "https://github.com/huggingface/transformers/pull/39936.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39936.patch",
"merged_at": "2025-08-06T16:21:25"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39936/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39936/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39935 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39935/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39935/comments | https://api.github.com/repos/huggingface/transformers/issues/39935/events | https://github.com/huggingface/transformers/issues/39935 | 3,294,429,243 | I_kwDOCUB6oc7EXQA7 | 39,935 | Still getting "fp16 mixed precision requires a GPU (not 'mps')." error | {
"login": "renet10",
"id": 12860891,
"node_id": "MDQ6VXNlcjEyODYwODkx",
"avatar_url": "https://avatars.githubusercontent.com/u/12860891?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/renet10",
"html_url": "https://github.com/renet10",
"followers_url": "https://api.github.com/users/renet10/followers",
"following_url": "https://api.github.com/users/renet10/following{/other_user}",
"gists_url": "https://api.github.com/users/renet10/gists{/gist_id}",
"starred_url": "https://api.github.com/users/renet10/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/renet10/subscriptions",
"organizations_url": "https://api.github.com/users/renet10/orgs",
"repos_url": "https://api.github.com/users/renet10/repos",
"events_url": "https://api.github.com/users/renet10/events{/privacy}",
"received_events_url": "https://api.github.com/users/renet10/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-05T21:00:03 | 2025-09-08T20:51:02 | 2025-09-08T20:49:33 | CONTRIBUTOR | null | null | null | null | ### System Info
This is a follow-up to issue #32648, which was closed as stale without resolution.
I am still getting this error when running the [ASR Task Recipe](https://huggingface.co/docs/transformers/tasks/asr).
System info:
- OS X Sequoia latest
- M2 Pro 16 GB RAM
Libraries:
- python 3.13.5
- transformers 4.52.4
- datasets 3.6.0
- torch 2.6.0
- torch-tb-profiler 0.4.3
- torchaudio 2.6.0
- torchinfo 1.8.0
- torchvision 0.21.0
Getting to the "Train" cell of the task recipe, I get:
```
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
Cell In[10], line 25
1 training_args = TrainingArguments(
2 output_dir=SAVE_DIR,
3 logging_dir=LOG_DIR,
(...) 22 push_to_hub=False,
23 )
---> 25 trainer = Trainer(
26 model=model,
27 args=training_args,
28 train_dataset=encoded_minds["train"],
29 eval_dataset=encoded_minds["test"],
30 processing_class=processor,
31 data_collator=data_collator,
32 compute_metrics=compute_metrics,
33 )
35 trainer.train()
File <venv>/lib/python3.13/site-packages/transformers/utils/deprecation.py:172, in deprecate_kwarg.<locals>.wrapper.<locals>.wrapped_func(*args, **kwargs)
168 elif minimum_action in (Action.NOTIFY, Action.NOTIFY_ALWAYS) and not is_torchdynamo_compiling():
169 # DeprecationWarning is ignored by default, so we use FutureWarning instead
170 warnings.warn(message, FutureWarning, stacklevel=2)
--> 172 return func(*args, **kwargs)
File <venv>/lib/python3.13/site-packages/transformers/trainer.py:465, in Trainer.__init__(self, model, args, data_collator, train_dataset, eval_dataset, processing_class, model_init, compute_loss_func, compute_metrics, callbacks, optimizers, optimizer_cls_and_kwargs, preprocess_logits_for_metrics)
463 self.is_in_train = False
464 self.model = model
--> 465 self.create_accelerator_and_postprocess()
467 # memory metrics - must set up as early as possible
468 self._memory_tracker = TrainerMemoryTracker(self.args.skip_memory_metrics)
File <venv>/lib/python3.13/site-packages/transformers/trainer.py:5208, in Trainer.create_accelerator_and_postprocess(self)
5205 raise ValueError("Requires accelerate>1.3.0 to use Tensor Parallelism.")
5207 # create accelerator object
-> 5208 self.accelerator = Accelerator(**args)
5209 # some Trainer classes need to use `gather` instead of `gather_for_metrics`, thus we store a flag
5210 self.gather_function = self.accelerator.gather_for_metrics
File <venv>/lib/python3.13/site-packages/accelerate/accelerator.py:567, in Accelerator.__init__(self, device_placement, split_batches, mixed_precision, gradient_accumulation_steps, cpu, dataloader_config, deepspeed_plugin, fsdp_plugin, torch_tp_plugin, megatron_lm_plugin, rng_types, log_with, project_dir, project_config, gradient_accumulation_plugin, step_scheduler_with_optimizer, kwargs_handlers, dynamo_backend, dynamo_plugin, deepspeed_plugins)
556 self.native_amp = True
557 if self.device.type not in (
558 "xpu",
559 "cuda",
(...) 565 "sdaa",
566 ) or is_torch_xla_available(check_is_tpu=True):
--> 567 raise ValueError(f"fp16 mixed precision requires a GPU (not {self.device.type!r}).")
568 kwargs = self.scaler_handler.to_kwargs() if self.scaler_handler is not None else {}
570 # FSDP2 doesn't use ShardedGradScaler, don't want to modify `get_grad_scaler`, rather create a simple utility
ValueError: fp16 mixed precision requires a GPU (not 'mps').
```
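The failing check can be restated as a small, dependency-free sketch (approximate: the exact set of supported backends lives in Accelerate and may differ by version):

```python
def check_fp16_supported(device_type: str) -> None:
    # Approximate restatement of the Accelerate guard in the traceback above;
    # the backend set here is an assumption, not the authoritative list.
    supported = {"cuda", "xpu", "npu", "mlu", "musa", "hpu", "sdaa"}
    if device_type not in supported:
        raise ValueError(f"fp16 mixed precision requires a GPU (not {device_type!r}).")

check_fp16_supported("cuda")  # passes
try:
    check_fp16_supported("mps")
except ValueError as err:
    print(err)  # fp16 mixed precision requires a GPU (not 'mps').
```

Until "mps" appears in that allow list, dropping `fp16=True` from `TrainingArguments` sidesteps the check.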
I found the previously referenced issue, which was closed as "Stale" in Sept 2024, with further comments from others who encountered the issue in Feb 2025. Looking through PyTorch commits, it appears a related PR was merged in Feb 2025. Is this issue now resolved in Hugging Face?
### Who can help?
@zach-huggingface @SunMarc
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
1. Load the [ASR Task Recipe](https://huggingface.co/docs/transformers/tasks/asr) notebook on an Apple device
2. Run all cells up to the training cell
### Expected behavior
FP16 works on MPS | {
"login": "renet10",
"id": 12860891,
"node_id": "MDQ6VXNlcjEyODYwODkx",
"avatar_url": "https://avatars.githubusercontent.com/u/12860891?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/renet10",
"html_url": "https://github.com/renet10",
"followers_url": "https://api.github.com/users/renet10/followers",
"following_url": "https://api.github.com/users/renet10/following{/other_user}",
"gists_url": "https://api.github.com/users/renet10/gists{/gist_id}",
"starred_url": "https://api.github.com/users/renet10/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/renet10/subscriptions",
"organizations_url": "https://api.github.com/users/renet10/orgs",
"repos_url": "https://api.github.com/users/renet10/repos",
"events_url": "https://api.github.com/users/renet10/events{/privacy}",
"received_events_url": "https://api.github.com/users/renet10/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39935/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39935/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39934 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39934/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39934/comments | https://api.github.com/repos/huggingface/transformers/issues/39934/events | https://github.com/huggingface/transformers/issues/39934 | 3,294,422,745 | I_kwDOCUB6oc7EXObZ | 39,934 | ModernBertUnpaddedRotaryEmbedding __init__ error | {
"login": "Darejkal",
"id": 55143337,
"node_id": "MDQ6VXNlcjU1MTQzMzM3",
"avatar_url": "https://avatars.githubusercontent.com/u/55143337?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Darejkal",
"html_url": "https://github.com/Darejkal",
"followers_url": "https://api.github.com/users/Darejkal/followers",
"following_url": "https://api.github.com/users/Darejkal/following{/other_user}",
"gists_url": "https://api.github.com/users/Darejkal/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Darejkal/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Darejkal/subscriptions",
"organizations_url": "https://api.github.com/users/Darejkal/orgs",
"repos_url": "https://api.github.com/users/Darejkal/repos",
"events_url": "https://api.github.com/users/Darejkal/events{/privacy}",
"received_events_url": "https://api.github.com/users/Darejkal/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-05T20:58:32 | 2025-08-07T19:07:50 | 2025-08-07T19:07:50 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.55.0
- Platform: Linux-6.6.56+-x86_64-with-glibc2.35
- Python version: 3.11.13
- Huggingface_hub version: 0.34.3
- Safetensors version: 0.5.3
- Accelerate version: 1.8.1
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.6.0+cu124 (CUDA)
- Tensorflow version (GPU?): 2.18.0 (True)
- Flax version (CPU?/GPU?/TPU?): 0.10.6 (gpu)
- Jax version: 0.5.2
- JaxLib version: 0.5.1
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: Tesla P100-PCIE-16GB
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Run
```python
from transformers import ModernBertForSequenceClassification, ModernBertConfig
conf= ModernBertConfig(
**{
"architectures": [
"ModernBertForSequenceClassification"
],
"attention_bias": False,
"attention_dropout": 0.0,
"classifier_activation": "gelu",
"classifier_bias": False,
"classifier_dropout": 0.0,
"classifier_pooling": "mean",
"cls_token_id": 50281,
"decoder_bias": True,
"deterministic_flash_attn": False,
"embedding_dropout": 0.0,
"eos_token_id": 50282,
"global_attn_every_n_layers": 3,
"global_rope_theta": 160000.0,
"gradient_checkpointing": False,
"hidden_activation": "gelu",
"hidden_size": 768,
"initializer_cutoff_factor": 2.0,
"initializer_range": 0.02,
"intermediate_size": 1152,
"layer_norm_eps": 1e-05,
"local_attention": 128,
"local_rope_theta": 10000.0,
"max_position_embeddings": 8192,
"mlp_bias": False,
"mlp_dropout": 0.0,
"model_type": "modernbert",
"norm_bias": False,
"norm_eps": 1e-05,
"num_attention_heads": 12,
"num_hidden_layers": 22,
"pad_token_id": 0,
"position_embedding_type": "absolute",
"tie_word_embeddings": True,
"torch_dtype": "float32",
"transformers_version": "4.47.0.dev0",
"vocab_size": 4096
}
)
model = ModernBertForSequenceClassification(conf)
```
Returns error:
```
/usr/local/lib/python3.11/dist-packages/transformers/models/modernbert/modeling_modernbert.py in __init__(self, config, layer_id)
469 rope_theta = config.global_rope_theta
470
--> 471 if config._attn_implementation == "flash_attention_2":
472 self.rotary_emb = ModernBertUnpaddedRotaryEmbedding(
473 dim=self.head_dim, max_seqlen=max_position_embeddings, base=rope_theta
/usr/local/lib/python3.11/dist-packages/transformers/models/modernbert/modeling_modernbert.py in __init__(self, dim, base, max_seqlen, device, dtype)
155 """
156 max_seqlen: if max_seqlen, device, and dtype are provided, we precompute the cos_sin_cache
--> 157 up to max_seqlen. If the max_seqlen, device, or dtype during training/inference differ,
158 the cos_sin_cache will be recomputed during the forward pass.
159 """
```
### Expected behavior
Model successfully initialized. | {
"login": "Darejkal",
"id": 55143337,
"node_id": "MDQ6VXNlcjU1MTQzMzM3",
"avatar_url": "https://avatars.githubusercontent.com/u/55143337?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Darejkal",
"html_url": "https://github.com/Darejkal",
"followers_url": "https://api.github.com/users/Darejkal/followers",
"following_url": "https://api.github.com/users/Darejkal/following{/other_user}",
"gists_url": "https://api.github.com/users/Darejkal/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Darejkal/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Darejkal/subscriptions",
"organizations_url": "https://api.github.com/users/Darejkal/orgs",
"repos_url": "https://api.github.com/users/Darejkal/repos",
"events_url": "https://api.github.com/users/Darejkal/events{/privacy}",
"received_events_url": "https://api.github.com/users/Darejkal/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39934/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39934/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39933 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39933/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39933/comments | https://api.github.com/repos/huggingface/transformers/issues/39933/events | https://github.com/huggingface/transformers/pull/39933 | 3,294,144,143 | PR_kwDOCUB6oc6iQLpP | 39,933 | Fix CI: Tests failing on CPU due to `torch.device('cpu').index` being None | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T19:09:03 | 2025-08-06T08:22:43 | 2025-08-06T08:22:43 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39933",
"html_url": "https://github.com/huggingface/transformers/pull/39933",
"diff_url": "https://github.com/huggingface/transformers/pull/39933.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39933.patch",
"merged_at": "2025-08-06T08:22:43"
`torch.device('cpu').index` is None, hence `int(routing_weights.device.index)` fails in tests run on CPU; see example [here](https://app.circleci.com/pipelines/github/huggingface/transformers/140894/workflows/8e8620a9-fc6b-45b9-b5bd-0eb23296b587/jobs/1865476)
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39933/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39933/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39932 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39932/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39932/comments | https://api.github.com/repos/huggingface/transformers/issues/39932/events | https://github.com/huggingface/transformers/issues/39932 | 3,294,111,993 | I_kwDOCUB6oc7EWCj5 | 39,932 | transformers serve doesn't handle OPTIONS http method | {
"login": "alew3",
"id": 500714,
"node_id": "MDQ6VXNlcjUwMDcxNA==",
"avatar_url": "https://avatars.githubusercontent.com/u/500714?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alew3",
"html_url": "https://github.com/alew3",
"followers_url": "https://api.github.com/users/alew3/followers",
"following_url": "https://api.github.com/users/alew3/following{/other_user}",
"gists_url": "https://api.github.com/users/alew3/gists{/gist_id}",
"starred_url": "https://api.github.com/users/alew3/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alew3/subscriptions",
"organizations_url": "https://api.github.com/users/alew3/orgs",
"repos_url": "https://api.github.com/users/alew3/repos",
"events_url": "https://api.github.com/users/alew3/events{/privacy}",
"received_events_url": "https://api.github.com/users/alew3/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-05T18:55:45 | 2025-08-06T14:59:54 | 2025-08-06T14:59:54 | NONE | null | null | null | null | ### System Info
I'm trying to run a model via `transformers serve`; it works fine via curl.
But when I put Open WebUI in front of it to connect to the server, the browser first sends an OPTIONS HTTP request, which gets denied:
```
INFO: 127.0.0.1:43740 - "OPTIONS /v1/chat/completions HTTP/1.1" 405 Method Not Allowed
```
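Fixing this means answering the browser's CORS preflight instead of rejecting it (the server appears to be uvicorn/FastAPI-based, where CORS middleware would normally do this). The bare-bones shape of a preflight response can be sketched with the standard library; the handler name and header values below are illustrative, not the actual `transformers serve` code:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class PreflightHandler(BaseHTTPRequestHandler):
    """Answers the browser's CORS preflight so the real POST can follow."""

    def do_OPTIONS(self):
        self.send_response(204)  # "204 No Content" is a valid preflight reply
        self.send_header("Access-Control-Allow-Origin", "*")
        self.send_header("Access-Control-Allow-Methods", "GET, POST, OPTIONS")
        self.send_header("Access-Control-Allow-Headers", "Content-Type, Authorization")
        self.end_headers()
```

With something like this in place, the browser's `OPTIONS /v1/chat/completions` gets a 2xx with the `Access-Control-Allow-*` headers and the chat request itself can proceed.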
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
1. Start the server:
```
transformers serve
```
2. Run Open WebUI and point it to `transformers serve`:
```
docker run -d --rm -p 3000:8080 \
-e MODEL=openai/gpt-oss-20b \
-v open-webui:/app/backend/data \
--name open-webui ghcr.io/open-webui/open-webui:main
```
Open Open WebUI in your browser at http://localhost:3000/.
Go to ⚙️ Admin Settings → Connections → OpenAI Connections.
Click ➕ Add Connection.
URL: use your server’s API endpoint (for example, http://localhost:8000/v1).
API Key: leave blank unless required.
Click Save.
Try to interact with the model.
### Expected behavior
it should work :-) | {
"login": "alew3",
"id": 500714,
"node_id": "MDQ6VXNlcjUwMDcxNA==",
"avatar_url": "https://avatars.githubusercontent.com/u/500714?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alew3",
"html_url": "https://github.com/alew3",
"followers_url": "https://api.github.com/users/alew3/followers",
"following_url": "https://api.github.com/users/alew3/following{/other_user}",
"gists_url": "https://api.github.com/users/alew3/gists{/gist_id}",
"starred_url": "https://api.github.com/users/alew3/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alew3/subscriptions",
"organizations_url": "https://api.github.com/users/alew3/orgs",
"repos_url": "https://api.github.com/users/alew3/repos",
"events_url": "https://api.github.com/users/alew3/events{/privacy}",
"received_events_url": "https://api.github.com/users/alew3/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39932/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39932/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39931 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39931/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39931/comments | https://api.github.com/repos/huggingface/transformers/issues/39931/events | https://github.com/huggingface/transformers/pull/39931 | 3,293,984,090 | PR_kwDOCUB6oc6iPphx | 39,931 | Registers StaticCache serialization functions for torch.export.export | {
"login": "xadupre",
"id": 22452781,
"node_id": "MDQ6VXNlcjIyNDUyNzgx",
"avatar_url": "https://avatars.githubusercontent.com/u/22452781?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xadupre",
"html_url": "https://github.com/xadupre",
"followers_url": "https://api.github.com/users/xadupre/followers",
"following_url": "https://api.github.com/users/xadupre/following{/other_user}",
"gists_url": "https://api.github.com/users/xadupre/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xadupre/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xadupre/subscriptions",
"organizations_url": "https://api.github.com/users/xadupre/orgs",
"repos_url": "https://api.github.com/users/xadupre/repos",
"events_url": "https://api.github.com/users/xadupre/events{/privacy}",
"received_events_url": "https://api.github.com/users/xadupre/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-05T18:04:51 | 2025-08-22T09:05:44 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39931",
"html_url": "https://github.com/huggingface/transformers/pull/39931",
"diff_url": "https://github.com/huggingface/transformers/pull/39931.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39931.patch",
"merged_at": null
} | # What does this PR do?
Registers serialization functions for StaticCache. After this, torch.export.export does not need extra work to export a model using a StaticCache.
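The flatten/unflatten contract that such a registration satisfies (ultimately passed to `torch.utils._pytree.register_pytree_node`) can be illustrated with plain Python; `StaticCacheSketch` and its fields below are stand-ins, not the actual transformers `StaticCache` internals:

```python
# Minimal illustration of the flatten/unflatten pair a pytree
# registration needs: flatten returns the leaves plus any static
# context, and unflatten rebuilds an equivalent object from them.

class StaticCacheSketch:
    def __init__(self, key_states, value_states):
        self.key_states = key_states      # per-layer "tensors" (lists here)
        self.value_states = value_states

def flatten_cache(cache):
    children = (cache.key_states, cache.value_states)
    context = None  # nothing static to carry for this sketch
    return children, context

def unflatten_cache(children, context):
    key_states, value_states = children
    return StaticCacheSketch(key_states, value_states)

# With torch available, the real registration would look roughly like:
# torch.utils._pytree.register_pytree_node(
#     StaticCache, flatten_cache, unflatten_cache)

cache = StaticCacheSketch([1, 2], [3, 4])
children, ctx = flatten_cache(cache)
rebuilt = unflatten_cache(children, ctx)
```

Once registered, `torch.export.export` can traverse a `StaticCache` argument like any other pytree input instead of failing on an opaque object.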
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [X] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39931/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39931/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39930 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39930/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39930/comments | https://api.github.com/repos/huggingface/transformers/issues/39930/events | https://github.com/huggingface/transformers/pull/39930 | 3,293,801,561 | PR_kwDOCUB6oc6iPCKT | 39,930 | Add missing special token properties to MistralCommonTokenizer | {
"login": "hqkqn32",
"id": 97386924,
"node_id": "U_kgDOBc4BrA",
"avatar_url": "https://avatars.githubusercontent.com/u/97386924?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hqkqn32",
"html_url": "https://github.com/hqkqn32",
"followers_url": "https://api.github.com/users/hqkqn32/followers",
"following_url": "https://api.github.com/users/hqkqn32/following{/other_user}",
"gists_url": "https://api.github.com/users/hqkqn32/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hqkqn32/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hqkqn32/subscriptions",
"organizations_url": "https://api.github.com/users/hqkqn32/orgs",
"repos_url": "https://api.github.com/users/hqkqn32/repos",
"events_url": "https://api.github.com/users/hqkqn32/events{/privacy}",
"received_events_url": "https://api.github.com/users/hqkqn32/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-05T17:00:54 | 2025-10-23T14:42:56 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39930",
"html_url": "https://github.com/huggingface/transformers/pull/39930",
"diff_url": "https://github.com/huggingface/transformers/pull/39930.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39930.patch",
"merged_at": null
} | - Add all_special_ids property for vLLM compatibility
- Add all_special_tokens property
- Add all_special_tokens_extended property
- Fixes AttributeError when using with vLLM
# What does this PR do?
Fixes AttributeError when using `MistralCommonTokenizer` with vLLM by adding missing special token properties.
vLLM expects tokenizers to expose the `all_special_tokens`, `all_special_ids`, and `all_special_tokens_extended` properties, but `MistralCommonTokenizer` was missing them, causing:
```
AttributeError: 'MistralCommonTokenizer' object has no attribute 'all_special_tokens'
```
Fixes #39841
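The shape of the three properties vLLM looks up can be sketched as follows; the internal `_special_tokens` mapping is hypothetical and only stands in for however the real tokenizer stores its special tokens:

```python
# Sketch of the three read-only properties vLLM expects on a tokenizer.

class TokenizerSketch:
    def __init__(self):
        # hypothetical internal store: token string -> token id
        self._special_tokens = {"<s>": 1, "</s>": 2, "<unk>": 0}

    @property
    def all_special_tokens(self):
        return list(self._special_tokens)

    @property
    def all_special_ids(self):
        return [self._special_tokens[t] for t in self._special_tokens]

    @property
    def all_special_tokens_extended(self):
        # without AddedToken wrappers the two lists coincide
        return self.all_special_tokens

tok = TokenizerSketch()
```

Exposing these as properties (rather than methods) matters, since vLLM accesses them as attributes.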
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [X] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [X] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39930/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39930/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39929 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39929/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39929/comments | https://api.github.com/repos/huggingface/transformers/issues/39929/events | https://github.com/huggingface/transformers/pull/39929 | 3,293,754,442 | PR_kwDOCUB6oc6iO3zO | 39,929 | [CI] post-`GptOss` fixes for green CI | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] | closed | false | null | [] | null | [] | 2025-08-05T16:47:07 | 2025-08-07T12:57:46 | 2025-08-05T18:04:59 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39929",
"html_url": "https://github.com/huggingface/transformers/pull/39929",
"diff_url": "https://github.com/huggingface/transformers/pull/39929.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39929.patch",
"merged_at": "2025-08-05T18:04:59"
} | # What does this PR do?
- breaks copy on MoE loss (losses were updated in #39923, discussed on slack)
- renames `OpenAIMoE` -> `GptOss` in the model docs
- a few other minor nits to make CI happy | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39929/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 2,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39929/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39928 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39928/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39928/comments | https://api.github.com/repos/huggingface/transformers/issues/39928/events | https://github.com/huggingface/transformers/pull/39928 | 3,293,743,810 | PR_kwDOCUB6oc6iO1eJ | 39,928 | Fix hidden torchvision>=0.15 dependency issue | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T16:43:57 | 2025-08-13T15:13:43 | 2025-08-13T15:13:42 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39928",
"html_url": "https://github.com/huggingface/transformers/pull/39928",
"diff_url": "https://github.com/huggingface/transformers/pull/39928.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39928.patch",
"merged_at": "2025-08-13T15:13:42"
} | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/39907 | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39928/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39928/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39927 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39927/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39927/comments | https://api.github.com/repos/huggingface/transformers/issues/39927/events | https://github.com/huggingface/transformers/pull/39927 | 3,293,713,957 | PR_kwDOCUB6oc6iOvBH | 39,927 | [docs] ko toc fix | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T16:33:24 | 2025-08-06T11:07:38 | 2025-08-06T10:12:34 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39927",
"html_url": "https://github.com/huggingface/transformers/pull/39927",
"diff_url": "https://github.com/huggingface/transformers/pull/39927.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39927.patch",
"merged_at": "2025-08-06T10:12:34"
} | # What does this PR do?
Restores green CI status after merging #39535
👉 check docs locally: `doc-builder build transformers docs/source/ko/ --language ko --clean`
👉 check docs in CI: comment `build-doc` | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39927/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39927/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39926 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39926/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39926/comments | https://api.github.com/repos/huggingface/transformers/issues/39926/events | https://github.com/huggingface/transformers/pull/39926 | 3,293,704,055 | PR_kwDOCUB6oc6iOs3D | 39,926 | remove `triton_kernels` dep with `kernels` instead | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] | closed | false | null | [] | null | [] | 2025-08-05T16:30:03 | 2025-08-08T10:27:27 | 2025-08-06T17:31:20 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39926",
"html_url": "https://github.com/huggingface/transformers/pull/39926",
"diff_url": "https://github.com/huggingface/transformers/pull/39926.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39926.patch",
"merged_at": "2025-08-06T17:31:20"
# What does this PR do?
This PR removes the `triton_kernels` dependency, using `kernels` instead, when running the new OpenAI model gpt-oss. | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39926/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39926/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39925 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39925/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39925/comments | https://api.github.com/repos/huggingface/transformers/issues/39925/events | https://github.com/huggingface/transformers/pull/39925 | 3,293,630,746 | PR_kwDOCUB6oc6iOcy_ | 39,925 | gpt_oss last chat template changes | {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T16:07:43 | 2025-08-05T16:08:57 | 2025-08-05T16:08:08 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39925",
"html_url": "https://github.com/huggingface/transformers/pull/39925",
"diff_url": "https://github.com/huggingface/transformers/pull/39925.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39925.patch",
"merged_at": "2025-08-05T16:08:08"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39925/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39925/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39924 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39924/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39924/comments | https://api.github.com/repos/huggingface/transformers/issues/39924/events | https://github.com/huggingface/transformers/pull/39924 | 3,293,622,380 | PR_kwDOCUB6oc6iOa7l | 39,924 | Add chat template tests | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T16:05:23 | 2025-08-12T15:33:42 | 2025-08-12T15:33:42 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39924",
"html_url": "https://github.com/huggingface/transformers/pull/39924",
"diff_url": "https://github.com/huggingface/transformers/pull/39924.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39924.patch",
"merged_at": null
} | Add tests | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39924/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39924/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39923 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39923/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39923/comments | https://api.github.com/repos/huggingface/transformers/issues/39923/events | https://github.com/huggingface/transformers/pull/39923 | 3,293,608,972 | PR_kwDOCUB6oc6iOX8L | 39,923 | Add GPT OSS model from OpenAI | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
},
{
"id": 2627272588,
"node_id": "MDU6TGFiZWwyNjI3MjcyNTg4",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Model%20Parallel",
"name": "Model Parallel",
"color": "8B66A5",
"default": false,
"description": "Model Parallelilsm Implementations"
},
{
"id": 6202871275,
"node_id": "LA_kwDOCUB6oc8AAAABcbhN6w",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Flash%20Attention",
"name": "Flash Attention",
"color": "201FF8",
"default": false,
"description": ""
},
{
"id": 7510456769,
"node_id": "LA_kwDOCUB6oc8AAAABv6h5wQ",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Mixture%20of%20Experts",
"name": "Mixture of Experts",
"color": "DDB5D0",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-08-05T16:01:44 | 2025-08-05T16:50:00 | 2025-08-05T16:02:18 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39923",
"html_url": "https://github.com/huggingface/transformers/pull/39923",
"diff_url": "https://github.com/huggingface/transformers/pull/39923.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39923.patch",
"merged_at": "2025-08-05T16:02:18"
} | ADD THE MODEL!!!!! | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39923/reactions",
"total_count": 14,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 12,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 2
} | https://api.github.com/repos/huggingface/transformers/issues/39923/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39922 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39922/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39922/comments | https://api.github.com/repos/huggingface/transformers/issues/39922/events | https://github.com/huggingface/transformers/pull/39922 | 3,293,597,149 | PR_kwDOCUB6oc6iOVjQ | 39,922 | 🌐 [i18n-KO] Translated `attention_interface.md` to Korean | {
"login": "songi104",
"id": 121084916,
"node_id": "U_kgDOBzeb9A",
"avatar_url": "https://avatars.githubusercontent.com/u/121084916?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/songi104",
"html_url": "https://github.com/songi104",
"followers_url": "https://api.github.com/users/songi104/followers",
"following_url": "https://api.github.com/users/songi104/following{/other_user}",
"gists_url": "https://api.github.com/users/songi104/gists{/gist_id}",
"starred_url": "https://api.github.com/users/songi104/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/songi104/subscriptions",
"organizations_url": "https://api.github.com/users/songi104/orgs",
"repos_url": "https://api.github.com/users/songi104/repos",
"events_url": "https://api.github.com/users/songi104/events{/privacy}",
"received_events_url": "https://api.github.com/users/songi104/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-05T15:55:31 | 2025-08-12T05:16:33 | null | NONE | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39922",
"html_url": "https://github.com/huggingface/transformers/pull/39922",
"diff_url": "https://github.com/huggingface/transformers/pull/39922.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39922.patch",
"merged_at": null
} | <!-- Please title the PR "🌐 [i18n-KO] Translated `<your_file>.md` to Korean" -->
# What does this PR do?
Translated the `attention_interface.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Only after all the checks above are complete, reveal the comment below to request a review from KREW members! -->
May you please review this PR?
<!-- @jungnerd, @luckyvickyricky, @chelsseeey, @skwh54, @amo33, @maximizemaxwell, @D15M4S -->
<!-- @harheem, @nsbg, @Youngdong2, @xhaktm00, @ssunbear, @ChoHyoungSeo, @judy-choi -->
<!-- @4N3MONE, @Kim-Ju-won, @ahnjj, @FacerAin, @ssum21, @TaskerJang, @HyunZ118 -->
@yijun-lee, @songi104, @chhaewxn, @AhnJoonSung, @jihyun-0611, @seopp, @pyapyapya
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. Reveal the comment below after the KREW members' review is finished! -->
<!-- @stevhliu May you please review this PR? --> | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39922/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39922/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39921 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39921/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39921/comments | https://api.github.com/repos/huggingface/transformers/issues/39921/events | https://github.com/huggingface/transformers/issues/39921 | 3,293,398,290 | I_kwDOCUB6oc7ETUUS | 39,921 | [Gemma3N] Not able to add new special tokens to model/tokenizer due to projection error | {
"login": "debasisdwivedy",
"id": 20119757,
"node_id": "MDQ6VXNlcjIwMTE5NzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/20119757?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/debasisdwivedy",
"html_url": "https://github.com/debasisdwivedy",
"followers_url": "https://api.github.com/users/debasisdwivedy/followers",
"following_url": "https://api.github.com/users/debasisdwivedy/following{/other_user}",
"gists_url": "https://api.github.com/users/debasisdwivedy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/debasisdwivedy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/debasisdwivedy/subscriptions",
"organizations_url": "https://api.github.com/users/debasisdwivedy/orgs",
"repos_url": "https://api.github.com/users/debasisdwivedy/repos",
"events_url": "https://api.github.com/users/debasisdwivedy/events{/privacy}",
"received_events_url": "https://api.github.com/users/debasisdwivedy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1834081910,
"node_id": "MDU6TGFiZWwxODM0MDgxOTEw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Usage",
"name": "Usage",
"color": "e28436",
"default": false,
"description": "General questions about the library"
},
{
"id": 2392046359,
"node_id": "MDU6TGFiZWwyMzkyMDQ2MzU5",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20Second%20Issue",
"name": "Good Second Issue",
"color": "dd935a",
"default": false,
"description": "Issues that are more difficult to do than \"Good First\" issues - give it a try if you want!"
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | open | false | null | [] | null | [] | 2025-08-05T14:43:37 | 2025-08-19T19:37:39 | null | NONE | null | null | null | null | ### System Info
```
- transformers==4.54.1
- Platform: Linux-5.15.0-1084-aws-x86_64-with-glibc2.31
- Python version: 3.13
- TRL version: 0.19.1
- Huggingface_hub version: 0.33.4
- Safetensors version: 0.5.3
- Accelerate version: 1.9.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1+cu126 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA H100 80GB HBM3
```
Hi,
The Transformers model class for `gemma-3n` has issues, shown below (stack trace pasted):
```
trainer.train()
~~~~~~~~~~~~~^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/trainer.py", line 2237, in train
return inner_training_loop(
args=args,
...<2 lines>...
ignore_keys_for_eval=ignore_keys_for_eval,
)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/trainer.py", line 2578, in _inner_training_loop
tr_loss_step = self.training_step(model, inputs, num_items_in_batch)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/trl/trainer/sft_trainer.py", line 914, in training_step
return super().training_step(*args, **kwargs)
~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/trainer.py", line 3792, in training_step
loss = self.compute_loss(model, inputs, num_items_in_batch=num_items_in_batch)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/trl/trainer/sft_trainer.py", line 868, in compute_loss
(loss, outputs) = super().compute_loss(
~~~~~~~~~~~~~~~~~~~~^
model, inputs, return_outputs=True, num_items_in_batch=num_items_in_batch
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/trainer.py", line 3879, in compute_loss
outputs = model(**inputs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/accelerate/utils/operations.py", line 818, in forward
return model_forward(*args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/accelerate/utils/operations.py", line 806, in __call__
return convert_to_fp32(self.model_forward(*args, **kwargs))
~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/amp/autocast_mode.py", line 44, in decorate_autocast
return func(*args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/peft/peft_model.py", line 1850, in forward
return self.base_model(
~~~~~~~~~~~~~~~^
input_ids=input_ids,
^^^^^^^^^^^^^^^^^^^^
...<6 lines>...
**kwargs,
^^^^^^^^^
)
^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/peft/tuners/tuners_utils.py", line 222, in forward
return self.model.forward(*args, **kwargs)
~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/utils/generic.py", line 961, in wrapper
output = func(self, *args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/models/gemma3n/modeling_gemma3n.py", line 2276, in forward
outputs = self.model(
input_ids=input_ids,
...<14 lines>...
**lm_kwargs,
)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/utils/generic.py", line 961, in wrapper
output = func(self, *args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/models/gemma3n/modeling_gemma3n.py", line 2115, in forward
outputs = self.language_model(
input_ids=None,
...<10 lines>...
**lm_kwargs,
)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/utils/generic.py", line 961, in wrapper
output = func(self, *args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/models/gemma3n/modeling_gemma3n.py", line 1608, in forward
per_layer_inputs = self.project_per_layer_inputs(inputs_embeds, per_layer_inputs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/models/gemma3n/modeling_gemma3n.py", line 1733, in project_per_layer_inputs
per_layer_projection: torch.Tensor = self.per_layer_model_projection(inputs_embeds)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/linear.py", line 125, in forward
return F.linear(input, self.weight, self.bias)
~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: CUDA error: CUBLAS_STATUS_INTERNAL_ERROR when calling `cublasGemmEx( handle, opa, opb, m, n, k, &falpha, a, CUDA_R_16BF, lda, b, CUDA_R_16BF, ldb, &fbeta, c, CUDA_R_16BF, ldc, compute_type, CUBLAS_GEMM_DEFAULT_TENSOR_OP)`
```
The path to the file: **.venv/lib/python3.12/site-packages/transformers/models/gemma3n/modeling_gemma3n.py**
Below is the code:
```
import os

import torch
from datasets import load_dataset
from peft import LoraConfig, TaskType
from transformers import AutoTokenizer, TrainingArguments
from transformers.models.gemma3n import Gemma3nForCausalLM, Gemma3nForConditionalGeneration
from trl import SFTTrainer
# 1. Load the model and tokenizer
model = Gemma3nForCausalLM.from_pretrained(
model_name,
attn_implementation='eager',
token=os.getenv("HUGGINGFACE_TOKEN"),
device_map='cuda',
torch_dtype=torch.bfloat16
)
# Use AutoTokenizer to add special tokens
tokenizer = AutoTokenizer.from_pretrained(
model_name,
pad_token="<pad>",
eos_token="<eos>",
additional_special_tokens = ["<special1>", "<special2>"]
)
model.resize_token_embeddings(len(tokenizer))
print(f"Model embeddings resized to {len(tokenizer)}")
peft_config = LoraConfig(
r=32,
lora_alpha=64,
lora_dropout=0.05,
target_modules=["q_proj", "v_proj"], # Or other LoRA target modules
modules_to_save=["embed_tokens", "lm_head"], # This is the key
task_type=TaskType.CAUSAL_LM
)
dataset = load_dataset(dataset_name)
training_arguments = TrainingArguments(
output_dir="./your_output_dir",
per_device_train_batch_size=4,
num_train_epochs=3,
# Other arguments
)
trainer = SFTTrainer(
model=model,
args=training_arguments,
train_dataset=dataset["train"],
eval_dataset=dataset["test"],
tokenizer=tokenizer, # Use the standard tokenizer argument
peft_config=peft_config,
)
trainer.train()
```
`modeling_gemma3n.py` has several issues here: after resizing the token embeddings, the input embeddings can no longer be projected to the per-layer inputs.
The stack trace with details is included above.
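A minimal, plain-Python sketch of the suspected failure mode (an illustrative assumption, not the actual Gemma3n code): `resize_token_embeddings()` grows the main embedding table, but a second per-layer lookup table is left at the old vocabulary size, so newly added token ids index past its end. On CUDA, such out-of-range lookups can surface as opaque kernel errors like the `CUBLAS_STATUS_INTERNAL_ERROR` above.

```python
# Illustrative sketch only (plain Python lists stand in for embedding tables;
# names like per_layer_table are hypothetical, not real Gemma3n attributes).
OLD_VOCAB = 8    # stand-in for the original vocab size
NEW_TOKENS = 2   # e.g. "<special1>", "<special2>"

main_table = [[0.0] * 4 for _ in range(OLD_VOCAB)]
per_layer_table = [[0.0] * 4 for _ in range(OLD_VOCAB)]  # never resized

# resize_token_embeddings() effectively grows only the main table
main_table.extend([[0.0] * 4 for _ in range(NEW_TOKENS)])

new_token_id = OLD_VOCAB          # id of the first newly added token
row = main_table[new_token_id]    # main lookup succeeds

try:
    per_layer_table[new_token_id]  # indexes past the stale table
    raised = False
except IndexError:
    raised = True
print(raised)  # on GPU this class of bug surfaces as an opaque CUDA error
```

If this is the cause, a workaround would be to verify that every vocab-sized module in the model was actually resized before training.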
Regards
@ArthurZucker @zach-huggingface @SunMarc
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Please use the code below:
```
import os

import torch
from datasets import load_dataset
from peft import LoraConfig, TaskType
from transformers import AutoTokenizer, TrainingArguments
from transformers.models.gemma3n import Gemma3nForCausalLM, Gemma3nForConditionalGeneration
from trl import SFTTrainer
# 1. Load the model and tokenizer
model = Gemma3nForCausalLM.from_pretrained(
model_name,
attn_implementation='eager',
token=os.getenv("HUGGINGFACE_TOKEN"),
device_map='cuda',
torch_dtype=torch.bfloat16
)
# Use AutoTokenizer to add special tokens
tokenizer = AutoTokenizer.from_pretrained(
model_name,
pad_token="<pad>",
eos_token="<eos>",
additional_special_tokens = ["<special1>", "<special2>"]
)
model.resize_token_embeddings(len(tokenizer))
print(f"Model embeddings resized to {len(tokenizer)}")
peft_config = LoraConfig(
r=32,
lora_alpha=64,
lora_dropout=0.05,
target_modules=["q_proj", "v_proj"], # Or other LoRA target modules
modules_to_save=["embed_tokens", "lm_head"], # This is the key
task_type=TaskType.CAUSAL_LM
)
dataset = load_dataset(dataset_name)
training_arguments = TrainingArguments(
output_dir="./your_output_dir",
per_device_train_batch_size=4,
num_train_epochs=3,
# Other arguments
)
trainer = SFTTrainer(
model=model,
args=training_arguments,
train_dataset=dataset["train"],
eval_dataset=dataset["test"],
tokenizer=tokenizer, # Use the standard tokenizer argument
peft_config=peft_config,
)
trainer.train()
```
The issue is with `transformers/models/gemma3n/modeling_gemma3n.py`
### Expected behavior
The trainer should be able to train the model, but instead I get the exception below:
```
trainer.train()
~~~~~~~~~~~~~^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/trainer.py", line 2237, in train
return inner_training_loop(
args=args,
...<2 lines>...
ignore_keys_for_eval=ignore_keys_for_eval,
)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/trainer.py", line 2578, in _inner_training_loop
tr_loss_step = self.training_step(model, inputs, num_items_in_batch)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/trl/trainer/sft_trainer.py", line 914, in training_step
return super().training_step(*args, **kwargs)
~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/trainer.py", line 3792, in training_step
loss = self.compute_loss(model, inputs, num_items_in_batch=num_items_in_batch)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/trl/trainer/sft_trainer.py", line 868, in compute_loss
(loss, outputs) = super().compute_loss(
~~~~~~~~~~~~~~~~~~~~^
model, inputs, return_outputs=True, num_items_in_batch=num_items_in_batch
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/trainer.py", line 3879, in compute_loss
outputs = model(**inputs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/accelerate/utils/operations.py", line 818, in forward
return model_forward(*args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/accelerate/utils/operations.py", line 806, in __call__
return convert_to_fp32(self.model_forward(*args, **kwargs))
~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/amp/autocast_mode.py", line 44, in decorate_autocast
return func(*args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/peft/peft_model.py", line 1850, in forward
return self.base_model(
~~~~~~~~~~~~~~~^
input_ids=input_ids,
^^^^^^^^^^^^^^^^^^^^
...<6 lines>...
**kwargs,
^^^^^^^^^
)
^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/peft/tuners/tuners_utils.py", line 222, in forward
return self.model.forward(*args, **kwargs)
~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/utils/generic.py", line 961, in wrapper
output = func(self, *args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/models/gemma3n/modeling_gemma3n.py", line 2276, in forward
outputs = self.model(
input_ids=input_ids,
...<14 lines>...
**lm_kwargs,
)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/utils/generic.py", line 961, in wrapper
output = func(self, *args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/models/gemma3n/modeling_gemma3n.py", line 2115, in forward
outputs = self.language_model(
input_ids=None,
...<10 lines>...
**lm_kwargs,
)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/utils/generic.py", line 961, in wrapper
output = func(self, *args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/models/gemma3n/modeling_gemma3n.py", line 1608, in forward
per_layer_inputs = self.project_per_layer_inputs(inputs_embeds, per_layer_inputs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/transformers/models/gemma3n/modeling_gemma3n.py", line 1733, in project_per_layer_inputs
per_layer_projection: torch.Tensor = self.per_layer_model_projection(inputs_embeds)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
File "/teamspace/studios/this_studio/.venv/lib/python3.13/site-packages/torch/nn/modules/linear.py", line 125, in forward
return F.linear(input, self.weight, self.bias)
~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: CUDA error: CUBLAS_STATUS_INTERNAL_ERROR when calling `cublasGemmEx( handle, opa, opb, m, n, k, &falpha, a, CUDA_R_16BF, lda, b, CUDA_R_16BF, ldb, &fbeta, c, CUDA_R_16BF, ldc, compute_type, CUBLAS_GEMM_DEFAULT_TENSOR_OP)
```
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39921/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39921/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/39920 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39920/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39920/comments | https://api.github.com/repos/huggingface/transformers/issues/39920/events | https://github.com/huggingface/transformers/pull/39920 | 3,293,321,647 | PR_kwDOCUB6oc6iNbLL | 39,920 | 🌐 [i18n-KO] Updated ko/perf_train_special.md | {
"login": "D15M4S",
"id": 122260287,
"node_id": "U_kgDOB0mLPw",
"avatar_url": "https://avatars.githubusercontent.com/u/122260287?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/D15M4S",
"html_url": "https://github.com/D15M4S",
"followers_url": "https://api.github.com/users/D15M4S/followers",
"following_url": "https://api.github.com/users/D15M4S/following{/other_user}",
"gists_url": "https://api.github.com/users/D15M4S/gists{/gist_id}",
"starred_url": "https://api.github.com/users/D15M4S/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/D15M4S/subscriptions",
"organizations_url": "https://api.github.com/users/D15M4S/orgs",
"repos_url": "https://api.github.com/users/D15M4S/repos",
"events_url": "https://api.github.com/users/D15M4S/events{/privacy}",
"received_events_url": "https://api.github.com/users/D15M4S/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-05T14:20:51 | 2025-10-14T06:22:49 | null | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39920",
"html_url": "https://github.com/huggingface/transformers/pull/39920",
"diff_url": "https://github.com/huggingface/transformers/pull/39920.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39920.patch",
"merged_at": null
} | <!-- Please keep the PR title as "🌐 [i18n-KO] Updated ko/perf_train_special.md" -->
# What does this PR do?
Updated the perf_train_special.md file in the documentation with a Korean translation.
Thank you in advance for your review.
Part of #20179
Update of #34590
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Only after all the checks above are complete, reveal the comment below to request a review from the KREW team members! -->
May you please review this PR?
@jungnerd, @luckyvickyricky, @chelsseeey, @skwh54, @amo33, @maximizemaxwell
<!-- @harheem, @nsbg, @Youngdong2, @xhaktm00, @ssunbear, @ChoHyoungSeo, @judy-choi -->
<!-- @4N3MONE, @Kim-Ju-won, @ahnjj, @FacerAin, @ssum21, @TaskerJang, @HyunZ118 -->
<!-- @yijun-lee, @songi104, @chhaewxn, @AhnJoonSung, @jihyun-0611, @seopp, @pyapyapya -->
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. After the KREW team members' review is finished, reveal the comment below! -->
<!-- @stevhliu May you please review this PR? --> | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39920/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39920/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39919 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39919/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39919/comments | https://api.github.com/repos/huggingface/transformers/issues/39919/events | https://github.com/huggingface/transformers/pull/39919 | 3,293,319,498 | PR_kwDOCUB6oc6iNatT | 39,919 | Fix gemma3n feature extractor's incorrect squeeze | {
"login": "Isotr0py",
"id": 41363108,
"node_id": "MDQ6VXNlcjQxMzYzMTA4",
"avatar_url": "https://avatars.githubusercontent.com/u/41363108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Isotr0py",
"html_url": "https://github.com/Isotr0py",
"followers_url": "https://api.github.com/users/Isotr0py/followers",
"following_url": "https://api.github.com/users/Isotr0py/following{/other_user}",
"gists_url": "https://api.github.com/users/Isotr0py/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Isotr0py/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Isotr0py/subscriptions",
"organizations_url": "https://api.github.com/users/Isotr0py/orgs",
"repos_url": "https://api.github.com/users/Isotr0py/repos",
"events_url": "https://api.github.com/users/Isotr0py/events{/privacy}",
"received_events_url": "https://api.github.com/users/Isotr0py/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T14:20:13 | 2025-08-07T10:34:33 | 2025-08-07T10:34:28 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39919",
"html_url": "https://github.com/huggingface/transformers/pull/39919",
"diff_url": "https://github.com/huggingface/transformers/pull/39919.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39919.patch",
"merged_at": "2025-08-07T10:34:28"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes #39911
- Only squeeze batch_size dimension for `mel_spectrogram` in gemma3n processor
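The failure mode here is the classic full-`squeeze()` pitfall: calling `.squeeze()` with no arguments removes *every* size-1 axis, so a singleton feature axis can vanish along with the batch axis. A minimal sketch of squeezing only the batch dimension, operating on shape tuples only (a hypothetical helper for illustration, not the processor's actual code):

```python
def squeeze_only(shape, dim):
    # Remove only the given axis if it has size 1, leaving any other
    # size-1 axes (e.g. a singleton channel axis) intact.
    if shape[dim] == 1:
        return shape[:dim] + shape[dim + 1:]
    return shape

# A full .squeeze() on (1, 1, 128) would yield (128,) and drop the
# channel axis; squeezing only dim 0 keeps it.
batched_shape = (1, 1, 128)
squeezed_shape = squeeze_only(batched_shape, 0)
```

In tensor code this corresponds to `tensor.squeeze(0)` rather than `tensor.squeeze()`.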
TODO:
- [ ] Add test
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
cc @NickLucche Can you check if this fix works on your gemma3n PR? Thanks!
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Isotr0py",
"id": 41363108,
"node_id": "MDQ6VXNlcjQxMzYzMTA4",
"avatar_url": "https://avatars.githubusercontent.com/u/41363108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Isotr0py",
"html_url": "https://github.com/Isotr0py",
"followers_url": "https://api.github.com/users/Isotr0py/followers",
"following_url": "https://api.github.com/users/Isotr0py/following{/other_user}",
"gists_url": "https://api.github.com/users/Isotr0py/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Isotr0py/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Isotr0py/subscriptions",
"organizations_url": "https://api.github.com/users/Isotr0py/orgs",
"repos_url": "https://api.github.com/users/Isotr0py/repos",
"events_url": "https://api.github.com/users/Isotr0py/events{/privacy}",
"received_events_url": "https://api.github.com/users/Isotr0py/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39919/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39919/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39918 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39918/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39918/comments | https://api.github.com/repos/huggingface/transformers/issues/39918/events | https://github.com/huggingface/transformers/pull/39918 | 3,293,267,576 | PR_kwDOCUB6oc6iNPjB | 39,918 | Avoid `utils/check_bad_commit.py` failing due to rate limit (requesting `api.github.com`) | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T14:05:36 | 2025-08-05T19:52:22 | 2025-08-05T19:52:20 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39918",
"html_url": "https://github.com/huggingface/transformers/pull/39918",
"diff_url": "https://github.com/huggingface/transformers/pull/39918.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39918.patch",
"merged_at": "2025-08-05T19:52:20"
} | # What does this PR do?
[#39885](https://github.com/huggingface/transformers/pull/39885) introduced ~70 new failing tests, and `check_bad_commit.py` failed, most likely due to the GitHub API rate limit; see
https://github.com/huggingface/transformers/actions/runs/16739364088/job/47406733586
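The per-commit caching idea can be sketched as follows (the cache file layout and helper names are illustrative assumptions, not the PR's actual code):

```python
import json
import os

def load_cache(path):
    """Load previously stored api.github.com responses, if any."""
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    return {}

def save_cache(cache, path):
    """Persist the responses so later runs can skip the API calls."""
    with open(path, "w") as f:
        json.dump(cache, f)

def lookup_commit(sha, cache, fetch):
    """Return the API result for `sha`, calling `fetch` only on a cache miss."""
    if sha not in cache:
        cache[sha] = fetch(sha)
    return cache[sha]
```

Because each commit is fetched at most once per run, repeated bisection steps over the same commits no longer count against the rate limit.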
This PR stores the results of `api.github.com` requests and reuses them if a commit is already in the cache. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39918/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39918/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39917 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39917/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39917/comments | https://api.github.com/repos/huggingface/transformers/issues/39917/events | https://github.com/huggingface/transformers/pull/39917 | 3,293,136,849 | PR_kwDOCUB6oc6iMy3t | 39,917 | 🌐 [i18n-KO] Updated ko/perf_train_cpu.md | {
"login": "D15M4S",
"id": 122260287,
"node_id": "U_kgDOB0mLPw",
"avatar_url": "https://avatars.githubusercontent.com/u/122260287?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/D15M4S",
"html_url": "https://github.com/D15M4S",
"followers_url": "https://api.github.com/users/D15M4S/followers",
"following_url": "https://api.github.com/users/D15M4S/following{/other_user}",
"gists_url": "https://api.github.com/users/D15M4S/gists{/gist_id}",
"starred_url": "https://api.github.com/users/D15M4S/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/D15M4S/subscriptions",
"organizations_url": "https://api.github.com/users/D15M4S/orgs",
"repos_url": "https://api.github.com/users/D15M4S/repos",
"events_url": "https://api.github.com/users/D15M4S/events{/privacy}",
"received_events_url": "https://api.github.com/users/D15M4S/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-05T13:31:28 | 2025-08-09T04:52:53 | null | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39917",
"html_url": "https://github.com/huggingface/transformers/pull/39917",
"diff_url": "https://github.com/huggingface/transformers/pull/39917.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39917.patch",
"merged_at": null
} | <!-- Please keep the PR title as "🌐 [i18n-KO] Updated ko/perf_train_cpu.md" -->
# What does this PR do?
Updated the perf_train_cpu.md file in the documentation with a Korean translation.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
Update of #24911
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Only after all the checks above are complete, reveal the comment below to request a review from the KREW team members! -->
May you please review this PR?
@jungnerd, @luckyvickyricky, @chelsseeey, @skwh54, @amo33, @maximizemaxwell
<!-- @harheem, @nsbg, @Youngdong2, @xhaktm00, @ssunbear, @ChoHyoungSeo, @judy-choi -->
<!-- @4N3MONE, @Kim-Ju-won, @ahnjj, @FacerAin, @ssum21, @TaskerJang, @HyunZ118 -->
<!-- @yijun-lee, @songi104, @chhaewxn, @AhnJoonSung, @jihyun-0611, @seopp, @pyapyapya -->
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. After the KREW team members' review is finished, reveal the comment below! -->
<!-- @stevhliu May you please review this PR? -->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39917/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39917/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39916 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39916/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39916/comments | https://api.github.com/repos/huggingface/transformers/issues/39916/events | https://github.com/huggingface/transformers/issues/39916 | 3,292,956,450 | I_kwDOCUB6oc7ERoci | 39,916 | When using batch_eval_metrics, inputs are not gathered from different device, which is wrong behavior | {
"login": "yuanyifang",
"id": 20179859,
"node_id": "MDQ6VXNlcjIwMTc5ODU5",
"avatar_url": "https://avatars.githubusercontent.com/u/20179859?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yuanyifang",
"html_url": "https://github.com/yuanyifang",
"followers_url": "https://api.github.com/users/yuanyifang/followers",
"following_url": "https://api.github.com/users/yuanyifang/following{/other_user}",
"gists_url": "https://api.github.com/users/yuanyifang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yuanyifang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yuanyifang/subscriptions",
"organizations_url": "https://api.github.com/users/yuanyifang/orgs",
"repos_url": "https://api.github.com/users/yuanyifang/repos",
"events_url": "https://api.github.com/users/yuanyifang/events{/privacy}",
"received_events_url": "https://api.github.com/users/yuanyifang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-05T12:37:40 | 2025-10-12T08:03:12 | 2025-10-12T08:03:12 | NONE | null | null | null | null | ### System Info
transformers version: 4.51.2
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
1. Configure `batch_eval_metrics`
2. Train a model across multiple nodes
3. Write a custom evaluation metric that uses `inputs`
4. Print the `inputs`: they contain only the current process's inputs, not the gathered inputs
```python
if inputs_decode is not None:
    inputs_decode = self.accelerator.pad_across_processes(inputs_decode, dim=1, pad_index=-100)
    inputs_decode = self.gather_function(inputs_decode)
    if not self.args.batch_eval_metrics or description == "Prediction":
        all_inputs.add(inputs_decode)
# .........
if self.args.batch_eval_metrics:
    if self.compute_metrics is not None and logits is not None and labels is not None:
        is_last_step = self.accelerator.gradient_state.end_of_dataloader
        batch_kwargs = {}
        batch_kwargs["losses"] = losses if "loss" in args.include_for_metrics else None
        batch_kwargs["inputs"] = inputs if "inputs" in args.include_for_metrics else None
        metrics = self.compute_metrics(
            EvalPrediction(predictions=logits, label_ids=labels, **batch_kwargs),
            compute_result=is_last_step,
        )
        del losses, logits, labels, inputs
        torch.cuda.empty_cache()
```
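The gathering this issue asks for can be sketched with plain Python; the helper below is a hypothetical stand-in for `accelerator.gather_for_metrics`, not the Trainer's actual implementation:

```python
def gather_across_processes(per_process_batches):
    # Hypothetical stand-in for accelerator.gather_for_metrics:
    # flatten the per-process batches so the metric sees every
    # example from every device exactly once.
    gathered = []
    for batch in per_process_batches:
        gathered.extend(batch)
    return gathered

# Two devices each see a local batch of two examples; the metric
# should receive all four, not just the local two.
local_batches = [["ex0", "ex1"], ["ex2", "ex3"]]
all_inputs = gather_across_processes(local_batches)
```

With `batch_eval_metrics` enabled, the `inputs` passed to `compute_metrics` skip this gathering step, which is the behavior this issue reports.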
### Expected behavior
EvalPrediction should contain the inputs and logits gathered from all training processes | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39916/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39916/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39915 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39915/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39915/comments | https://api.github.com/repos/huggingface/transformers/issues/39915/events | https://github.com/huggingface/transformers/pull/39915 | 3,292,916,306 | PR_kwDOCUB6oc6iMDWX | 39,915 | Fix broken image inference for Fuyu model | {
"login": "Isotr0py",
"id": 41363108,
"node_id": "MDQ6VXNlcjQxMzYzMTA4",
"avatar_url": "https://avatars.githubusercontent.com/u/41363108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Isotr0py",
"html_url": "https://github.com/Isotr0py",
"followers_url": "https://api.github.com/users/Isotr0py/followers",
"following_url": "https://api.github.com/users/Isotr0py/following{/other_user}",
"gists_url": "https://api.github.com/users/Isotr0py/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Isotr0py/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Isotr0py/subscriptions",
"organizations_url": "https://api.github.com/users/Isotr0py/orgs",
"repos_url": "https://api.github.com/users/Isotr0py/repos",
"events_url": "https://api.github.com/users/Isotr0py/events{/privacy}",
"received_events_url": "https://api.github.com/users/Isotr0py/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] | closed | false | null | [] | null | [] | 2025-08-05T12:24:54 | 2025-08-08T07:34:43 | 2025-08-08T07:21:50 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39915",
"html_url": "https://github.com/huggingface/transformers/pull/39915",
"diff_url": "https://github.com/huggingface/transformers/pull/39915.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39915.patch",
"merged_at": "2025-08-08T07:21:50"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
When updating the Transformers version in vLLM CI, we found that Fuyu's image inference was broken, producing gibberish output:
https://buildkite.com/vllm/ci/builds/25473#01985c4d-d267-408b-87f5-5d77ae09d90c
- This PR fixes the broken Fuyu model to make sure the `image_patches` image features are handled correctly.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Isotr0py",
"id": 41363108,
"node_id": "MDQ6VXNlcjQxMzYzMTA4",
"avatar_url": "https://avatars.githubusercontent.com/u/41363108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Isotr0py",
"html_url": "https://github.com/Isotr0py",
"followers_url": "https://api.github.com/users/Isotr0py/followers",
"following_url": "https://api.github.com/users/Isotr0py/following{/other_user}",
"gists_url": "https://api.github.com/users/Isotr0py/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Isotr0py/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Isotr0py/subscriptions",
"organizations_url": "https://api.github.com/users/Isotr0py/orgs",
"repos_url": "https://api.github.com/users/Isotr0py/repos",
"events_url": "https://api.github.com/users/Isotr0py/events{/privacy}",
"received_events_url": "https://api.github.com/users/Isotr0py/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39915/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39915/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39914 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39914/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39914/comments | https://api.github.com/repos/huggingface/transformers/issues/39914/events | https://github.com/huggingface/transformers/issues/39914 | 3,292,855,618 | I_kwDOCUB6oc7ERP1C | 39,914 | Idefics 3: shape mismatch: value tensor of shape [256, 576] cannot be broadcast to indexing result of shape [192, 576] | {
"login": "ezyzzy",
"id": 80557893,
"node_id": "MDQ6VXNlcjgwNTU3ODkz",
"avatar_url": "https://avatars.githubusercontent.com/u/80557893?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ezyzzy",
"html_url": "https://github.com/ezyzzy",
"followers_url": "https://api.github.com/users/ezyzzy/followers",
"following_url": "https://api.github.com/users/ezyzzy/following{/other_user}",
"gists_url": "https://api.github.com/users/ezyzzy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ezyzzy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ezyzzy/subscriptions",
"organizations_url": "https://api.github.com/users/ezyzzy/orgs",
"repos_url": "https://api.github.com/users/ezyzzy/repos",
"events_url": "https://api.github.com/users/ezyzzy/events{/privacy}",
"received_events_url": "https://api.github.com/users/ezyzzy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-05T12:06:24 | 2025-09-13T08:02:27 | 2025-09-13T08:02:27 | NONE | null | null | null | null | ### System Info
Hello everyone, I encountered an error while fine-tuning SmolVLM-256M on my custom dataset. I noticed a similar issue was reported for Idefics 2 (https://github.com/huggingface/transformers/issues/31380), but the suggestions there did not resolve this problem on Idefics 3.
Could anyone please help me with this?
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
```
----> 2 trainer.train()

File /usr/local/lib/python3.11/dist-packages/transformers/trainer.py:2245, in Trainer.train(self, resume_from_checkpoint, trial, ignore_keys_for_eval, **kwargs)
   2243     hf_hub_utils.enable_progress_bars()
   2244 else:
-> 2245     return inner_training_loop(
   2246         args=args,
   2247         resume_from_checkpoint=resume_from_checkpoint,
   2248         trial=trial,
   2249         ignore_keys_for_eval=ignore_keys_for_eval,
   2250     )

File /usr/local/lib/python3.11/dist-packages/transformers/trainer.py:2560, in Trainer._inner_training_loop(self, batch_size, args, resume_from_checkpoint, trial, ignore_keys_for_eval)
   2553 context = (
   2554     functools.partial(self.accelerator.no_sync, model=model)
   2555     if i != len(batch_samples) - 1
   2556     and self.accelerator.distributed_type != DistributedType.DEEPSPEED
   2557     else contextlib.nullcontext
   2558 )
...

File /usr/local/lib/python3.11/dist-packages/transformers/models/idefics3/modeling_idefics3.py:819, in Idefics3Model.inputs_merger
    818     reshaped_image_hidden_states = reshaped_image_hidden_states.to(inputs_embeds.device, inputs_embeds.dtype)
--> 819     new_inputs_embeds[special_image_token_mask] = reshaped_image_hidden_states
    820 return new_inputs_embeds

RuntimeError: shape mismatch: value tensor of shape [256, 576] cannot be broadcast to indexing result of shape [192, 576]
```
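For context, the failing line assigns the image-feature rows into the positions selected by a boolean mask, and the error reproduces in isolation whenever the number of `True` positions (image placeholder tokens) differs from the number of feature rows. A toy reproduction with the shapes from the traceback (NumPy raises the analogous `ValueError` for the same mismatch):

```python
import numpy as np

seq_len, hidden = 256, 576
inputs_embeds = np.zeros((seq_len, hidden))

# 192 image-placeholder positions in the tokenized text ...
special_image_token_mask = np.zeros(seq_len, dtype=bool)
special_image_token_mask[:192] = True

# ... but the vision tower produced 256 feature rows.
reshaped_image_hidden_states = np.ones((256, hidden))

try:
    inputs_embeds[special_image_token_mask] = reshaped_image_hidden_states
except ValueError as e:  # NumPy's analogue of the torch RuntimeError above
    print(e)
```

This suggests the placeholder-token count produced by the processor and the feature count produced by the vision tower went out of sync for this batch, rather than a bug in the assignment itself.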
### Expected behavior
The fine-tuning run should complete without the shape-mismatch error. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39914/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39914/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39913 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39913/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39913/comments | https://api.github.com/repos/huggingface/transformers/issues/39913/events | https://github.com/huggingface/transformers/pull/39913 | 3,292,766,052 | PR_kwDOCUB6oc6iLjRA | 39,913 | 🌐 [i18n-KO] Translated `tiny_agents.md` to Korean | {
"login": "AhnJoonSung",
"id": 53860803,
"node_id": "MDQ6VXNlcjUzODYwODAz",
"avatar_url": "https://avatars.githubusercontent.com/u/53860803?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/AhnJoonSung",
"html_url": "https://github.com/AhnJoonSung",
"followers_url": "https://api.github.com/users/AhnJoonSung/followers",
"following_url": "https://api.github.com/users/AhnJoonSung/following{/other_user}",
"gists_url": "https://api.github.com/users/AhnJoonSung/gists{/gist_id}",
"starred_url": "https://api.github.com/users/AhnJoonSung/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/AhnJoonSung/subscriptions",
"organizations_url": "https://api.github.com/users/AhnJoonSung/orgs",
"repos_url": "https://api.github.com/users/AhnJoonSung/repos",
"events_url": "https://api.github.com/users/AhnJoonSung/events{/privacy}",
"received_events_url": "https://api.github.com/users/AhnJoonSung/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T11:39:41 | 2025-08-13T05:54:17 | 2025-08-13T05:54:16 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39913",
"html_url": "https://github.com/huggingface/transformers/pull/39913",
"diff_url": "https://github.com/huggingface/transformers/pull/39913.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39913.patch",
"merged_at": "2025-08-13T05:54:16"
} | # What does this PR do?
Translated the `tiny_agents.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Please reveal the comment below requesting a review from KREW team members only after all the checks above are complete! -->
May you please review this PR?
<!-- @jungnerd, @luckyvickyricky, @chelsseeey, @skwh54, @amo33, @maximizemaxwell, @D15M4S -->
<!-- @harheem, @nsbg, @Youngdong2, @xhaktm00, @ssunbear, @ChoHyoungSeo, @judy-choi -->
<!-- @4N3MONE, @Kim-Ju-won, @ahnjj, @FacerAin, @ssum21, @TaskerJang, @HyunZ118 -->
@yijun-lee, @songi104, @chhaewxn, @jihyun-0611, @seopp, @pyapyapya
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review? (Final)
@stevhliu May you please review this PR? | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39913/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39913/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39912 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39912/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39912/comments | https://api.github.com/repos/huggingface/transformers/issues/39912/events | https://github.com/huggingface/transformers/pull/39912 | 3,292,588,606 | PR_kwDOCUB6oc6iK8d2 | 39,912 | Revert "remove dtensors, not explicit (#39840)" | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T10:44:56 | 2025-08-05T13:12:16 | 2025-08-05T13:12:14 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39912",
"html_url": "https://github.com/huggingface/transformers/pull/39912",
"diff_url": "https://github.com/huggingface/transformers/pull/39912.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39912.patch",
"merged_at": "2025-08-05T13:12:14"
This did not work with generation (`lm_head` needs extra care!). This reverts commit 6dfd561d9cd722dfc09f702355518c6d09b9b4e3.
cc @SunMarc cc @matej-svejda | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39912/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39912/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39911 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39911/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39911/comments | https://api.github.com/repos/huggingface/transformers/issues/39911/events | https://github.com/huggingface/transformers/issues/39911 | 3,292,565,173 | I_kwDOCUB6oc7EQI61 | 39,911 | [Gemma3N] Audio processing issue | {
"login": "NickLucche",
"id": 10706289,
"node_id": "MDQ6VXNlcjEwNzA2Mjg5",
"avatar_url": "https://avatars.githubusercontent.com/u/10706289?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NickLucche",
"html_url": "https://github.com/NickLucche",
"followers_url": "https://api.github.com/users/NickLucche/followers",
"following_url": "https://api.github.com/users/NickLucche/following{/other_user}",
"gists_url": "https://api.github.com/users/NickLucche/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NickLucche/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NickLucche/subscriptions",
"organizations_url": "https://api.github.com/users/NickLucche/orgs",
"repos_url": "https://api.github.com/users/NickLucche/repos",
"events_url": "https://api.github.com/users/NickLucche/events{/privacy}",
"received_events_url": "https://api.github.com/users/NickLucche/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-05T10:37:49 | 2025-08-12T12:08:13 | 2025-08-07T10:34:29 | NONE | null | null | null | null | ### System Info
```
- `transformers` version: 4.53.2
- Platform: Linux-5.15.0-113-generic-x86_64-with-glibc2.35
- Python version: 3.12.11
- Huggingface_hub version: 0.33.4
- Safetensors version: 0.5.3
- Accelerate version: 1.9.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1+cu126 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA H100 80GB HBM3
```
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Hey, I am working on integrating Gemma3N on vLLM here https://github.com/vllm-project/vllm/pull/20495.
We're testing out a few random shapes for the audio inputs in our tests, and we found inconsistencies between the returned `input_features` and `input_features_mask` shapes, which we currently use to "unpad".
Here's a script for reproducing the issue
```python
from transformers import AutoProcessor
import numpy as np
processor = AutoProcessor.from_pretrained("google/gemma-3n-E2B-it")
prompt = "<audio_soft_token>"
for i in range(64, 2048):
# eg audio = np.random.randn(588,)
audio = np.random.randn(i,)
mm_data = {
"audio": [audio],
"sampling_rate": 16000,
}
inputs = processor(
text=prompt,
**mm_data,
return_tensors="pt"
)
out = inputs["input_features"]
mask = inputs["input_features_mask"]
if out.shape[:2] != mask.shape[:2]:
print("Audio length", i)
print(out.shape)
print(mask.shape)
```
And here are a few outputs from the scan run:
```
Audio length 513
torch.Size([1, 128])
torch.Size([1, 4])
Audio length 514
torch.Size([1, 128])
torch.Size([1, 4])
...
Audio length 639
torch.Size([1, 128])
torch.Size([1, 4])
Audio length 640
torch.Size([1, 128])
torch.Size([1, 4])
```
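Until the processor itself is fixed, a defensive shape check on the caller's side (a sketch of what we do in vLLM's ingestion path, not part of the `AutoProcessor` API) makes the inconsistency fail loudly up front instead of deep inside unpadding:

```python
import numpy as np

def check_audio_batch(input_features, input_features_mask):
    """Fail fast if the features and their mask disagree on (batch, frames).

    Defensive client-side sketch; works on anything with a .shape attribute.
    """
    feat, mask = tuple(input_features.shape[:2]), tuple(input_features_mask.shape[:2])
    if feat != mask:
        raise ValueError(f"feature/mask shape mismatch: {feat} vs {mask}")

# Consistent shapes pass silently ...
check_audio_batch(np.zeros((1, 4, 128)), np.zeros((1, 4)))

# ... while the inconsistent case from the scan above fails loudly.
try:
    check_audio_batch(np.zeros((1, 128)), np.zeros((1, 4)))
    raised = False
except ValueError:
    raised = True
assert raised
```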
cc @DarkLight1337 @hmellor
### Expected behavior
If you run the same script for a few other sizes:
```
Audio length 64
torch.Size([1, 0, 128])
torch.Size([1, 0])
Audio length 128
torch.Size([1, 0, 128])
torch.Size([1, 0])
Audio length 256
torch.Size([1, 0, 128])
torch.Size([1, 0])
Audio length 512
torch.Size([1, 0, 128])
torch.Size([1, 0])
Audio length 1024
torch.Size([1, 4, 128])
torch.Size([1, 4])
Audio length 2048
torch.Size([1, 10, 128])
torch.Size([1, 10])
``` | {
"login": "Isotr0py",
"id": 41363108,
"node_id": "MDQ6VXNlcjQxMzYzMTA4",
"avatar_url": "https://avatars.githubusercontent.com/u/41363108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Isotr0py",
"html_url": "https://github.com/Isotr0py",
"followers_url": "https://api.github.com/users/Isotr0py/followers",
"following_url": "https://api.github.com/users/Isotr0py/following{/other_user}",
"gists_url": "https://api.github.com/users/Isotr0py/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Isotr0py/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Isotr0py/subscriptions",
"organizations_url": "https://api.github.com/users/Isotr0py/orgs",
"repos_url": "https://api.github.com/users/Isotr0py/repos",
"events_url": "https://api.github.com/users/Isotr0py/events{/privacy}",
"received_events_url": "https://api.github.com/users/Isotr0py/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39911/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39911/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39910 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39910/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39910/comments | https://api.github.com/repos/huggingface/transformers/issues/39910/events | https://github.com/huggingface/transformers/issues/39910 | 3,292,503,367 | I_kwDOCUB6oc7EP51H | 39,910 | Question: Llama4 weight reshaping | {
"login": "gskorokhod",
"id": 64529579,
"node_id": "MDQ6VXNlcjY0NTI5NTc5",
"avatar_url": "https://avatars.githubusercontent.com/u/64529579?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gskorokhod",
"html_url": "https://github.com/gskorokhod",
"followers_url": "https://api.github.com/users/gskorokhod/followers",
"following_url": "https://api.github.com/users/gskorokhod/following{/other_user}",
"gists_url": "https://api.github.com/users/gskorokhod/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gskorokhod/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gskorokhod/subscriptions",
"organizations_url": "https://api.github.com/users/gskorokhod/orgs",
"repos_url": "https://api.github.com/users/gskorokhod/repos",
"events_url": "https://api.github.com/users/gskorokhod/events{/privacy}",
"received_events_url": "https://api.github.com/users/gskorokhod/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T10:19:25 | 2025-08-13T09:35:52 | 2025-08-13T09:35:52 | NONE | null | null | null | null | Hi all
I am trying to extract the original Llama4 MoE weights, specifically:
- `experts.w1` (aka `experts.moe_w_in_eD_F`)
- `experts.w3` (aka `experts.moe_w_swiglu_eD_F`)
I need both of these in the shape `[E, D, N]`, where:
- E is the number of experts (16 for Scout)
- D is the embedding dimension (5120)
- N is the intermediate dimension (8192)
I tried just splitting `experts.gate_up_proj` in half along the last dimension to get w1 and w3, but although the dimensions match, the model is outputting nonsense, so I assume the actual order of the weights is wrong.
Could someone help me make sense of this snippet (from `convert_llama4_weights_to_hf`)?
Why is this hard-coded indexing/reshaping being done, and do you have any suggestions for how to get the original weights back?
```python
elif re.search(r"(gate|up)_proj", new_key):
path = new_key.split(".")
gate_key = re.sub(r"(gate|up)_proj", lambda m: "gate_proj", new_key)
up_key = re.sub(r"(gate|up)_proj", lambda m: "up_proj", new_key)
if gate_key == new_key:
state_dict[new_key] = torch.cat(current_parameter, dim=concat_dim)
elif new_key == up_key:
if "experts" not in new_key:
state_dict[new_key] = torch.cat(current_parameter, dim=concat_dim)
else:
# gate_proj = moe_w_in_eD_F = w1
gate_proj = state_dict.pop(gate_key)
gate_proj = [
gate_proj.reshape(num_experts, -1, 8, 1024)[:, :, k, :].reshape(num_experts, -1, 1024)
for k in range(8)
]
gate_proj = torch.cat(gate_proj, dim=-1)
# up_proj = moe_w_swiglu_eD_F = w3
up_proj = [
k.reshape(num_experts, -1, 8, 1024).reshape(num_experts, -1, 1024)
for k in current_parameter
]
up_proj = torch.cat(up_proj, dim=-1)
gate_up_proj = torch.cat((gate_proj, up_proj), dim=-1)
new_key = new_key.replace("up_proj", "gate_up_proj")
state_dict[new_key] = gate_up_proj.contiguous()
tqdm.write(f"Processing: {key.ljust(50)} ->\t {new_key}, {state_dict[new_key].shape}")
```
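For what it's worth, here is a toy illustration (made-up sizes `E=2, D=4, N=8`, not Scout's `[16, 5120, 8192]`, and a hypothetical interleaved layout) of why a split that yields the right shapes can still return scrambled weights, and what the inverse of an interleave looks like:

```python
import numpy as np

E, D, N = 2, 4, 8  # toy sizes only
w1 = np.arange(E * D * N, dtype=np.float32).reshape(E, D, N)
w3 = -w1 - 1.0  # distinguishable second half

# Layout A: plain concatenation along the last dim -> a simple split recovers w1/w3.
gate_up_cat = np.concatenate([w1, w3], axis=-1)
a1, a3 = np.split(gate_up_cat, 2, axis=-1)
assert np.array_equal(a1, w1) and np.array_equal(a3, w3)

# Layout B: columns interleaved (as per-shard reshape/cat can produce) ->
# the same split yields the right shape but scrambled values.
gate_up_int = np.stack([w1, w3], axis=-1).reshape(E, D, 2 * N)
b1, b3 = np.split(gate_up_int, 2, axis=-1)
assert b1.shape == w1.shape and not np.array_equal(b1, w1)

# For layout B the inverse is a de-interleave (reshape), not a split.
r = gate_up_int.reshape(E, D, N, 2)
assert np.array_equal(r[..., 0], w1) and np.array_equal(r[..., 1], w3)
```

So if the converter's per-shard reshape/cat left the columns interleaved, recovering the original weights would require undoing that interleave rather than a plain halving — but which layout `gate_up_proj` actually ends up in is exactly the question here.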
Thank you! | {
"login": "gskorokhod",
"id": 64529579,
"node_id": "MDQ6VXNlcjY0NTI5NTc5",
"avatar_url": "https://avatars.githubusercontent.com/u/64529579?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gskorokhod",
"html_url": "https://github.com/gskorokhod",
"followers_url": "https://api.github.com/users/gskorokhod/followers",
"following_url": "https://api.github.com/users/gskorokhod/following{/other_user}",
"gists_url": "https://api.github.com/users/gskorokhod/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gskorokhod/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gskorokhod/subscriptions",
"organizations_url": "https://api.github.com/users/gskorokhod/orgs",
"repos_url": "https://api.github.com/users/gskorokhod/repos",
"events_url": "https://api.github.com/users/gskorokhod/events{/privacy}",
"received_events_url": "https://api.github.com/users/gskorokhod/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39910/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39910/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39909 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39909/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39909/comments | https://api.github.com/repos/huggingface/transformers/issues/39909/events | https://github.com/huggingface/transformers/pull/39909 | 3,292,325,840 | PR_kwDOCUB6oc6iKCcO | 39,909 | Update object_detection.md | {
"login": "ppaanngggg",
"id": 6350479,
"node_id": "MDQ6VXNlcjYzNTA0Nzk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6350479?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ppaanngggg",
"html_url": "https://github.com/ppaanngggg",
"followers_url": "https://api.github.com/users/ppaanngggg/followers",
"following_url": "https://api.github.com/users/ppaanngggg/following{/other_user}",
"gists_url": "https://api.github.com/users/ppaanngggg/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ppaanngggg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ppaanngggg/subscriptions",
"organizations_url": "https://api.github.com/users/ppaanngggg/orgs",
"repos_url": "https://api.github.com/users/ppaanngggg/repos",
"events_url": "https://api.github.com/users/ppaanngggg/events{/privacy}",
"received_events_url": "https://api.github.com/users/ppaanngggg/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T09:31:39 | 2025-08-05T14:07:52 | 2025-08-05T14:07:21 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39909",
"html_url": "https://github.com/huggingface/transformers/pull/39909",
"diff_url": "https://github.com/huggingface/transformers/pull/39909.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39909.patch",
"merged_at": "2025-08-05T14:07:21"
} | # What does this PR do?
Fixes a code snippet error in the object detection docs.
## Before submitting
- [✓] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [✓] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [✓] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@stevhliu | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39909/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39909/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39908 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39908/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39908/comments | https://api.github.com/repos/huggingface/transformers/issues/39908/events | https://github.com/huggingface/transformers/pull/39908 | 3,292,194,041 | PR_kwDOCUB6oc6iJlv0 | 39,908 | Update dynamic attnt setter for multimodals | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T08:51:53 | 2025-08-14T19:46:14 | 2025-08-14T19:46:14 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39908",
"html_url": "https://github.com/huggingface/transformers/pull/39908",
"diff_url": "https://github.com/huggingface/transformers/pull/39908.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39908.patch",
"merged_at": "2025-08-14T19:46:14"
} | # What does this PR do?
Dynamic attention setting currently doesn't work for multimodal models, nor for any other code where the attention layer is imported rather than defined in the model file. This PR updates the heuristic to also check whether an attention layer exists in the model code.
Also changed the warning raised when the requested attention implementation wasn't found into a `ValueError`. Most people don't read warnings, and raising an explicit error is more useful.
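A minimal sketch of the kind of heuristic described above (the helper names `module_defines_attention` and `set_attn_implementation` are illustrative, not the actual Transformers internals): scan the model's module namespace for an attention class before allowing a dynamic swap, and raise a `ValueError` instead of warning.

```python
import types

def module_defines_attention(module: types.ModuleType) -> bool:
    """Return True if the module defines (or imports) a class named *Attention."""
    return any(
        isinstance(obj, type) and obj.__name__.endswith("Attention")
        for obj in vars(module).values()
    )

def set_attn_implementation(module: types.ModuleType, requested: str, available: set[str]) -> str:
    # Raise an explicit error instead of a warning most people never read.
    if requested not in available:
        raise ValueError(
            f"Requested attention implementation '{requested}' was not found. "
            f"Available: {sorted(available)}"
        )
    if not module_defines_attention(module):
        raise ValueError("No attention layer found in the model code.")
    return requested

# Build a toy "model module" to exercise the heuristic.
toy = types.ModuleType("toy_model")
toy.LlamaAttention = type("LlamaAttention", (), {})

print(set_attn_implementation(toy, "sdpa", {"eager", "sdpa"}))  # → sdpa
```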
The tests are now running on 36 multimodal model classes and are all green | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39908/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39908/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39907 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39907/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39907/comments | https://api.github.com/repos/huggingface/transformers/issues/39907/events | https://github.com/huggingface/transformers/issues/39907 | 3,292,016,495 | I_kwDOCUB6oc7EOC9v | 39,907 | Hidden torchvision>=0.19.0 dependency results in quiet import failures of e.g. PreTrainedModel | {
"login": "tomaarsen",
"id": 37621491,
"node_id": "MDQ6VXNlcjM3NjIxNDkx",
"avatar_url": "https://avatars.githubusercontent.com/u/37621491?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tomaarsen",
"html_url": "https://github.com/tomaarsen",
"followers_url": "https://api.github.com/users/tomaarsen/followers",
"following_url": "https://api.github.com/users/tomaarsen/following{/other_user}",
"gists_url": "https://api.github.com/users/tomaarsen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tomaarsen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tomaarsen/subscriptions",
"organizations_url": "https://api.github.com/users/tomaarsen/orgs",
"repos_url": "https://api.github.com/users/tomaarsen/repos",
"events_url": "https://api.github.com/users/tomaarsen/events{/privacy}",
"received_events_url": "https://api.github.com/users/tomaarsen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | [] | 2025-08-05T07:53:54 | 2025-08-13T15:13:43 | 2025-08-13T15:13:43 | MEMBER | null | null | null | null | Hello!
The changes from #37055 introduce a dependency on `torchvision>=0.19.0`, as they rely on `InterpolationMode.NEAREST_EXACT` from `torchvision.transforms.InterpolationMode`, whereas previously older versions worked. This interpolation mode was added in https://github.com/pytorch/vision/pull/6754, which was first released in 0.19.0.
Even worse: when using older versions of `torchvision`, importing parts of `transformers` will quietly (!!!) fail, e.g.:
```python
from transformers import PreTrainedModel
```
```
ImportError: cannot import name 'PreTrainedModel' from 'transformers'
```
The underlying cause only becomes apparent when importing via the full path:
```python
from transformers.modeling_utils import PreTrainedModel
```
```
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "[sic]\lib\site-packages\transformers\modeling_utils.py", line 74, in <module>
from .loss.loss_utils import LOSS_MAPPING
File "[sic]\lib\site-packages\transformers\loss\loss_utils.py", line 21, in <module>
from .loss_d_fine import DFineForObjectDetectionLoss
File "[sic]\lib\site-packages\transformers\loss\loss_d_fine.py", line 21, in <module>
from .loss_for_object_detection import (
File "[sic]\lib\site-packages\transformers\loss\loss_for_object_detection.py", line 32, in <module>
from transformers.image_transforms import center_to_corners_format
File "[sic]\lib\site-packages\transformers\image_transforms.py", line 22, in <module>
from .image_utils import (
File "[sic]\lib\site-packages\transformers\image_utils.py", line 62, in <module>
PILImageResampling.NEAREST: InterpolationMode.NEAREST_EXACT,
File "C:\Users\tom\.conda\envs\setfit\lib\enum.py", line 429, in __getattr__
raise AttributeError(name) from None
AttributeError: NEAREST_EXACT
```
I'm fine with upgrading my `torchvision`, but perhaps the minimum version should be enforced (via `setup.py`), and we should really be wary of hidden import errors like this.
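A hedged sketch of the kind of guard that would surface this problem loudly at import time instead of failing quietly later (the helper names `parse_version` and `require_min_version` are illustrative, not what Transformers actually uses):

```python
def parse_version(v: str) -> tuple[int, ...]:
    # Keep only the leading numeric components ("0.19.0+cu121" -> (0, 19, 0)).
    parts = []
    for piece in v.split("+")[0].split("."):
        if not piece.isdigit():
            break
        parts.append(int(piece))
    return tuple(parts)

def require_min_version(installed: str, minimum: str, package: str) -> None:
    if parse_version(installed) < parse_version(minimum):
        raise ImportError(
            f"{package}>={minimum} is required (found {installed}). "
            f"Please upgrade: pip install -U {package}"
        )

require_min_version("0.19.0", "0.19.0", "torchvision")  # passes silently
```

Called once at module import, this turns a confusing `AttributeError` deep in `image_utils.py` into an actionable error message.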
cc @yonigozlan @zshn25 @Yann-CV
cc @ArthurZucker
- Tom Aarsen
_Originally posted by @tomaarsen in https://github.com/huggingface/transformers/issues/37055#issuecomment-3153954171_
| {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39907/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39907/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39906 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39906/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39906/comments | https://api.github.com/repos/huggingface/transformers/issues/39906/events | https://github.com/huggingface/transformers/pull/39906 | 3,291,883,994 | PR_kwDOCUB6oc6iIjTm | 39,906 | [`Exaone4`] Fixes the attn implementation! | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T07:13:37 | 2025-08-05T07:29:17 | 2025-08-05T07:29:16 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39906",
"html_url": "https://github.com/huggingface/transformers/pull/39906",
"diff_url": "https://github.com/huggingface/transformers/pull/39906.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39906.patch",
"merged_at": "2025-08-05T07:29:16"
} | # What does this PR do?
There was a typo in the config!
Should supersede #39698
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39906/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39906/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39905 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39905/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39905/comments | https://api.github.com/repos/huggingface/transformers/issues/39905/events | https://github.com/huggingface/transformers/pull/39905 | 3,291,786,219 | PR_kwDOCUB6oc6iIOS- | 39,905 | run model debugging with forward arg | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T06:36:25 | 2025-08-05T13:46:21 | 2025-08-05T13:46:19 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39905",
"html_url": "https://github.com/huggingface/transformers/pull/39905",
"diff_url": "https://github.com/huggingface/transformers/pull/39905.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39905.patch",
"merged_at": "2025-08-05T13:46:19"
} | # What does this PR do?
Integrates the amazing model_debugging utils directly into all models that have `check_model_inputs`: you only need to pass `debug_io`, `debug_io_dir`, and `prune_layers`.
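A rough sketch of how a forward-pass IO recorder could be wired through such a decorator (the keyword names `debug_io`/`debug_io_dir` follow the PR description; the decorator body itself is an assumption, not the actual Transformers implementation):

```python
import functools
import json
import tempfile
from pathlib import Path

def check_model_inputs(forward):
    """Toy decorator: optionally dump a forward call's inputs/outputs to disk."""
    @functools.wraps(forward)
    def wrapper(*args, debug_io=False, debug_io_dir=None, **kwargs):
        out = forward(*args, **kwargs)
        if debug_io:
            record = {
                "inputs": {"args": repr(args), "kwargs": repr(kwargs)},
                "output": repr(out),
            }
            path = Path(debug_io_dir or tempfile.gettempdir()) / "forward_io.json"
            path.write_text(json.dumps(record, indent=2))
        return out
    return wrapper

@check_model_inputs
def forward(x):
    return x * 2

forward(3, debug_io=True)  # returns 6 and writes forward_io.json
```

Because the debugging arguments are consumed by the wrapper, the wrapped `forward` itself stays untouched — callers opt in per call.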
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39905/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39905/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39904 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39904/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39904/comments | https://api.github.com/repos/huggingface/transformers/issues/39904/events | https://github.com/huggingface/transformers/pull/39904 | 3,291,625,770 | PR_kwDOCUB6oc6iHrMo | 39,904 | docs: fix typo in 'quantization-aware training' | {
"login": "luckyvickyricky",
"id": 75977640,
"node_id": "MDQ6VXNlcjc1OTc3NjQw",
"avatar_url": "https://avatars.githubusercontent.com/u/75977640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/luckyvickyricky",
"html_url": "https://github.com/luckyvickyricky",
"followers_url": "https://api.github.com/users/luckyvickyricky/followers",
"following_url": "https://api.github.com/users/luckyvickyricky/following{/other_user}",
"gists_url": "https://api.github.com/users/luckyvickyricky/gists{/gist_id}",
"starred_url": "https://api.github.com/users/luckyvickyricky/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/luckyvickyricky/subscriptions",
"organizations_url": "https://api.github.com/users/luckyvickyricky/orgs",
"repos_url": "https://api.github.com/users/luckyvickyricky/repos",
"events_url": "https://api.github.com/users/luckyvickyricky/events{/privacy}",
"received_events_url": "https://api.github.com/users/luckyvickyricky/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T05:31:22 | 2025-08-06T14:53:30 | 2025-08-06T14:52:43 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39904",
"html_url": "https://github.com/huggingface/transformers/pull/39904",
"diff_url": "https://github.com/huggingface/transformers/pull/39904.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39904.patch",
"merged_at": "2025-08-06T14:52:43"
} | # What does this PR do?
This PR fixes a minor typo in the documentation:
- "quantization-aware trainin" → "quantization-aware training"
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@jungnerd, @chelsseeey, @skwh54, @amo33, @maximizemaxwell, @D15M4S
<!--
Documentation: @stevhliu
-->
Once the translation crew members listed above have checked and approved this PR, I will mention maintainer for final review and merging.
Thank you!
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39904/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39904/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39903 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39903/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39903/comments | https://api.github.com/repos/huggingface/transformers/issues/39903/events | https://github.com/huggingface/transformers/pull/39903 | 3,291,613,498 | PR_kwDOCUB6oc6iHoiY | 39,903 | 🌐 [i18n-KO] Translated clipseg.md to Korean | {
"login": "HyunZ118",
"id": 156191095,
"node_id": "U_kgDOCU9Jdw",
"avatar_url": "https://avatars.githubusercontent.com/u/156191095?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HyunZ118",
"html_url": "https://github.com/HyunZ118",
"followers_url": "https://api.github.com/users/HyunZ118/followers",
"following_url": "https://api.github.com/users/HyunZ118/following{/other_user}",
"gists_url": "https://api.github.com/users/HyunZ118/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HyunZ118/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HyunZ118/subscriptions",
"organizations_url": "https://api.github.com/users/HyunZ118/orgs",
"repos_url": "https://api.github.com/users/HyunZ118/repos",
"events_url": "https://api.github.com/users/HyunZ118/events{/privacy}",
"received_events_url": "https://api.github.com/users/HyunZ118/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T05:24:42 | 2025-09-12T00:07:24 | 2025-09-12T00:07:24 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39903",
"html_url": "https://github.com/huggingface/transformers/pull/39903",
"diff_url": "https://github.com/huggingface/transformers/pull/39903.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39903.patch",
"merged_at": "2025-09-12T00:07:24"
} | # What does this PR do?
Translated the clipseg.md file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
May you please review this PR?
@4N3MONE, @Kim-Ju-won, @ahnjj, @FacerAin, @ssum21, @TaskerJang
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. KREW 팀원들의 리뷰가 끝난 후에 아래 주석을 노출해주세요! -->
<!-- @stevhliu May you please review this PR? --> | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39903/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39903/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39902 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39902/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39902/comments | https://api.github.com/repos/huggingface/transformers/issues/39902/events | https://github.com/huggingface/transformers/pull/39902 | 3,291,541,487 | PR_kwDOCUB6oc6iHZHU | 39,902 | chore: update Deformable_Detr model card | {
"login": "arpon-kapuria",
"id": 83688431,
"node_id": "MDQ6VXNlcjgzNjg4NDMx",
"avatar_url": "https://avatars.githubusercontent.com/u/83688431?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/arpon-kapuria",
"html_url": "https://github.com/arpon-kapuria",
"followers_url": "https://api.github.com/users/arpon-kapuria/followers",
"following_url": "https://api.github.com/users/arpon-kapuria/following{/other_user}",
"gists_url": "https://api.github.com/users/arpon-kapuria/gists{/gist_id}",
"starred_url": "https://api.github.com/users/arpon-kapuria/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/arpon-kapuria/subscriptions",
"organizations_url": "https://api.github.com/users/arpon-kapuria/orgs",
"repos_url": "https://api.github.com/users/arpon-kapuria/repos",
"events_url": "https://api.github.com/users/arpon-kapuria/events{/privacy}",
"received_events_url": "https://api.github.com/users/arpon-kapuria/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T04:43:58 | 2025-08-06T19:45:14 | 2025-08-06T19:45:14 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39902",
"html_url": "https://github.com/huggingface/transformers/pull/39902",
"diff_url": "https://github.com/huggingface/transformers/pull/39902.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39902.patch",
"merged_at": "2025-08-06T19:45:14"
} | # What does this PR do?
This PR improves the model card of Deformable DETR.
## Before submitting
- [x] This PR fixes a typo or improves the docs
## Who can review?
Anyone in the community is free to review the PR once the tests have passed.
@stevhliu
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39902/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39902/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39901 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39901/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39901/comments | https://api.github.com/repos/huggingface/transformers/issues/39901/events | https://github.com/huggingface/transformers/pull/39901 | 3,291,530,293 | PR_kwDOCUB6oc6iHWsr | 39,901 | 🌐 [i18n-KO] Translated `fp_quant` to Korean | {
"login": "maximizemaxwell",
"id": 138701551,
"node_id": "U_kgDOCERq7w",
"avatar_url": "https://avatars.githubusercontent.com/u/138701551?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/maximizemaxwell",
"html_url": "https://github.com/maximizemaxwell",
"followers_url": "https://api.github.com/users/maximizemaxwell/followers",
"following_url": "https://api.github.com/users/maximizemaxwell/following{/other_user}",
"gists_url": "https://api.github.com/users/maximizemaxwell/gists{/gist_id}",
"starred_url": "https://api.github.com/users/maximizemaxwell/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/maximizemaxwell/subscriptions",
"organizations_url": "https://api.github.com/users/maximizemaxwell/orgs",
"repos_url": "https://api.github.com/users/maximizemaxwell/repos",
"events_url": "https://api.github.com/users/maximizemaxwell/events{/privacy}",
"received_events_url": "https://api.github.com/users/maximizemaxwell/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-05T04:36:27 | 2025-10-10T02:04:25 | null | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39901",
"html_url": "https://github.com/huggingface/transformers/pull/39901",
"diff_url": "https://github.com/huggingface/transformers/pull/39901.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39901.patch",
"merged_at": null
} | # What does this PR do?
Translated the `fp_quant.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [X] Check for missing / redundant translations (번역 누락/중복 검사)
- [X] Grammar Check (맞춤법 검사)
- [X] Review or Add new terms to glossary (용어 확인 및 추가)
- [X] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [X] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. 위 체크가 모두 완료된 뒤에만 KREW 팀원들에게 리뷰를 요청하는 아래 주석을 노출해주세요!-->
May you please review this PR?
@jungnerd, @luckyvickyricky, @chelsseeey, @amo33, @skwh54 , @D15M4S
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. KREW 팀원들의 리뷰가 끝난 후에 아래 주석을 노출해주세요! -->
<!-- @stevhliu May you please review this PR? --> | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39901/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39901/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39900 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39900/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39900/comments | https://api.github.com/repos/huggingface/transformers/issues/39900/events | https://github.com/huggingface/transformers/issues/39900 | 3,291,408,282 | I_kwDOCUB6oc7ELuea | 39,900 | Weights not tied when loading `from_pretrained` with a wrapped model | {
"login": "bryant1410",
"id": 3905501,
"node_id": "MDQ6VXNlcjM5MDU1MDE=",
"avatar_url": "https://avatars.githubusercontent.com/u/3905501?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bryant1410",
"html_url": "https://github.com/bryant1410",
"followers_url": "https://api.github.com/users/bryant1410/followers",
"following_url": "https://api.github.com/users/bryant1410/following{/other_user}",
"gists_url": "https://api.github.com/users/bryant1410/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bryant1410/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bryant1410/subscriptions",
"organizations_url": "https://api.github.com/users/bryant1410/orgs",
"repos_url": "https://api.github.com/users/bryant1410/repos",
"events_url": "https://api.github.com/users/bryant1410/events{/privacy}",
"received_events_url": "https://api.github.com/users/bryant1410/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-05T03:22:55 | 2025-08-12T14:21:48 | 2025-08-08T14:03:17 | CONTRIBUTOR | null | null | null | null | ### System Info
- `transformers` version: 4.53.3
- Platform: Linux-5.15.134***-x86_64-with-glibc2.35
- Python version: 3.10.18
- Huggingface_hub version: 0.33.5
- Safetensors version: 0.5.3
- Accelerate version: 1.9.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1+cu126 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
- Using GPU in script?: yes
- GPU type: NVIDIA A100-SXM4-40GB
### Who can help?
@ArthurZucker @Cyrilvallez
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Run:
```python
import transformers
class Config(transformers.PretrainedConfig):
pass
class Model(transformers.PreTrainedModel):
config_class = Config
def __init__(self, config):
super().__init__(config)
self.another_model = transformers.AutoModelForSeq2SeqLM.from_pretrained("t5-base")
a = Model(Config())
print(a.another_model.lm_head.weight)
a.save_pretrained("/tmp/abc")
b = Model.from_pretrained("/tmp/abc")
print(b.another_model.lm_head.weight)
```
and you'll get the output:
```
Parameter containing:
tensor([[ -0.7539, 0.5977, -2.4375, ..., 1.2500, -0.7891, 3.5156],
[ 11.3750, -4.8750, 9.0625, ..., 4.8438, 14.3750, -5.7812],
[-16.6250, 11.0625, -20.8750, ..., 10.6875, 22.2500, 25.0000],
...,
[ 2.2344, 6.7500, -11.0625, ..., -11.3125, 13.5625, 16.6250],
[ 4.2500, 5.1250, -12.2500, ..., -11.9375, 13.5000, 17.0000],
[ 4.0625, 6.9688, -12.2500, ..., -11.3750, 11.9375, 16.6250]],
requires_grad=True)
Some weights of T5ForConditionalGeneration were not initialized from the model checkpoint at t5-base and are newly initialized: ['lm_head.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Some weights of Model were not initialized from the model checkpoint at /tmp/abc and are newly initialized: ['another_model.lm_head.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Parameter containing:
tensor([[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 0.],
...,
[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 0.],
[0., 0., 0., ..., 0., 0., 0.]], requires_grad=True)
```
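For intuition, here is a plain-Python sketch of the failure mode (illustrative only — the names mirror T5's tied `shared`/`lm_head` weights, and the real mechanics involve safetensors deduplication and `tie_weights`):

```python
# Illustrative only: "tying" means two state-dict slots reference one array;
# safetensors-style saving deduplicates shared tensors, keeping one canonical key.
embed = [1.0, 2.0, 3.0]
model = {"shared.weight": embed, "lm_head.weight": embed}  # tied: same object

# Save: only the canonical key is written to the checkpoint.
checkpoint = {"shared.weight": list(embed)}

# Load without re-tying: the missing key is "newly initialized" (zeros here).
loaded = {key: checkpoint.get(key, [0.0] * len(embed)) for key in model}
```

In the real case the checkpoint keeps only the canonical copy of a tied weight, so a loader that does not re-tie leaves `lm_head.weight` freshly initialized — matching the zeros printed above.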
### Expected behavior
That the last `print` statement outputs the same as the first one, instead of zeros:
```
tensor([[ -0.7539, 0.5977, -2.4375, ..., 1.2500, -0.7891, 3.5156],
[ 11.3750, -4.8750, 9.0625, ..., 4.8438, 14.3750, -5.7812],
[-16.6250, 11.0625, -20.8750, ..., 10.6875, 22.2500, 25.0000],
...,
[ 2.2344, 6.7500, -11.0625, ..., -11.3125, 13.5625, 16.6250],
[ 4.2500, 5.1250, -12.2500, ..., -11.9375, 13.5000, 17.0000],
[ 4.0625, 6.9688, -12.2500, ..., -11.3750, 11.9375, 16.6250]],
requires_grad=True)
``` | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39900/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39900/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39899 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39899/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39899/comments | https://api.github.com/repos/huggingface/transformers/issues/39899/events | https://github.com/huggingface/transformers/pull/39899 | 3,291,399,169 | PR_kwDOCUB6oc6iG6fc | 39,899 | [model] Support MiniCPM-V 4.0 | {
"login": "tc-mb",
"id": 157115220,
"node_id": "U_kgDOCV1jVA",
"avatar_url": "https://avatars.githubusercontent.com/u/157115220?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tc-mb",
"html_url": "https://github.com/tc-mb",
"followers_url": "https://api.github.com/users/tc-mb/followers",
"following_url": "https://api.github.com/users/tc-mb/following{/other_user}",
"gists_url": "https://api.github.com/users/tc-mb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tc-mb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tc-mb/subscriptions",
"organizations_url": "https://api.github.com/users/tc-mb/orgs",
"repos_url": "https://api.github.com/users/tc-mb/repos",
"events_url": "https://api.github.com/users/tc-mb/events{/privacy}",
"received_events_url": "https://api.github.com/users/tc-mb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-05T03:16:32 | 2025-09-01T03:58:21 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39899",
"html_url": "https://github.com/huggingface/transformers/pull/39899",
"diff_url": "https://github.com/huggingface/transformers/pull/39899.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39899.patch",
"merged_at": null
} | This pull request adds support for the MiniCPM-V 4.0 model, which will be released soon. | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39899/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39899/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39898 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39898/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39898/comments | https://api.github.com/repos/huggingface/transformers/issues/39898/events | https://github.com/huggingface/transformers/pull/39898 | 3,291,159,378 | PR_kwDOCUB6oc6iGHzk | 39,898 | Replace video_fps with fps in tests | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-05T00:39:27 | 2025-08-05T11:49:39 | 2025-08-05T10:39:55 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39898",
"html_url": "https://github.com/huggingface/transformers/pull/39898",
"diff_url": "https://github.com/huggingface/transformers/pull/39898.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39898.patch",
"merged_at": "2025-08-05T10:39:55"
} | # What does this PR do?
Use the new parameter name to reduce test warnings.
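For context, a generic sketch of the kind of deprecation shim that produces such warnings (hypothetical — `sample_frames` is not the actual transformers function):

```python
import warnings

def sample_frames(video, fps=None, **kwargs):
    # Hypothetical shim: accept the deprecated `video_fps` alias,
    # but warn and prefer the new `fps` name.
    if "video_fps" in kwargs:
        warnings.warn("`video_fps` is deprecated, use `fps` instead.", FutureWarning)
        if fps is None:
            fps = kwargs.pop("video_fps")
    return fps
```

Switching the tests to the new keyword avoids triggering this warning path.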
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39898/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39898/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39897 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39897/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39897/comments | https://api.github.com/repos/huggingface/transformers/issues/39897/events | https://github.com/huggingface/transformers/pull/39897 | 3,291,020,539 | PR_kwDOCUB6oc6iFqPi | 39,897 | Modular fix: remove the model name in `find_file_type` | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-04T23:00:12 | 2025-08-06T23:31:08 | 2025-08-06T23:31:08 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39897",
"html_url": "https://github.com/huggingface/transformers/pull/39897",
"diff_url": "https://github.com/huggingface/transformers/pull/39897.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39897.patch",
"merged_at": "2025-08-06T23:31:08"
} | # What does this PR do?
Remove the model name in `find_file_type` in `modular_model_converter.py` as it can cause issues when the module name is ambiguous.
For example, in [this PR](https://github.com/huggingface/transformers/pull/32317), the model name is sam2_video (Sam2Video), and I want to use modular for its processor, Sam2VideoProcessor. The current modular converter associates this module with the video processor file because its name contains VideoProcessor.
The fix removes the "Sam2Video" part, clearing the ambiguity.
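A minimal sketch of the idea (hypothetical names — `find_file_type` here is a simplified stand-in, not the actual `modular_model_converter.py` code): strip the CamelCase model name from the class name before suffix matching, so the remaining suffix is unambiguous.

```python
# Longest, most specific suffixes first so they win over plain "Processor".
_FILE_TYPES = [
    ("VideoProcessor", "video_processing"),
    ("ImageProcessor", "image_processing"),
    ("Processor", "processing"),
    ("Config", "configuration"),
]

def find_file_type(class_name: str, model_name: str) -> str:
    # Build the CamelCase model prefix, e.g. "sam2_video" -> "Sam2Video",
    # and remove it before matching file-type suffixes.
    camel = "".join(part.capitalize() for part in model_name.split("_"))
    stripped = class_name[len(camel):] if class_name.startswith(camel) else class_name
    for suffix, file_type in _FILE_TYPES:
        if suffix in stripped:
            return file_type
    return "modeling"
```

With the prefix stripped, `Sam2VideoProcessor` resolves to `processing` instead of being misclassified as `video_processing`.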
Tested on main with `python utils/check_modular_conversion.py --fix_and_overwrite --check_all` and it doesn't break any other model!
Cc @Cyrilvallez | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39897/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39897/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39896 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39896/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39896/comments | https://api.github.com/repos/huggingface/transformers/issues/39896/events | https://github.com/huggingface/transformers/issues/39896 | 3,290,778,020 | I_kwDOCUB6oc7EJUmk | 39,896 | v4.54.1 average_tokens_across_devices=True would cause "ValueError: Tensors must be CUDA and dense" when gathering num_items_in_batch | {
"login": "chiquitita-101",
"id": 19623404,
"node_id": "MDQ6VXNlcjE5NjIzNDA0",
"avatar_url": "https://avatars.githubusercontent.com/u/19623404?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chiquitita-101",
"html_url": "https://github.com/chiquitita-101",
"followers_url": "https://api.github.com/users/chiquitita-101/followers",
"following_url": "https://api.github.com/users/chiquitita-101/following{/other_user}",
"gists_url": "https://api.github.com/users/chiquitita-101/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chiquitita-101/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chiquitita-101/subscriptions",
"organizations_url": "https://api.github.com/users/chiquitita-101/orgs",
"repos_url": "https://api.github.com/users/chiquitita-101/repos",
"events_url": "https://api.github.com/users/chiquitita-101/events{/privacy}",
"received_events_url": "https://api.github.com/users/chiquitita-101/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-04T21:04:01 | 2025-09-10T16:49:44 | 2025-09-10T16:49:44 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.54.1
- Platform: Linux-5.4.144-16.el7pie-x86_64-with-glibc2.35
- Python version: 3.12.5
- Huggingface_hub version: 0.34.3
- Safetensors version: 0.5.3
- Accelerate version: 1.9.0
- Accelerate config: not found
- DeepSpeed version: 0.17.4
- PyTorch version (accelerator?): 2.7.1+cu126 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: True
- Using GPU in script?: True
- GPU type: Tesla V100-SXM2-32GB
### Who can help?
I don't know
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
[rank0]: Traceback (most recent call last):
[rank0]: File "/mnt/task_runtime/labels/train.py", line 219, in <module>
[rank0]: train()
[rank0]: File "/mnt/task_runtime/labels/train.py", line 194, in train
[rank0]: trainer.train()
[rank0]: File "/miniforge/lib/python3.12/site-packages/transformers/trainer.py", line 2237, in train
[rank0]: return inner_training_loop(
[rank0]: ^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/miniforge/lib/python3.12/site-packages/transformers/trainer.py", line 2532, in _inner_training_loop
[rank0]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches, args.device)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/miniforge/lib/python3.12/site-packages/transformers/trainer.py", line 5378, in get_batch_samples
[rank0]: **_num_items_in_batch = self.accelerator.gather(num_items_in_batch).sum()_**
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/miniforge/lib/python3.12/site-packages/accelerate/accelerator.py", line 2794, in gather
[rank0]: return gather(tensor)
[rank0]: ^^^^^^^^^^^^^^
[rank0]: File "/miniforge/lib/python3.12/site-packages/accelerate/utils/operations.py", line 371, in wrapper
[rank0]: return function(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/miniforge/lib/python3.12/site-packages/accelerate/utils/operations.py", line 432, in gather
[rank0]: return _gpu_gather(tensor)
[rank0]: ^^^^^^^^^^^^^^^^^^^
[rank0]: File "/miniforge/lib/python3.12/site-packages/accelerate/utils/operations.py", line 351, in _gpu_gather
[rank0]: return recursively_apply(_gpu_gather_one, tensor, error_on_other_type=True)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/miniforge/lib/python3.12/site-packages/accelerate/utils/operations.py", line 126, in recursively_apply
[rank0]: return func(data, *args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/miniforge/lib/python3.12/site-packages/accelerate/utils/operations.py", line 341, in _gpu_gather_one
[rank0]: gather_op(output_tensors, tensor)
[rank0]: File "/miniforge/lib/python3.12/site-packages/torch/distributed/c10d_logger.py", line 81, in wrapper
[rank0]: return func(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/miniforge/lib/python3.12/site-packages/torch/distributed/distributed_c10d.py", line 3836, in all_gather_into_tensor
[rank0]: work = group._allgather_base(output_tensor, input_tensor, opts)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: ValueError: Tensors must be CUDA and dense
transformers v4.54.1 + deepspeed multi-gpu training
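The traceback shows `accelerator.gather(num_items_in_batch)` reaching the backend's `all_gather` with a tensor that is not on the communication device; NCCL collectives require dense GPU tensors. A toy sketch of the guard that avoids this class of error (illustrative only — `FakeTensor`, `safe_gather`, and `nccl_like_gather` are hypothetical, not transformers/accelerate APIs):

```python
class FakeTensor:
    """Toy stand-in for a tensor, carrying only a value and a device tag."""
    def __init__(self, value, device):
        self.value, self.device = value, device

    def to(self, device):
        return FakeTensor(self.value, device)

def nccl_like_gather(tensor):
    # Toy collective enforcing the constraint seen in the traceback.
    if tensor.device != "cuda":
        raise ValueError("Tensors must be CUDA and dense")
    return [tensor.value]

def safe_gather(tensor, comm_device, gather):
    # Guard sketch: move a stray CPU scalar onto the communication
    # device before handing it to the collective.
    if tensor.device != comm_device:
        tensor = tensor.to(comm_device)
    return gather(tensor)
```

In other words, a CPU-resident `num_items_in_batch` scalar must be moved to the GPU before being gathered.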
### Expected behavior
No error | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39896/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39896/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39895 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39895/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39895/comments | https://api.github.com/repos/huggingface/transformers/issues/39895/events | https://github.com/huggingface/transformers/pull/39895 | 3,290,280,596 | PR_kwDOCUB6oc6iDJ92 | 39,895 | Add Videoprism | {
"login": "MHRDYN7",
"id": 113298714,
"node_id": "U_kgDOBsDNGg",
"avatar_url": "https://avatars.githubusercontent.com/u/113298714?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MHRDYN7",
"html_url": "https://github.com/MHRDYN7",
"followers_url": "https://api.github.com/users/MHRDYN7/followers",
"following_url": "https://api.github.com/users/MHRDYN7/following{/other_user}",
"gists_url": "https://api.github.com/users/MHRDYN7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MHRDYN7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MHRDYN7/subscriptions",
"organizations_url": "https://api.github.com/users/MHRDYN7/orgs",
"repos_url": "https://api.github.com/users/MHRDYN7/repos",
"events_url": "https://api.github.com/users/MHRDYN7/events{/privacy}",
"received_events_url": "https://api.github.com/users/MHRDYN7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] | open | false | null | [] | null | [] | 2025-08-04T17:37:52 | 2025-10-23T12:23:36 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39895",
"html_url": "https://github.com/huggingface/transformers/pull/39895",
"diff_url": "https://github.com/huggingface/transformers/pull/39895.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39895.patch",
"merged_at": null
} | Fixes #39893. This PR adds the VideoPrism model by Google DeepMind. [Original repo](https://github.com/google-deepmind/videoprism) | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39895/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39895/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39894 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39894/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39894/comments | https://api.github.com/repos/huggingface/transformers/issues/39894/events | https://github.com/huggingface/transformers/pull/39894 | 3,290,276,691 | PR_kwDOCUB6oc6iDJKV | 39,894 | [docs] Add reference to HF-maintained `custom_generate` collections | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-04T17:36:09 | 2025-08-12T16:38:04 | 2025-08-12T16:38:01 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39894",
"html_url": "https://github.com/huggingface/transformers/pull/39894",
"diff_url": "https://github.com/huggingface/transformers/pull/39894.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39894.patch",
"merged_at": "2025-08-12T16:38:01"
} | # What does this PR do?
- Add reference to HF-maintained `custom_generate` collections
- Rename "custom decoding method" -> "custom generation method" in the docs | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39894/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39894/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39893 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39893/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39893/comments | https://api.github.com/repos/huggingface/transformers/issues/39893/events | https://github.com/huggingface/transformers/issues/39893 | 3,290,275,644 | I_kwDOCUB6oc7EHZ88 | 39,893 | Add VideoPrism | {
"login": "MHRDYN7",
"id": 113298714,
"node_id": "U_kgDOBsDNGg",
"avatar_url": "https://avatars.githubusercontent.com/u/113298714?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MHRDYN7",
"html_url": "https://github.com/MHRDYN7",
"followers_url": "https://api.github.com/users/MHRDYN7/followers",
"following_url": "https://api.github.com/users/MHRDYN7/following{/other_user}",
"gists_url": "https://api.github.com/users/MHRDYN7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MHRDYN7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MHRDYN7/subscriptions",
"organizations_url": "https://api.github.com/users/MHRDYN7/orgs",
"repos_url": "https://api.github.com/users/MHRDYN7/repos",
"events_url": "https://api.github.com/users/MHRDYN7/events{/privacy}",
"received_events_url": "https://api.github.com/users/MHRDYN7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] | open | false | null | [] | null | [] | 2025-08-04T17:35:42 | 2025-08-08T16:38:18 | null | CONTRIBUTOR | null | null | null | null | ### Model description
I'd like to contribute the VideoPrism model by Google DeepMind. The code base (written in Flax) was made public in June, and the base model weights alone have 2.7k downloads on the [HF hub](https://huggingface.co/google/videoprism-base-f16r288) with almost no support outside the original repo. There are two models, a foundational video model and a video-text model, each in base and large sizes.
Why add the model
* They claim that the model "achieves state-of-the-art performance on 31 out of 33 public video understanding benchmarks using a single frozen model."
* The architecture is unique, with a spatial encoder layer followed by a temporal encoder layer (not fused spatio-temporal encoders) and no CLS token. This architecture was introduced in the [VIVIT paper](https://arxiv.org/abs/2103.15691), but the HF VIVIT model implements only one of the four architectures proposed in that paper, even though the weights for this architecture were released back then.
What the model lacks
* The VideoPrism [paper](https://arxiv.org/abs/2402.13217) was released in February 2024, so it is older and the architecture does not include modern components like rotary positional embeddings (used in vjepa2) or GLU.
* The foundational video model does not come with head weights, and there are no plans to release them, so the final layer would need to be trained on something like the Kinetics-400 dataset for a classification model.
### Open source status
- [x] The model implementation is available
- [x] The model weights are available
### Provide useful links for the implementation
Paper link: https://arxiv.org/abs/2402.13217
Repo link: https://github.com/google-deepmind/videoprism
checkpoint: [video model](https://huggingface.co/google/videoprism-base-f16r288), [video-text model](https://huggingface.co/google/videoprism-lvt-base-f16r288)
@qubvel @NielsRogge I've already converted the base model's flax implementation to HF and got the exact logits for the forward pass. I'll go ahead with the rest if I get the green light for the contribution. | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39893/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39893/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/39892 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39892/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39892/comments | https://api.github.com/repos/huggingface/transformers/issues/39892/events | https://github.com/huggingface/transformers/pull/39892 | 3,289,835,625 | PR_kwDOCUB6oc6iBpJt | 39,892 | fix: Add einops as a core dependency | {
"login": "Hashbrownsss",
"id": 142291877,
"node_id": "U_kgDOCHszpQ",
"avatar_url": "https://avatars.githubusercontent.com/u/142291877?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Hashbrownsss",
"html_url": "https://github.com/Hashbrownsss",
"followers_url": "https://api.github.com/users/Hashbrownsss/followers",
"following_url": "https://api.github.com/users/Hashbrownsss/following{/other_user}",
"gists_url": "https://api.github.com/users/Hashbrownsss/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Hashbrownsss/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Hashbrownsss/subscriptions",
"organizations_url": "https://api.github.com/users/Hashbrownsss/orgs",
"repos_url": "https://api.github.com/users/Hashbrownsss/repos",
"events_url": "https://api.github.com/users/Hashbrownsss/events{/privacy}",
"received_events_url": "https://api.github.com/users/Hashbrownsss/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-04T15:06:38 | 2025-08-07T18:58:08 | 2025-08-07T18:58:07 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39892",
"html_url": "https://github.com/huggingface/transformers/pull/39892",
"diff_url": "https://github.com/huggingface/transformers/pull/39892.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39892.patch",
"merged_at": null
} | # What does this PR do?
This PR addresses the `ModuleNotFoundError` for the `einops` library by adding it as a core dependency.
It fixes issue #39811, where certain parts of the `transformers` library, specifically the flash attention code, were trying to import `einops` without it being a declared dependency. This bug prevented users from successfully running models that rely on this library.
By adding `einops` to `setup.py`, this PR ensures that the library is automatically installed for all users, making the `transformers` package more robust and reliable.
Fixes #39811
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. This PR fixes issue #39811.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
> This change is a dependency fix and does not require a documentation update.
- [ ] Did you write any new necessary tests?
> This change is a dependency fix and does not require a new test. The existing test suite was used to confirm that the `einops` dependency is now correctly installed.
## Who can review?
@ArthurZucker
@ivarflakstad
@iforgetmyname | {
"login": "Hashbrownsss",
"id": 142291877,
"node_id": "U_kgDOCHszpQ",
"avatar_url": "https://avatars.githubusercontent.com/u/142291877?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Hashbrownsss",
"html_url": "https://github.com/Hashbrownsss",
"followers_url": "https://api.github.com/users/Hashbrownsss/followers",
"following_url": "https://api.github.com/users/Hashbrownsss/following{/other_user}",
"gists_url": "https://api.github.com/users/Hashbrownsss/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Hashbrownsss/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Hashbrownsss/subscriptions",
"organizations_url": "https://api.github.com/users/Hashbrownsss/orgs",
"repos_url": "https://api.github.com/users/Hashbrownsss/repos",
"events_url": "https://api.github.com/users/Hashbrownsss/events{/privacy}",
"received_events_url": "https://api.github.com/users/Hashbrownsss/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39892/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39892/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39891 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39891/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39891/comments | https://api.github.com/repos/huggingface/transformers/issues/39891/events | https://github.com/huggingface/transformers/pull/39891 | 3,289,722,376 | PR_kwDOCUB6oc6iBP3L | 39,891 | Fix misleading WandB error when WANDB_DISABLED is set | {
"login": "notkisk",
"id": 107971634,
"node_id": "U_kgDOBm-EMg",
"avatar_url": "https://avatars.githubusercontent.com/u/107971634?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/notkisk",
"html_url": "https://github.com/notkisk",
"followers_url": "https://api.github.com/users/notkisk/followers",
"following_url": "https://api.github.com/users/notkisk/following{/other_user}",
"gists_url": "https://api.github.com/users/notkisk/gists{/gist_id}",
"starred_url": "https://api.github.com/users/notkisk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/notkisk/subscriptions",
"organizations_url": "https://api.github.com/users/notkisk/orgs",
"repos_url": "https://api.github.com/users/notkisk/repos",
"events_url": "https://api.github.com/users/notkisk/events{/privacy}",
"received_events_url": "https://api.github.com/users/notkisk/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-04T14:37:51 | 2025-08-05T10:18:54 | 2025-08-05T10:18:19 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39891",
"html_url": "https://github.com/huggingface/transformers/pull/39891",
"diff_url": "https://github.com/huggingface/transformers/pull/39891.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39891.patch",
"merged_at": "2025-08-05T10:18:18"
} | ## Fix misleading error message when `WANDB_DISABLED` is set
### Summary
✅ Fixed the misleading WandB error in `src/transformers/integrations/integration_utils.py:814`.
**Before (confusing error):**
```
RuntimeError: WandbCallback requires wandb to be installed. Run `pip install wandb`.
```
**After (clearer error with actionable guidance):**
```
RuntimeError: You specified `report_to='wandb'` but also set the `WANDB_DISABLED` environment variable.
This disables wandb logging, even though it was explicitly requested.
- To enable wandb logging: unset `WANDB_DISABLED`.
- To disable logging: use `report_to='none'`.
Note: WANDB_DISABLED is deprecated and will be removed in v5.
```
### Details
The previous error message was shown even when `wandb` **was installed**, but disabled via the `WANDB_DISABLED` environment variable. This could mislead users into thinking `wandb` wasn’t installed at all.
This PR:
* Detects when `wandb` is installed but explicitly disabled.
* Raises a targeted error with guidance on how to proceed.
* Preserves the original error message if `wandb` is not installed.
### Implementation
Here is the relevant change:
```python
has_wandb = is_wandb_available()
if not has_wandb:
# Check if wandb is actually installed but disabled via WANDB_DISABLED
if importlib.util.find_spec("wandb") is not None:
wandb_disabled = os.getenv("WANDB_DISABLED", "").upper() in ENV_VARS_TRUE_VALUES
if wandb_disabled:
raise RuntimeError(
"You specified `report_to='wandb'` but also set the `WANDB_DISABLED` environment variable.\n"
"This disables wandb logging, even though it was explicitly requested.\n\n"
"- To enable wandb logging: unset `WANDB_DISABLED`.\n"
"- To disable logging: use `report_to='none'`.\n\n"
"Note: WANDB_DISABLED is deprecated and will be removed in v5."
)
raise RuntimeError("WandbCallback requires wandb to be installed. Run `pip install wandb`.")
```
### Testing
✅ Manually tested using the following script:
```
import os
import importlib.util
from datasets import Dataset
from transformers import AutoTokenizer, AutoModelForCausalLM
from trl import SFTTrainer, SFTConfig
import logging
# Simulate W&B being disabled, even though it's installed
os.environ["WANDB_DISABLED"] = "true"
# Optionally set up logging to see the deprecation warning
logging.basicConfig(level=logging.WARNING)
# Confirm wandb is installed but will be treated as unavailable
print("WandB is installed:", importlib.util.find_spec("wandb") is not None)
# Load tokenizer and model (replace with a smaller model if needed for quick tests)
model_name = "gpt2" # Replace with a small HF model if you don't have access
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token
# Create a simple dummy dataset
texts = ["Hello, how are you?", "What's the capital of France?"]
dataset = Dataset.from_dict({"text": texts})
# Tokenize
def tokenize(example):
return tokenizer(example["text"], padding="max_length", truncation=True, max_length=64)
tokenized_dataset = dataset.map(tokenize)
# SFTConfig with WandB enabled, but WANDB_DISABLED is also set
training_args = SFTConfig(
output_dir="./outputs",
per_device_train_batch_size=1,
num_train_epochs=1,
logging_steps=1,
learning_rate=1e-5,
report_to="wandb", # 👈 This triggers the issue
run_name="test-wandb-conflict",
)
# Trainer setup
trainer = SFTTrainer(
model=model,
args=training_args,
train_dataset=tokenized_dataset,
tokenizer=tokenizer,
)
# This will trigger the warning and the misleading RuntimeError
trainer.train()
```
The new error message appears correctly when `WANDB_DISABLED=true`.
---
This PR fixes #39878. | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39891/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39891/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39890 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39890/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39890/comments | https://api.github.com/repos/huggingface/transformers/issues/39890/events | https://github.com/huggingface/transformers/pull/39890 | 3,289,699,160 | PR_kwDOCUB6oc6iBKof | 39,890 | 🌐 [i18n-KO] Translated `jamba.md` to Korean | {
"login": "skwh54",
"id": 108786184,
"node_id": "U_kgDOBnvyCA",
"avatar_url": "https://avatars.githubusercontent.com/u/108786184?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/skwh54",
"html_url": "https://github.com/skwh54",
"followers_url": "https://api.github.com/users/skwh54/followers",
"following_url": "https://api.github.com/users/skwh54/following{/other_user}",
"gists_url": "https://api.github.com/users/skwh54/gists{/gist_id}",
"starred_url": "https://api.github.com/users/skwh54/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/skwh54/subscriptions",
"organizations_url": "https://api.github.com/users/skwh54/orgs",
"repos_url": "https://api.github.com/users/skwh54/repos",
"events_url": "https://api.github.com/users/skwh54/events{/privacy}",
"received_events_url": "https://api.github.com/users/skwh54/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-04T14:32:11 | 2025-08-13T15:22:28 | 2025-08-13T15:22:28 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39890",
"html_url": "https://github.com/huggingface/transformers/pull/39890",
"diff_url": "https://github.com/huggingface/transformers/pull/39890.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39890.patch",
"merged_at": "2025-08-13T15:22:28"
} | <!-- Please title the PR "🌐 [i18n-KO] Translated `<your_file>.md` to Korean" -->
# What does this PR do?
Translated the `jamba.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Please reveal the comment below requesting a review from KREW team members only after all the checks above are complete! -->
May you please review this PR?
@jungnerd, @luckyvickyricky, @chelsseeey, @amo33, @maximizemaxwell, @D15M4S
<!-- @harheem, @nsbg, @Youngdong2, @xhaktm00, @ssunbear, @ChoHyoungSeo, @judy-choi -->
<!-- @4N3MONE, @Kim-Ju-won, @ahnjj, @FacerAin, @ssum21, @TaskerJang, @HyunZ118 -->
<!-- @yijun-lee, @songi104, @chhaewxn, @AhnJoonSung, @jihyun-0611, @seopp, @pyapyapya -->
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. Please reveal the comment below after the KREW team members' review is complete! -->
@stevhliu May you please review this PR? | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39890/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39890/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39889 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39889/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39889/comments | https://api.github.com/repos/huggingface/transformers/issues/39889/events | https://github.com/huggingface/transformers/pull/39889 | 3,289,643,484 | PR_kwDOCUB6oc6iA-CN | 39,889 | send some feedback when manually building doc via comment | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-04T14:19:38 | 2025-08-04T18:20:49 | 2025-08-04T18:20:48 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39889",
"html_url": "https://github.com/huggingface/transformers/pull/39889",
"diff_url": "https://github.com/huggingface/transformers/pull/39889.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39889.patch",
"merged_at": "2025-08-04T18:20:48"
} | # What does this PR do?
Something like this
https://github.com/ydshieh/transformers/pull/15#issuecomment-3151095765
The changes are mostly copied from some jobs defined in
.github/workflows/self-comment-ci.yml | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39889/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39889/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39888 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39888/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39888/comments | https://api.github.com/repos/huggingface/transformers/issues/39888/events | https://github.com/huggingface/transformers/pull/39888 | 3,289,487,792 | PR_kwDOCUB6oc6iAb2f | 39,888 | Update cohere2 vision test | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-04T13:35:27 | 2025-08-04T18:08:20 | 2025-08-04T18:08:18 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39888",
"html_url": "https://github.com/huggingface/transformers/pull/39888",
"diff_url": "https://github.com/huggingface/transformers/pull/39888.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39888.patch",
"merged_at": "2025-08-04T18:08:18"
} | # What does this PR do?
Just use a dummy model (4 LLM layers); otherwise it won't fit on an A10 runner.
The output is still deterministic; some checkpoint weights are simply not used by the dummy model (all of the dummy model's weights are loaded from the checkpoint) | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39888/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39888/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39887 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39887/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39887/comments | https://api.github.com/repos/huggingface/transformers/issues/39887/events | https://github.com/huggingface/transformers/pull/39887 | 3,289,450,541 | PR_kwDOCUB6oc6iATdU | 39,887 | Move old generation modes to the Hub 🧹🧹🧹🧽🧽 | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-04T13:26:35 | 2025-08-11T15:14:02 | 2025-08-11T15:14:02 | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39887",
"html_url": "https://github.com/huggingface/transformers/pull/39887",
"diff_url": "https://github.com/huggingface/transformers/pull/39887.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39887.patch",
"merged_at": null
} | This PR cleans up a lot of old generation strategies in the `GenerationMixin` class to prepare for moving several generation modes to a `custom_generate` repository on the Hub: DoLa decoding, contrastive search, group beam search, and constrained beam search.
EDIT: deleting PR to do one at a time | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39887/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39887/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39886 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39886/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39886/comments | https://api.github.com/repos/huggingface/transformers/issues/39886/events | https://github.com/huggingface/transformers/pull/39886 | 3,289,445,728 | PR_kwDOCUB6oc6iASXI | 39,886 | 🌐 [i18n-KO] Translated `perf_train_gaudi.md` to Korean | {
"login": "D15M4S",
"id": 122260287,
"node_id": "U_kgDOB0mLPw",
"avatar_url": "https://avatars.githubusercontent.com/u/122260287?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/D15M4S",
"html_url": "https://github.com/D15M4S",
"followers_url": "https://api.github.com/users/D15M4S/followers",
"following_url": "https://api.github.com/users/D15M4S/following{/other_user}",
"gists_url": "https://api.github.com/users/D15M4S/gists{/gist_id}",
"starred_url": "https://api.github.com/users/D15M4S/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/D15M4S/subscriptions",
"organizations_url": "https://api.github.com/users/D15M4S/orgs",
"repos_url": "https://api.github.com/users/D15M4S/repos",
"events_url": "https://api.github.com/users/D15M4S/events{/privacy}",
"received_events_url": "https://api.github.com/users/D15M4S/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-04T13:25:27 | 2025-10-29T06:55:51 | null | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39886",
"html_url": "https://github.com/huggingface/transformers/pull/39886",
"diff_url": "https://github.com/huggingface/transformers/pull/39886.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39886.patch",
"merged_at": null
} | <!-- Please title your PR "🌐 [i18n-KO] Translated `<your_file>.md` to Korean" -->
# What does this PR do?
Translated the `perf_train_gaudi.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Please reveal the comment below requesting a review from KREW team members only after all the checks above are complete! -->
May you please review this PR?
@jungnerd, @luckyvickyricky, @chelsseeey, @skwh54, @amo33, @maximizemaxwell
@harheem
@4N3MONE
@yijun-lee
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. Please reveal the comment below after the KREW team members have finished their review! -->
<!-- @stevhliu May you please review this PR? -->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39886/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39886/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39885 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39885/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39885/comments | https://api.github.com/repos/huggingface/transformers/issues/39885/events | https://github.com/huggingface/transformers/pull/39885 | 3,289,298,263 | PR_kwDOCUB6oc6h_xqt | 39,885 | Set `torch.backends.cudnn.allow_tf32 = False` for CI | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-04T12:44:57 | 2025-08-04T14:55:18 | 2025-08-04T14:55:16 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39885",
"html_url": "https://github.com/huggingface/transformers/pull/39885",
"diff_url": "https://github.com/huggingface/transformers/pull/39885.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39885.patch",
"merged_at": "2025-08-04T14:55:16"
} | # What does this PR do?
See https://github.com/pytorch/pytorch/issues/157274#issuecomment-3090791615
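For context, the change boils down to a global backend flag; a minimal sketch (assuming PyTorch is installed — the matmul flag shown alongside it is the related, separately controlled setting):

```python
import torch

# Disable TF32 for cuDNN convolutions so CI numerical thresholds
# stay tight across torch releases. Matmul TF32 is controlled by a
# separate flag and is shown here only for completeness.
torch.backends.cudnn.allow_tf32 = False
torch.backends.cuda.matmul.allow_tf32 = False
```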
(if we don't do this, we have to increase the thresholds in many tests when torch 2.8 is released) | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39885/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39885/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39884 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39884/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39884/comments | https://api.github.com/repos/huggingface/transformers/issues/39884/events | https://github.com/huggingface/transformers/pull/39884 | 3,289,262,174 | PR_kwDOCUB6oc6h_puM | 39,884 | added Textnet fast image processor | {
"login": "rahzaazhar",
"id": 44742531,
"node_id": "MDQ6VXNlcjQ0NzQyNTMx",
"avatar_url": "https://avatars.githubusercontent.com/u/44742531?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rahzaazhar",
"html_url": "https://github.com/rahzaazhar",
"followers_url": "https://api.github.com/users/rahzaazhar/followers",
"following_url": "https://api.github.com/users/rahzaazhar/following{/other_user}",
"gists_url": "https://api.github.com/users/rahzaazhar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rahzaazhar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rahzaazhar/subscriptions",
"organizations_url": "https://api.github.com/users/rahzaazhar/orgs",
"repos_url": "https://api.github.com/users/rahzaazhar/repos",
"events_url": "https://api.github.com/users/rahzaazhar/events{/privacy}",
"received_events_url": "https://api.github.com/users/rahzaazhar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-04T12:33:49 | 2025-08-11T15:44:32 | 2025-08-11T15:44:32 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39884",
"html_url": "https://github.com/huggingface/transformers/pull/39884",
"diff_url": "https://github.com/huggingface/transformers/pull/39884.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39884.patch",
"merged_at": "2025-08-11T15:44:32"
} | # What does this PR do?
Related [#36978](https://github.com/huggingface/transformers/issues/36978)
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Adds a fast image processor for the TextNet model.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39884/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39884/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39883 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39883/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39883/comments | https://api.github.com/repos/huggingface/transformers/issues/39883/events | https://github.com/huggingface/transformers/pull/39883 | 3,289,158,407 | PR_kwDOCUB6oc6h_SkY | 39,883 | Remove deprecated max_size parameter from ConditionalDetrImageProcessor | {
"login": "DarshanKumarGP",
"id": 206693261,
"node_id": "U_kgDODFHjjQ",
"avatar_url": "https://avatars.githubusercontent.com/u/206693261?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DarshanKumarGP",
"html_url": "https://github.com/DarshanKumarGP",
"followers_url": "https://api.github.com/users/DarshanKumarGP/followers",
"following_url": "https://api.github.com/users/DarshanKumarGP/following{/other_user}",
"gists_url": "https://api.github.com/users/DarshanKumarGP/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DarshanKumarGP/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DarshanKumarGP/subscriptions",
"organizations_url": "https://api.github.com/users/DarshanKumarGP/orgs",
"repos_url": "https://api.github.com/users/DarshanKumarGP/repos",
"events_url": "https://api.github.com/users/DarshanKumarGP/events{/privacy}",
"received_events_url": "https://api.github.com/users/DarshanKumarGP/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-04T12:05:22 | 2025-08-17T11:24:26 | 2025-08-17T11:24:26 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39883",
"html_url": "https://github.com/huggingface/transformers/pull/39883",
"diff_url": "https://github.com/huggingface/transformers/pull/39883.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39883.patch",
"merged_at": null
} | This PR removes all handling of the deprecated `max_size` parameter from the `ConditionalDetrImageProcessor` class, as per issue #37939.
Closes #37939. | {
"login": "DarshanKumarGP",
"id": 206693261,
"node_id": "U_kgDODFHjjQ",
"avatar_url": "https://avatars.githubusercontent.com/u/206693261?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DarshanKumarGP",
"html_url": "https://github.com/DarshanKumarGP",
"followers_url": "https://api.github.com/users/DarshanKumarGP/followers",
"following_url": "https://api.github.com/users/DarshanKumarGP/following{/other_user}",
"gists_url": "https://api.github.com/users/DarshanKumarGP/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DarshanKumarGP/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DarshanKumarGP/subscriptions",
"organizations_url": "https://api.github.com/users/DarshanKumarGP/orgs",
"repos_url": "https://api.github.com/users/DarshanKumarGP/repos",
"events_url": "https://api.github.com/users/DarshanKumarGP/events{/privacy}",
"received_events_url": "https://api.github.com/users/DarshanKumarGP/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39883/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39883/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39882 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39882/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39882/comments | https://api.github.com/repos/huggingface/transformers/issues/39882/events | https://github.com/huggingface/transformers/issues/39882 | 3,289,039,081 | I_kwDOCUB6oc7ECsDp | 39,882 | [Feature Request] Automatically log parallelism configuration from Accelerate to W&B | {
"login": "WoosungMyung",
"id": 115716986,
"node_id": "U_kgDOBuWzeg",
"avatar_url": "https://avatars.githubusercontent.com/u/115716986?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/WoosungMyung",
"html_url": "https://github.com/WoosungMyung",
"followers_url": "https://api.github.com/users/WoosungMyung/followers",
"following_url": "https://api.github.com/users/WoosungMyung/following{/other_user}",
"gists_url": "https://api.github.com/users/WoosungMyung/gists{/gist_id}",
"starred_url": "https://api.github.com/users/WoosungMyung/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/WoosungMyung/subscriptions",
"organizations_url": "https://api.github.com/users/WoosungMyung/orgs",
"repos_url": "https://api.github.com/users/WoosungMyung/repos",
"events_url": "https://api.github.com/users/WoosungMyung/events{/privacy}",
"received_events_url": "https://api.github.com/users/WoosungMyung/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | null | [] | 2025-08-04T11:22:33 | 2025-08-08T05:23:25 | null | NONE | null | null | null | null | ### Feature request
Automatically log parallelism configuration (TP, DP, etc.) from Accelerate to Weights & Biases config in `WandbCallback`.
Currently, when using `Accelerator`, parallelism sizes (e.g., `tp_size`, `dp_replicate_size`) must be manually recorded to `wandb.config`. These values are essential when conducting experiments with different distributed training configurations. This feature request proposes automatically logging `parallelism_config._sizes` to Weights & Biases within `WandbCallback.on_train_begin()`, so that users do not have to manually add this information.
### Motivation
Currently, users must manually inject this information, and this process can and should be automated.
This improves reproducibility, saves developer time, and helps analyze the training setup in distributed training environments that use HF Accelerate.
### Your contribution
I am happy to submit a PR. The change would be minimal: check whether the model has an `accelerator` attribute and use its `_sizes` dictionary like so:
```python
if hasattr(model, "accelerator"):
    wandb.config.update(model.accelerator.parallelism_config._sizes)
```
Thanks. | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39882/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39882/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/39881 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39881/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39881/comments | https://api.github.com/repos/huggingface/transformers/issues/39881/events | https://github.com/huggingface/transformers/pull/39881 | 3,289,011,492 | PR_kwDOCUB6oc6h-yQM | 39,881 | Better return type hint for `AutoModelForCausalLM` and `AutoModelForImageTextToText` | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8882772041,
"node_id": "LA_kwDOCUB6oc8AAAACEXRYSQ",
"url": "https://api.github.com/repos/huggingface/transformers/labels/typing",
"name": "typing",
"color": "DBA272",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-08-04T11:14:06 | 2025-08-04T15:03:53 | 2025-08-04T15:03:53 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39881",
"html_url": "https://github.com/huggingface/transformers/pull/39881",
"diff_url": "https://github.com/huggingface/transformers/pull/39881.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39881.patch",
"merged_at": "2025-08-04T15:03:53"
} | # What does this PR do?
Defines a better return type hint for `AutoModelForCausalLM` and `AutoModelForImageTextToText`: the `from_pretrained` method now returns `_BaseModelWithGenerate`, so anyone can see the `generate` signature (see images).
#### On main
<img width="757" height="204" alt="Screenshot 2025-08-04 at 12 12 35" src="https://github.com/user-attachments/assets/1cfb7abb-9a82-4521-843f-0a7bec15d8ef" />
#### On branch
<img width="738" height="456" alt="Screenshot 2025-08-04 at 12 12 10" src="https://github.com/user-attachments/assets/8e8f9138-7dd9-4333-995d-88baceb671a1" />
| {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39881/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 2,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39881/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39880 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39880/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39880/comments | https://api.github.com/repos/huggingface/transformers/issues/39880/events | https://github.com/huggingface/transformers/pull/39880 | 3,288,697,615 | PR_kwDOCUB6oc6h9t9A | 39,880 | Fix link to models in README | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-04T09:28:56 | 2025-08-04T16:34:43 | 2025-08-04T16:34:42 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39880",
"html_url": "https://github.com/huggingface/transformers/pull/39880",
"diff_url": "https://github.com/huggingface/transformers/pull/39880.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39880.patch",
"merged_at": "2025-08-04T16:34:42"
} | # What does this PR do?
Fix links and models in README | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39880/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39880/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39879 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39879/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39879/comments | https://api.github.com/repos/huggingface/transformers/issues/39879/events | https://github.com/huggingface/transformers/pull/39879 | 3,288,588,085 | PR_kwDOCUB6oc6h9WdS | 39,879 | Fix aria tests | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-04T08:52:11 | 2025-08-05T11:48:48 | 2025-08-05T11:48:47 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39879",
"html_url": "https://github.com/huggingface/transformers/pull/39879",
"diff_url": "https://github.com/huggingface/transformers/pull/39879.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39879.patch",
"merged_at": "2025-08-05T11:48:47"
} | # What does this PR do?
As per title | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39879/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39879/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39878 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39878/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39878/comments | https://api.github.com/repos/huggingface/transformers/issues/39878/events | https://github.com/huggingface/transformers/issues/39878 | 3,288,240,850 | I_kwDOCUB6oc7D_pLS | 39,878 | Misleading WandB error when WANDB_DISABLED=True and report_to="wandb" are both set | {
"login": "RayaneA7",
"id": 83666906,
"node_id": "MDQ6VXNlcjgzNjY2OTA2",
"avatar_url": "https://avatars.githubusercontent.com/u/83666906?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RayaneA7",
"html_url": "https://github.com/RayaneA7",
"followers_url": "https://api.github.com/users/RayaneA7/followers",
"following_url": "https://api.github.com/users/RayaneA7/following{/other_user}",
"gists_url": "https://api.github.com/users/RayaneA7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/RayaneA7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RayaneA7/subscriptions",
"organizations_url": "https://api.github.com/users/RayaneA7/orgs",
"repos_url": "https://api.github.com/users/RayaneA7/repos",
"events_url": "https://api.github.com/users/RayaneA7/events{/privacy}",
"received_events_url": "https://api.github.com/users/RayaneA7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-04T06:44:43 | 2025-08-05T10:18:21 | 2025-08-05T10:18:21 | NONE | null | null | null | null | ### System Info
When using the Hugging Face `Trainer` or `SFTTrainer` with `report_to="wandb"`, if `WANDB_DISABLED` is also set in the environment, the following misleading error appears:
```bash
Using the WANDB_DISABLED environment variable is deprecated and will be removed in v5. Use the --report_to flag to control the integrations used for logging result (for instance --report_to none).
RuntimeError: WandbCallback requires wandb to be installed. Run pip install wandb.
```
However, `wandb` is correctly installed — it’s just disabled via the environment variable.
The logic in `is_wandb_available()` (from `integration_utils.py`) silently disables wandb, even if `report_to="wandb"` is explicitly passed.
This creates confusing behavior and error messages for users.
#### 🔍 Root Cause
The error is triggered deep in `transformers/trainer.py` at line **688**:
```python
default_callbacks = DEFAULT_CALLBACKS + get_reporting_integration_callbacks(self.args.report_to)
callbacks = default_callbacks if callbacks is None else default_callbacks + callbacks
self.callback_handler = CallbackHandler(
callbacks, self.model, self.processing_class, self.optimizer, self.lr_scheduler
)
```
The `WandbCallback` includes W\&B integration if `report_to='wandb'`, but that depends on:
```python
# transformers/integrations/integration_utils.py
def is_wandb_available():
# any value of WANDB_DISABLED disables wandb
if os.getenv("WANDB_DISABLED", "").upper() in ENV_VARS_TRUE_VALUES:
logger.warning(
"Using the `WANDB_DISABLED` environment variable is deprecated and will be removed in v5. Use the "
"--report_to flag to control the integrations used for logging result (for instance --report_to none)."
)
return False
return importlib.util.find_spec("wandb") is not None
```
So even though W\&B is installed, the system **silently disables it** and then later throws an unrelated error saying it's not installed.
### Who can help?
@SunMarc @zach-huggingface
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
```python
import os
import importlib.util
from datasets import Dataset
from transformers import AutoTokenizer, AutoModelForCausalLM
from trl import SFTTrainer, SFTConfig
import logging
# Simulate W&B being disabled, even though it's installed
os.environ["WANDB_DISABLED"] = "true"
# Optionally set up logging to see the deprecation warning
logging.basicConfig(level=logging.WARNING)
# Confirm wandb is installed but will be treated as unavailable
print("WandB is installed:", importlib.util.find_spec("wandb") is not None)
# Load tokenizer and model (replace with a smaller model if needed for quick tests)
model_name = "gpt2" # Replace with a small HF model if you don't have access
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token
# Create a simple dummy dataset
texts = ["Hello, how are you?", "What's the capital of France?"]
dataset = Dataset.from_dict({"text": texts})
# Tokenize
def tokenize(example):
return tokenizer(example["text"], padding="max_length", truncation=True, max_length=64)
tokenized_dataset = dataset.map(tokenize)
# SFTConfig with WandB enabled, but WANDB_DISABLED is also set
training_args = SFTConfig(
output_dir="./outputs",
per_device_train_batch_size=1,
num_train_epochs=1,
logging_steps=1,
learning_rate=1e-5,
report_to="wandb", # 👈 This triggers the issue
run_name="test-wandb-conflict",
)
# Trainer setup
trainer = SFTTrainer(
model=model,
args=training_args,
train_dataset=tokenized_dataset,
tokenizer=tokenizer,
)
# This will trigger the warning and the misleading RuntimeError
trainer.train()
```
---
### Expected behavior
Instead of showing:
```text
RuntimeError: WandbCallback requires wandb to be installed. Run pip install wandb.
```
the expected behavior should be:
> **Expected Behavior**
> If both `report_to="wandb"` and `WANDB_DISABLED` are set, the Trainer should raise a clear and actionable error like:
>
> ```
> You specified `report_to='wandb'` but also set the `WANDB_DISABLED` environment variable.
> This disables wandb logging, even though it was explicitly requested.
>
> - To enable wandb logging: unset `WANDB_DISABLED`.
> - To disable logging: use `report_to='none'`.
>
> Note: WANDB_DISABLED is deprecated and will be removed in v5.
> ```
| {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39878/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39878/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39877 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39877/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39877/comments | https://api.github.com/repos/huggingface/transformers/issues/39877/events | https://github.com/huggingface/transformers/issues/39877 | 3,287,468,930 | I_kwDOCUB6oc7D8suC | 39,877 | ValueError: Max cache length is not consistent across layers | {
"login": "TheTharz",
"id": 119271523,
"node_id": "U_kgDOBxvwYw",
"avatar_url": "https://avatars.githubusercontent.com/u/119271523?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TheTharz",
"html_url": "https://github.com/TheTharz",
"followers_url": "https://api.github.com/users/TheTharz/followers",
"following_url": "https://api.github.com/users/TheTharz/following{/other_user}",
"gists_url": "https://api.github.com/users/TheTharz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TheTharz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TheTharz/subscriptions",
"organizations_url": "https://api.github.com/users/TheTharz/orgs",
"repos_url": "https://api.github.com/users/TheTharz/repos",
"events_url": "https://api.github.com/users/TheTharz/events{/privacy}",
"received_events_url": "https://api.github.com/users/TheTharz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-03T19:25:09 | 2025-08-04T10:02:05 | 2025-08-04T10:01:39 | NONE | null | null | null | null | ### System Info
transformers==4.54.0
torch==2.6.0+cu124
peft==0.16.0
unsloth==2025.8.1
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
https://colab.research.google.com/github/unslothai/notebooks/blob/main/nb/Gemma3_(1B)-GRPO.ipynb
Used the above notebook, provided in the Unsloth documentation, for fine-tuning Gemma3 (1B) with GRPO.
Running that notebook also produces the error at the training step.
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
[/usr/local/lib/python3.11/dist-packages/unsloth/models/vision.py](https://localhost:8080/#) in unsloth_base_fast_generate(self, *args, **kwargs)
232 with torch.inference_mode(), autocaster:
--> 233 output = self._old_generate(*args, **kwargs)
234 except:
20 frames
ValueError: Max cache length is not consistent across layers: [512, 512, 512, 512, 512, 868, 512, 512, 512, 512, 512, 868, 512, 512, 512, 512, 512, 868, 512, 512, 512, 512, 512, 868, 512, 512]
During handling of the above exception, another exception occurred:
ValueError Traceback (most recent call last)
[/usr/local/lib/python3.11/dist-packages/transformers/cache_utils.py](https://localhost:8080/#) in max_cache_len(self)
1255 values = [layer.max_cache_len for layer in self.layers]
1256 if len(set(values)) > 1:
-> 1257 raise ValueError(f"Max cache length is not consistent across layers: {values}")
1258 return values[0]
1259
ValueError: Max cache length is not consistent across layers: [512, 512, 512, 512, 512, 868, 512, 512, 512, 512, 512, 868, 512, 512, 512, 512, 512, 868, 512, 512, 512, 512, 512, 868, 512, 512]
### Expected behavior
The library should handle or pre-validate inconsistent `max_cache_len` values without crashing. | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39877/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39877/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39876 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39876/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39876/comments | https://api.github.com/repos/huggingface/transformers/issues/39876/events | https://github.com/huggingface/transformers/pull/39876 | 3,287,414,236 | PR_kwDOCUB6oc6h5hz3 | 39,876 | FP-Quant NVFP4 and Python 3.9 support | {
"login": "BlackSamorez",
"id": 16901341,
"node_id": "MDQ6VXNlcjE2OTAxMzQx",
"avatar_url": "https://avatars.githubusercontent.com/u/16901341?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BlackSamorez",
"html_url": "https://github.com/BlackSamorez",
"followers_url": "https://api.github.com/users/BlackSamorez/followers",
"following_url": "https://api.github.com/users/BlackSamorez/following{/other_user}",
"gists_url": "https://api.github.com/users/BlackSamorez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BlackSamorez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BlackSamorez/subscriptions",
"organizations_url": "https://api.github.com/users/BlackSamorez/orgs",
"repos_url": "https://api.github.com/users/BlackSamorez/repos",
"events_url": "https://api.github.com/users/BlackSamorez/events{/privacy}",
"received_events_url": "https://api.github.com/users/BlackSamorez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-03T18:10:34 | 2025-10-01T13:58:54 | 2025-10-01T13:58:23 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39876",
"html_url": "https://github.com/huggingface/transformers/pull/39876",
"diff_url": "https://github.com/huggingface/transformers/pull/39876.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39876.patch",
"merged_at": "2025-10-01T13:58:23"
} | # What does this PR do?
**Pre-Print about this method [Bridging the Gap Between Promise and Performance for Microscaling FP4 Quantization](https://arxiv.org/abs/2509.23202)**
This PR adds support for a new data type (NVFP4) to the existing FP-Quant quantization method.
It also fixes the quantization tests Docker environment by allowing it to install under Python 3.9, as requested by @SunMarc in #38696.
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39876/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39876/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39875 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39875/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39875/comments | https://api.github.com/repos/huggingface/transformers/issues/39875/events | https://github.com/huggingface/transformers/pull/39875 | 3,287,402,759 | PR_kwDOCUB6oc6h5fpU | 39,875 | Improve `is_wandb_available` function to verify WandB installation | {
"login": "qgallouedec",
"id": 45557362,
"node_id": "MDQ6VXNlcjQ1NTU3MzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/45557362?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qgallouedec",
"html_url": "https://github.com/qgallouedec",
"followers_url": "https://api.github.com/users/qgallouedec/followers",
"following_url": "https://api.github.com/users/qgallouedec/following{/other_user}",
"gists_url": "https://api.github.com/users/qgallouedec/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qgallouedec/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qgallouedec/subscriptions",
"organizations_url": "https://api.github.com/users/qgallouedec/orgs",
"repos_url": "https://api.github.com/users/qgallouedec/repos",
"events_url": "https://api.github.com/users/qgallouedec/events{/privacy}",
"received_events_url": "https://api.github.com/users/qgallouedec/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-03T17:53:25 | 2025-08-04T06:22:53 | 2025-08-04T06:22:52 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39875",
"html_url": "https://github.com/huggingface/transformers/pull/39875",
"diff_url": "https://github.com/huggingface/transformers/pull/39875.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39875.patch",
"merged_at": "2025-08-04T06:22:52"
} | To reproduce:
- Install WandB
- Run an experiment
- Uninstall wandb
- Check what `is_wandb_available()` returns (it still returns `True`)
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39875/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39875/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39874 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39874/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39874/comments | https://api.github.com/repos/huggingface/transformers/issues/39874/events | https://github.com/huggingface/transformers/pull/39874 | 3,287,219,663 | PR_kwDOCUB6oc6h47fv | 39,874 | fix: Catch correct ConnectionError for additional_chat_templates | {
"login": "akug",
"id": 12391389,
"node_id": "MDQ6VXNlcjEyMzkxMzg5",
"avatar_url": "https://avatars.githubusercontent.com/u/12391389?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/akug",
"html_url": "https://github.com/akug",
"followers_url": "https://api.github.com/users/akug/followers",
"following_url": "https://api.github.com/users/akug/following{/other_user}",
"gists_url": "https://api.github.com/users/akug/gists{/gist_id}",
"starred_url": "https://api.github.com/users/akug/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/akug/subscriptions",
"organizations_url": "https://api.github.com/users/akug/orgs",
"repos_url": "https://api.github.com/users/akug/repos",
"events_url": "https://api.github.com/users/akug/events{/privacy}",
"received_events_url": "https://api.github.com/users/akug/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-03T14:18:09 | 2025-08-18T16:27:25 | 2025-08-18T16:25:47 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39874",
"html_url": "https://github.com/huggingface/transformers/pull/39874",
"diff_url": "https://github.com/huggingface/transformers/pull/39874.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39874.patch",
"merged_at": "2025-08-18T16:25:47"
} | # What does this PR do?
Catch the correct `ConnectionError` when checking for `additional_chat_templates`.
Fixes #39873
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39874/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39874/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39873 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39873/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39873/comments | https://api.github.com/repos/huggingface/transformers/issues/39873/events | https://github.com/huggingface/transformers/issues/39873 | 3,287,218,033 | I_kwDOCUB6oc7D7vdx | 39,873 | Checking for additional_chat_templates doesn't work without internet (ConnectionError) | {
"login": "akug",
"id": 12391389,
"node_id": "MDQ6VXNlcjEyMzkxMzg5",
"avatar_url": "https://avatars.githubusercontent.com/u/12391389?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/akug",
"html_url": "https://github.com/akug",
"followers_url": "https://api.github.com/users/akug/followers",
"following_url": "https://api.github.com/users/akug/following{/other_user}",
"gists_url": "https://api.github.com/users/akug/gists{/gist_id}",
"starred_url": "https://api.github.com/users/akug/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/akug/subscriptions",
"organizations_url": "https://api.github.com/users/akug/orgs",
"repos_url": "https://api.github.com/users/akug/repos",
"events_url": "https://api.github.com/users/akug/events{/privacy}",
"received_events_url": "https://api.github.com/users/akug/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-03T14:16:26 | 2025-08-18T16:25:47 | 2025-08-18T16:25:47 | CONTRIBUTOR | null | null | null | null | ### System Info
- `transformers` version: 4.55.0.dev0
- Platform: Linux-6.6.87.2-microsoft-standard-WSL2-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.34.3
- Safetensors version: 0.5.3
- Accelerate version: 1.9.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1+cu126 (NA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [x] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
This bug only occurs in the following scenario:
1. The user uses a model that uses a tokenizer that can have `additional_chat_templates`
2. The user runs the code without internet connection
3. The user did NOT set HF_HUB_OFFLINE=1, so offline mode is not enabled.
One example is running the [train_dreambooth_lora_sdxl.py](https://github.com/huggingface/diffusers/blob/main/examples/dreambooth/train_dreambooth_lora_sdxl.py) script without any adaptations, like this:
1. Download the SDXL model files and the vae model files into the cache
2. Turn off internet
3. Check that HF_HUB_OFFLINE is unset
4. Run the script
The error signature:
```
requests.exceptions.ConnectionError: (MaxRetryError('HTTPSConnectionPool(...): Max retries exceeded with url: /api/models/stabilityai/stable-diffusion-xl-base-1.0/tree/main/additional_chat_templates?recursive=False&expand=False (Caused by ...)))
```
### Expected behavior
The script works without issues, because all files exist in the cache. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39873/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39873/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39872 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39872/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39872/comments | https://api.github.com/repos/huggingface/transformers/issues/39872/events | https://github.com/huggingface/transformers/issues/39872 | 3,286,959,527 | I_kwDOCUB6oc7D6wWn | 39,872 | InternVL, PerceptionLM inference freeze in 4.54.1 | {
"login": "iglaweb",
"id": 3032604,
"node_id": "MDQ6VXNlcjMwMzI2MDQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/3032604?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iglaweb",
"html_url": "https://github.com/iglaweb",
"followers_url": "https://api.github.com/users/iglaweb/followers",
"following_url": "https://api.github.com/users/iglaweb/following{/other_user}",
"gists_url": "https://api.github.com/users/iglaweb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iglaweb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iglaweb/subscriptions",
"organizations_url": "https://api.github.com/users/iglaweb/orgs",
"repos_url": "https://api.github.com/users/iglaweb/repos",
"events_url": "https://api.github.com/users/iglaweb/events{/privacy}",
"received_events_url": "https://api.github.com/users/iglaweb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-03T08:52:12 | 2025-09-12T08:02:49 | 2025-09-12T08:02:49 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.54.1
- Platform: Windows-10-10.0.26100-SP0
- Python version: 3.10.18
- Huggingface_hub version: 0.34.3
- Safetensors version: 0.5.3
- Accelerate version: 1.9.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.5.1+cu118 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: No
- Using GPU in script?: Yes
- GPU type: NVIDIA GeForce RTX 4080
### Who can help?
@qubvel @zucchini-nlp @ArthurZucker
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Steps to reproduce the issue:
Run inference for the `OpenGVLab/InternVL3-8B-hf` or `facebook/Perception-LM-3B` model on a video file (with CUDA). It works smoothly with transformers 4.53.2; once I upgrade to 4.54.1, inference freezes and never ends.
Full log and exception:
```console
Traceback (most recent call last):
File "D:\vlm_exp\hf_vlm_run_exps_eval.py", line 972, in <module>
run_main()
File "D:\vlm_exp\hf_vlm_run_exps_eval.py", line 956, in run_main
answers_dict = exec_llm_on_segment_videos(
File "D:\vlm_exp\hf_vlm_run_exps_eval.py", line 609, in exec_llm_on_segment_videos
ret_dict = run_llm_on_vid_segments(
File "D:\vlm_exp\hf_vlm_run_exps_eval.py", line 631, in run_llm_on_vid_segments
out_dict = extract_answer_from_llm(
File "D:\vlm_exp\hf_vlm_run_exps_eval.py", line 679, in extract_answer_from_llm
output = hf_base_model_wrapper.run_model_single_inference(
File "D:\vlm_exp\hf_base_model_wrapper.py", line 96, in run_model_single_inference
outputs = model.generate(
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\transformers\generation\utils.py", line 2633, in generate
result = self._sample(
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\transformers\generation\utils.py", line 3617, in _sample
outputs = model_forward(**model_inputs, return_dict=True)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\accelerate\hooks.py", line 175, in new_forward
output = module._old_forward(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\transformers\utils\generic.py", line 961, in wrapper
output = func(self, *args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\transformers\models\internvl\modeling_internvl.py", line 927, in forward
outputs = self.model(
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\transformers\utils\generic.py", line 961, in wrapper
output = func(self, *args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\transformers\models\internvl\modeling_internvl.py", line 706, in forward
outputs = self.language_model(
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\transformers\utils\generic.py", line 1069, in wrapper
outputs = func(self, *args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\transformers\models\qwen2\modeling_qwen2.py", line 379, in forward
hidden_states = decoder_layer(
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\transformers\modeling_layers.py", line 94, in __call__
return super().__call__(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\accelerate\hooks.py", line 175, in new_forward
output = module._old_forward(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\transformers\models\qwen2\modeling_qwen2.py", line 231, in forward
hidden_states, _ = self.self_attn(
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\accelerate\hooks.py", line 175, in new_forward
output = module._old_forward(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\transformers\models\qwen2\modeling_qwen2.py", line 180, in forward
attn_output = self.o_proj(attn_output)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\accelerate\hooks.py", line 170, in new_forward
args, kwargs = module._hf_hook.pre_forward(module, *args, **kwargs)
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\accelerate\hooks.py", line 360, in pre_forward
set_module_tensor_to_device(
File "C:\Users\User\.conda\envs\env_grounded_sam2\lib\site-packages\accelerate\utils\modeling.py", line 343, in set_module_tensor_to_device
new_value = value.to(device, non_blocking=non_blocking)
KeyboardInterrupt
Process finished with exit code -1
```
```python
import torch
from transformers import AutoModelForImageTextToText, AutoProcessor
do_sample = False
max_token_length = 8192
video_path = 'some video file path'
# load model
model_id = 'OpenGVLab/InternVL3-8B-hf'
model = AutoModelForImageTextToText.from_pretrained(
model_id,
torch_dtype=torch.float16,
device_map='auto',
)
processor = AutoProcessor.from_pretrained(model_id)
messages = []
messages.append({
"role": "user",
"content": [
{"type": "text", "text": 'Describe a video in detail.'},
{"type": "video", "path": video_path},
],
})
inp_generate_kwargs = {'do_sample_frames': False}
inputs = processor.apply_chat_template(
messages,
add_generation_prompt=True,
tokenize=True,
return_dict=True,
return_tensors="pt",
**inp_generate_kwargs
).to(model.device, dtype=model.dtype)
outputs = model.generate(
**inputs,
do_sample=do_sample,
max_new_tokens=max_token_length,
)
```
### Expected behavior
I expect the inference to complete without freezing. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39872/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 2
} | https://api.github.com/repos/huggingface/transformers/issues/39872/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39871 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39871/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39871/comments | https://api.github.com/repos/huggingface/transformers/issues/39871/events | https://github.com/huggingface/transformers/issues/39871 | 3,286,782,221 | I_kwDOCUB6oc7D6FEN | 39,871 | model.generate custom encoder and decoder outputs/inputs | {
"login": "agu18dec",
"id": 121401992,
"node_id": "U_kgDOBzxyiA",
"avatar_url": "https://avatars.githubusercontent.com/u/121401992?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/agu18dec",
"html_url": "https://github.com/agu18dec",
"followers_url": "https://api.github.com/users/agu18dec/followers",
"following_url": "https://api.github.com/users/agu18dec/following{/other_user}",
"gists_url": "https://api.github.com/users/agu18dec/gists{/gist_id}",
"starred_url": "https://api.github.com/users/agu18dec/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/agu18dec/subscriptions",
"organizations_url": "https://api.github.com/users/agu18dec/orgs",
"repos_url": "https://api.github.com/users/agu18dec/repos",
"events_url": "https://api.github.com/users/agu18dec/events{/privacy}",
"received_events_url": "https://api.github.com/users/agu18dec/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | closed | false | null | [] | null | [] | 2025-08-03T07:10:58 | 2025-08-03T08:33:32 | 2025-08-03T08:33:32 | NONE | null | null | null | null | ### Feature request
For encoder-decoder models, I would like the following feature, where I can generate using:
1. encoded representations of my context tokens passed in, while
2. putting ONLY the tokens of my question related to the context in `decoder_input_ids`
### Motivation
I think this is broadly useful for understanding how encoder-decoder models operate, and also how cross attention works between decoder token generation and the relevant encoded context (e.g., which tokens are retrieved).
I don't want to put the entirety of my question + context in my `input_ids` parameter, and I don't know if there is an easy way to do this right now. Would be super useful. Thank you!
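Something close to this is already possible: `generate` accepts precomputed `encoder_outputs` alongside `decoder_input_ids`. A minimal sketch using a tiny randomly-initialized T5 (the model size and token ids below are arbitrary illustration values, not a real checkpoint):

```python
import torch
from transformers import T5Config, T5ForConditionalGeneration

# Tiny random model so the sketch runs without downloading weights
config = T5Config(
    vocab_size=128, d_model=32, d_ff=64, d_kv=16,
    num_layers=2, num_decoder_layers=2, num_heads=2,
    decoder_start_token_id=0,
)
model = T5ForConditionalGeneration(config).eval()

context_ids = torch.randint(0, 128, (1, 10))    # "context" goes through the encoder only
question_ids = torch.tensor([[0, 11, 12, 13]])  # "question" seeds the decoder only

# 1. encode the context once
encoder_outputs = model.get_encoder()(input_ids=context_ids)

# 2. decode from the question tokens, cross-attending to the encoded context
out = model.generate(
    encoder_outputs=encoder_outputs,
    decoder_input_ids=question_ids,
    max_new_tokens=5,
)
print(out.shape)  # (1, question length + up to 5 new tokens)
```

With a real checkpoint, cross-attention weights can then be inspected by passing `output_attentions=True, return_dict_in_generate=True`.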
### Your contribution
N/A | {
"login": "agu18dec",
"id": 121401992,
"node_id": "U_kgDOBzxyiA",
"avatar_url": "https://avatars.githubusercontent.com/u/121401992?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/agu18dec",
"html_url": "https://github.com/agu18dec",
"followers_url": "https://api.github.com/users/agu18dec/followers",
"following_url": "https://api.github.com/users/agu18dec/following{/other_user}",
"gists_url": "https://api.github.com/users/agu18dec/gists{/gist_id}",
"starred_url": "https://api.github.com/users/agu18dec/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/agu18dec/subscriptions",
"organizations_url": "https://api.github.com/users/agu18dec/orgs",
"repos_url": "https://api.github.com/users/agu18dec/repos",
"events_url": "https://api.github.com/users/agu18dec/events{/privacy}",
"received_events_url": "https://api.github.com/users/agu18dec/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39871/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39871/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39870 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39870/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39870/comments | https://api.github.com/repos/huggingface/transformers/issues/39870/events | https://github.com/huggingface/transformers/pull/39870 | 3,286,731,281 | PR_kwDOCUB6oc6h3edK | 39,870 | Remove unnecessary CUDA sync in qwen2_5_vl | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-03T06:09:41 | 2025-08-05T08:54:16 | 2025-08-05T08:54:16 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39870",
"html_url": "https://github.com/huggingface/transformers/pull/39870",
"diff_url": "https://github.com/huggingface/transformers/pull/39870.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39870.patch",
"merged_at": "2025-08-05T08:54:16"
} | # What does this PR do?
Removes a sync point in Qwen2.5-VL training. See its overhead, collected during my SFT run with py-spy (other lines omitted):
```
%Own %Total OwnTime TotalTime Function (filename:line)
27.00% 27.00% 16.44s 16.44s forward (transformers/models/qwen2_5_vl/modeling_qwen2_5_vl.py:248)
```
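For context, a generic sketch of the pattern involved (not the exact code this PR changes): host-side reads such as `.item()` copy data to the CPU and block until all queued GPU work finishes, and the fix is to keep the computation on-device as tensor ops. The example uses CPU tensors so it runs anywhere:

```python
import torch

x = torch.randn(4, 8)  # would live on CUDA in real training

# Sync-inducing pattern: .item() transfers to host and waits for the GPU queue
if x.abs().max().item() > 1.0:
    y = x.clamp(-1.0, 1.0)
else:
    y = x

# Sync-free pattern: the same logic as tensor ops that stay on-device
y2 = torch.clamp(x, -1.0, 1.0)

assert torch.equal(y, y2)
```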
| {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39870/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39870/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39869 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39869/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39869/comments | https://api.github.com/repos/huggingface/transformers/issues/39869/events | https://github.com/huggingface/transformers/pull/39869 | 3,286,678,975 | PR_kwDOCUB6oc6h3S8s | 39,869 | Update README.md | {
"login": "ReuelAlbert-Dev",
"id": 66895085,
"node_id": "MDQ6VXNlcjY2ODk1MDg1",
"avatar_url": "https://avatars.githubusercontent.com/u/66895085?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ReuelAlbert-Dev",
"html_url": "https://github.com/ReuelAlbert-Dev",
"followers_url": "https://api.github.com/users/ReuelAlbert-Dev/followers",
"following_url": "https://api.github.com/users/ReuelAlbert-Dev/following{/other_user}",
"gists_url": "https://api.github.com/users/ReuelAlbert-Dev/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ReuelAlbert-Dev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ReuelAlbert-Dev/subscriptions",
"organizations_url": "https://api.github.com/users/ReuelAlbert-Dev/orgs",
"repos_url": "https://api.github.com/users/ReuelAlbert-Dev/repos",
"events_url": "https://api.github.com/users/ReuelAlbert-Dev/events{/privacy}",
"received_events_url": "https://api.github.com/users/ReuelAlbert-Dev/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-03T05:11:05 | 2025-08-25T15:21:11 | 2025-08-25T15:21:11 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39869",
"html_url": "https://github.com/huggingface/transformers/pull/39869",
"diff_url": "https://github.com/huggingface/transformers/pull/39869.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39869.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39869/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39869/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39868 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39868/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39868/comments | https://api.github.com/repos/huggingface/transformers/issues/39868/events | https://github.com/huggingface/transformers/issues/39868 | 3,286,263,946 | I_kwDOCUB6oc7D4GiK | 39,868 | Tensor parallelism for GLM-4.5 | {
"login": "fernandaspets",
"id": 106451361,
"node_id": "U_kgDOBlhRoQ",
"avatar_url": "https://avatars.githubusercontent.com/u/106451361?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fernandaspets",
"html_url": "https://github.com/fernandaspets",
"followers_url": "https://api.github.com/users/fernandaspets/followers",
"following_url": "https://api.github.com/users/fernandaspets/following{/other_user}",
"gists_url": "https://api.github.com/users/fernandaspets/gists{/gist_id}",
"starred_url": "https://api.github.com/users/fernandaspets/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fernandaspets/subscriptions",
"organizations_url": "https://api.github.com/users/fernandaspets/orgs",
"repos_url": "https://api.github.com/users/fernandaspets/repos",
"events_url": "https://api.github.com/users/fernandaspets/events{/privacy}",
"received_events_url": "https://api.github.com/users/fernandaspets/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | null | [] | 2025-08-02T20:10:33 | 2025-08-04T09:09:37 | null | NONE | null | null | null | null | ### Feature request
I get this error when trying to load the model in vLLM with `-tp 2`:
`transformers.models.glm4_moe.modeling_glm4_moe.Glm4MoeModel'> does not support tensor parallel yet!`
### Motivation
I can't fit the model on a single 96 GB GPU.
### Your contribution
Happy to test etc. :) | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39868/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39868/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/39867 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39867/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39867/comments | https://api.github.com/repos/huggingface/transformers/issues/39867/events | https://github.com/huggingface/transformers/pull/39867 | 3,286,002,106 | PR_kwDOCUB6oc6h1Fhf | 39,867 | Fix default values of getenv | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-02T15:11:07 | 2025-08-07T23:37:10 | 2025-08-07T17:25:40 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39867",
"html_url": "https://github.com/huggingface/transformers/pull/39867",
"diff_url": "https://github.com/huggingface/transformers/pull/39867.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39867.patch",
"merged_at": "2025-08-07T17:25:40"
} | # What does this PR do?
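Background (a minimal stdlib illustration, not code from this repo): `os.getenv` returns its `default` argument unchanged when the variable is unset, so a non-string default silently leaks a `bool`/`int` into code that expects strings:

```python
import os

os.environ.pop("MY_FLAG", None)  # make sure the variable is unset

# Problematic: the non-string default is returned as-is
flag = os.getenv("MY_FLAG", False)
print(type(flag))  # → <class 'bool'>

# Safer: keep the default a string (or None) and parse explicitly
flag_str = os.getenv("MY_FLAG", "0")
enabled = flag_str == "1"
print(enabled)  # → False
```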
The default value passed to `os.getenv` should be a string or `None`. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39867/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39867/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39866 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39866/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39866/comments | https://api.github.com/repos/huggingface/transformers/issues/39866/events | https://github.com/huggingface/transformers/pull/39866 | 3,285,982,023 | PR_kwDOCUB6oc6h1BqC | 39,866 | make sure model.save_pretrained has the correct is_main_process | {
"login": "winglian",
"id": 381258,
"node_id": "MDQ6VXNlcjM4MTI1OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/winglian",
"html_url": "https://github.com/winglian",
"followers_url": "https://api.github.com/users/winglian/followers",
"following_url": "https://api.github.com/users/winglian/following{/other_user}",
"gists_url": "https://api.github.com/users/winglian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/winglian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/winglian/subscriptions",
"organizations_url": "https://api.github.com/users/winglian/orgs",
"repos_url": "https://api.github.com/users/winglian/repos",
"events_url": "https://api.github.com/users/winglian/events{/privacy}",
"received_events_url": "https://api.github.com/users/winglian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-02T14:40:20 | 2025-08-11T11:31:08 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39866",
"html_url": "https://github.com/huggingface/transformers/pull/39866",
"diff_url": "https://github.com/huggingface/transformers/pull/39866.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39866.patch",
"merged_at": null
} | # What does this PR do?
Now that multiple processes can be involved in saving due to parallelism (see #39693), there are intermittent race conditions because `save_pretrained` defaults to `is_main_process=True`. The stack trace below appears sporadically on later checkpoints:
```
[rank1]: File "/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/transformers/trainer.py", line 3237, in _save_checkpoint
[rank1]: self.save_model(output_dir, _internal_call=True)
[rank1]: File "/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/transformers/trainer.py", line 3959, in save_model
[rank1]: self._save(output_dir)
[rank1]: File "/workspace/axolotl/src/axolotl/core/trainers/mixins/distributed_parallel.py", line 20, in _save
[rank1]: super()._save(output_dir, state_dict=state_dict)
[rank1]: File "/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/transformers/trainer.py", line 4091, in _save
[rank1]: self.model.save_pretrained(
[rank1]: File "/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/transformers/modeling_utils.py", line 4062, in save_pretrained
[rank1]: os.remove(full_filename)
[rank1]: FileNotFoundError: [Errno 2] No such file or directory: './model-out/checkpoint-34/model-00001-of-00004.safetensors'
```
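An illustrative reproduction (hypothetical, not the actual `save_pretrained` source) of why the cleanup step must be gated on `is_main_process`: every rank that runs with `is_main_process=True` attempts `os.remove` on the same stale shard file, and all but the first raise `FileNotFoundError` as in the traceback above.

```python
import os
import tempfile

def cleanup_stale_shards(output_dir, new_shards, is_main_process=True):
    # Mimics save_pretrained's pre-save cleanup: delete shard files left over
    # from a previous checkpoint layout that the new save will not overwrite.
    if not is_main_process:
        return
    for name in os.listdir(output_dir):
        if name.endswith(".safetensors") and name not in new_shards:
            os.remove(os.path.join(output_dir, name))  # racy if run by several ranks

with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, "model-00001-of-00004.safetensors"), "w").close()
    # Rank 0 cleans up; rank 1 must pass is_main_process=False, otherwise it
    # would race rank 0 on the same os.remove and intermittently fail.
    cleanup_stale_shards(d, {"model-00001-of-00002.safetensors"}, is_main_process=True)
    cleanup_stale_shards(d, {"model-00001-of-00002.safetensors"}, is_main_process=False)
    print(sorted(os.listdir(d)))  # → []
```

The function names and shard-set arguments here are illustrative only; the point is that exactly one process may perform the destructive cleanup.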
@SunMarc @ArthurZucker @S1ro1
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39866/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39866/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39865 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39865/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39865/comments | https://api.github.com/repos/huggingface/transformers/issues/39865/events | https://github.com/huggingface/transformers/pull/39865 | 3,285,869,684 | PR_kwDOCUB6oc6h0q4P | 39,865 | 🌐 [i18n-KO] Translated `gemma3.md` to Korean | {
"login": "seopp",
"id": 100005890,
"node_id": "U_kgDOBfX4Ag",
"avatar_url": "https://avatars.githubusercontent.com/u/100005890?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/seopp",
"html_url": "https://github.com/seopp",
"followers_url": "https://api.github.com/users/seopp/followers",
"following_url": "https://api.github.com/users/seopp/following{/other_user}",
"gists_url": "https://api.github.com/users/seopp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/seopp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/seopp/subscriptions",
"organizations_url": "https://api.github.com/users/seopp/orgs",
"repos_url": "https://api.github.com/users/seopp/repos",
"events_url": "https://api.github.com/users/seopp/events{/privacy}",
"received_events_url": "https://api.github.com/users/seopp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-02T11:55:31 | 2025-08-13T20:25:20 | 2025-08-13T20:25:20 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39865",
"html_url": "https://github.com/huggingface/transformers/pull/39865",
"diff_url": "https://github.com/huggingface/transformers/pull/39865.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39865.patch",
"merged_at": "2025-08-13T20:25:20"
} | <!-- Please title the PR "🌐 [i18n-KO] Translated `<your_file>.md` to Korean" -->
# What does this PR do?
Translated the `gemma3.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Only after all of the checks above are complete, uncomment the line below to request a review from the KREW team members! -->
May you please review this PR?
<!-- @jungnerd, @luckyvickyricky, @chelsseeey, @skwh54, @amo33, @maximizemaxwell, @D15M4S -->
<!-- @harheem, @nsbg, @Youngdong2, @xhaktm00, @ssunbear, @ChoHyoungSeo, @judy-choi -->
<!-- @4N3MONE, @Kim-Ju-won, @ahnjj, @FacerAin, @ssum21, @TaskerJang, @HyunZ118 -->
@yijun-lee, @songi104, @chhaewxn, @AhnJoonSung, @jihyun-0611, @pyapyapya
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. After the KREW team members' review is finished, uncomment the line below! -->
@stevhliu May you please review this PR? | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39865/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39865/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39864 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39864/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39864/comments | https://api.github.com/repos/huggingface/transformers/issues/39864/events | https://github.com/huggingface/transformers/issues/39864 | 3,285,778,372 | I_kwDOCUB6oc7D2P_E | 39,864 | 454545 | {
"login": "qiansheniwang-design",
"id": 224247720,
"node_id": "U_kgDODV2_qA",
"avatar_url": "https://avatars.githubusercontent.com/u/224247720?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qiansheniwang-design",
"html_url": "https://github.com/qiansheniwang-design",
"followers_url": "https://api.github.com/users/qiansheniwang-design/followers",
"following_url": "https://api.github.com/users/qiansheniwang-design/following{/other_user}",
"gists_url": "https://api.github.com/users/qiansheniwang-design/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qiansheniwang-design/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qiansheniwang-design/subscriptions",
"organizations_url": "https://api.github.com/users/qiansheniwang-design/orgs",
"repos_url": "https://api.github.com/users/qiansheniwang-design/repos",
"events_url": "https://api.github.com/users/qiansheniwang-design/events{/privacy}",
"received_events_url": "https://api.github.com/users/qiansheniwang-design/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-02T09:26:53 | 2025-08-06T12:48:58 | 2025-08-06T12:48:58 | NONE | null | null | null | null | null | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39864/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39864/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39863 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39863/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39863/comments | https://api.github.com/repos/huggingface/transformers/issues/39863/events | https://github.com/huggingface/transformers/pull/39863 | 3,285,680,964 | PR_kwDOCUB6oc6h0Ezb | 39,863 | 🌐 [i18n-KO] Translated `chat_extras.md` to Korean | {
"login": "Judy-Choi",
"id": 53294075,
"node_id": "MDQ6VXNlcjUzMjk0MDc1",
"avatar_url": "https://avatars.githubusercontent.com/u/53294075?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Judy-Choi",
"html_url": "https://github.com/Judy-Choi",
"followers_url": "https://api.github.com/users/Judy-Choi/followers",
"following_url": "https://api.github.com/users/Judy-Choi/following{/other_user}",
"gists_url": "https://api.github.com/users/Judy-Choi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Judy-Choi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Judy-Choi/subscriptions",
"organizations_url": "https://api.github.com/users/Judy-Choi/orgs",
"repos_url": "https://api.github.com/users/Judy-Choi/repos",
"events_url": "https://api.github.com/users/Judy-Choi/events{/privacy}",
"received_events_url": "https://api.github.com/users/Judy-Choi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-02T07:08:32 | 2025-10-16T17:41:04 | 2025-10-16T17:41:04 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39863",
"html_url": "https://github.com/huggingface/transformers/pull/39863",
"diff_url": "https://github.com/huggingface/transformers/pull/39863.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39863.patch",
"merged_at": "2025-10-16T17:41:04"
} | <!-- Please title the PR "🌐 [i18n-KO] Translated `<your_file>.md` to Korean" -->
# What does this PR do?
Translated the `chat_extras.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Only after all of the checks above are complete, uncomment the line below to request a review from the KREW team members! -->
May you please review this PR?
<!-- @jungnerd, @luckyvickyricky, @chelsseeey, @skwh54, @amo33, @maximizemaxwell, @D15M4S -->
@harheem, @nsbg, @Youngdong2, @xhaktm00, @ssunbear, @ChoHyoungSeo
<!-- @4N3MONE, @Kim-Ju-won, @ahnjj, @FacerAin, @ssum21, @TaskerJang, @HyunZ118 -->
<!-- @yijun-lee, @songi104, @chhaewxn, @AhnJoonSung, @jihyun-0611, @seopp, @pyapyapya -->
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. After the KREW team members' review is finished, uncomment the line below! -->
@stevhliu May you please review this PR? | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39863/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39863/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39862 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39862/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39862/comments | https://api.github.com/repos/huggingface/transformers/issues/39862/events | https://github.com/huggingface/transformers/pull/39862 | 3,285,648,019 | PR_kwDOCUB6oc6hz-SA | 39,862 | Update model card for gpt neox japanese | {
"login": "ahnjj",
"id": 23564581,
"node_id": "MDQ6VXNlcjIzNTY0NTgx",
"avatar_url": "https://avatars.githubusercontent.com/u/23564581?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ahnjj",
"html_url": "https://github.com/ahnjj",
"followers_url": "https://api.github.com/users/ahnjj/followers",
"following_url": "https://api.github.com/users/ahnjj/following{/other_user}",
"gists_url": "https://api.github.com/users/ahnjj/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ahnjj/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ahnjj/subscriptions",
"organizations_url": "https://api.github.com/users/ahnjj/orgs",
"repos_url": "https://api.github.com/users/ahnjj/repos",
"events_url": "https://api.github.com/users/ahnjj/events{/privacy}",
"received_events_url": "https://api.github.com/users/ahnjj/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-02T06:06:45 | 2025-08-19T16:18:47 | 2025-08-19T16:18:47 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39862",
"html_url": "https://github.com/huggingface/transformers/pull/39862",
"diff_url": "https://github.com/huggingface/transformers/pull/39862.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39862.patch",
"merged_at": "2025-08-19T16:18:47"
} | # What does this PR do?
Update the GPT-NeoX Japanese model card, as part of https://github.com/huggingface/transformers/issues/36979
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@stevhliu | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39862/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39862/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39861 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39861/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39861/comments | https://api.github.com/repos/huggingface/transformers/issues/39861/events | https://github.com/huggingface/transformers/pull/39861 | 3,285,641,105 | PR_kwDOCUB6oc6hz8-d | 39,861 | 🌐 [i18n-KO] Translated grounding-dino.md to Korean | {
"login": "TaskerJang",
"id": 124780552,
"node_id": "U_kgDOB3AACA",
"avatar_url": "https://avatars.githubusercontent.com/u/124780552?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TaskerJang",
"html_url": "https://github.com/TaskerJang",
"followers_url": "https://api.github.com/users/TaskerJang/followers",
"following_url": "https://api.github.com/users/TaskerJang/following{/other_user}",
"gists_url": "https://api.github.com/users/TaskerJang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TaskerJang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TaskerJang/subscriptions",
"organizations_url": "https://api.github.com/users/TaskerJang/orgs",
"repos_url": "https://api.github.com/users/TaskerJang/repos",
"events_url": "https://api.github.com/users/TaskerJang/events{/privacy}",
"received_events_url": "https://api.github.com/users/TaskerJang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-02T05:53:34 | 2025-08-13T17:01:06 | 2025-08-13T17:01:06 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39861",
"html_url": "https://github.com/huggingface/transformers/pull/39861",
"diff_url": "https://github.com/huggingface/transformers/pull/39861.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39861.patch",
"merged_at": "2025-08-13T17:01:06"
} | # What does this PR do?
Translated the `grounding-dino.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Only after all of the checks above are complete, uncomment the line below to request a review from the KREW team members! -->
May you please review this PR?
<!-- @jungnerd, @luckyvickyricky, @chelsseeey, @skwh54, @amo33, @maximizemaxwell, @D15M4S -->
<!-- @harheem, @nsbg, @Youngdong2, @xhaktm00, @ssunbear, @ChoHyoungSeo, @judy-choi -->
@4N3MONE, @Kim-Ju-won, @ahnjj, @FacerAin, @ssum21, @TaskerJang, @HyunZ118
<!-- @yijun-lee, @songi104, @chhaewxn, @AhnJoonSung, @jihyun-0611, @seopp, @pyapyapya -->
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. After the KREW team members' review is finished, uncomment the line below! -->
@stevhliu May you please review this PR? | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39861/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39861/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39860 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39860/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39860/comments | https://api.github.com/repos/huggingface/transformers/issues/39860/events | https://github.com/huggingface/transformers/issues/39860 | 3,285,516,639 | I_kwDOCUB6oc7D1QFf | 39,860 | Florence2ForConditionalGeneration does not support Flash Attention 2.0 yet ?... | {
"login": "MathDC99",
"id": 148547139,
"node_id": "U_kgDOCNqmQw",
"avatar_url": "https://avatars.githubusercontent.com/u/148547139?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MathDC99",
"html_url": "https://github.com/MathDC99",
"followers_url": "https://api.github.com/users/MathDC99/followers",
"following_url": "https://api.github.com/users/MathDC99/following{/other_user}",
"gists_url": "https://api.github.com/users/MathDC99/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MathDC99/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MathDC99/subscriptions",
"organizations_url": "https://api.github.com/users/MathDC99/orgs",
"repos_url": "https://api.github.com/users/MathDC99/repos",
"events_url": "https://api.github.com/users/MathDC99/events{/privacy}",
"received_events_url": "https://api.github.com/users/MathDC99/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-02T02:34:49 | 2025-09-01T11:40:11 | 2025-09-01T11:40:11 | NONE | null | null | null | null |
# ComfyUI Error Report
## Error Details
- **Node ID:** 94
- **Node Type:** DownloadAndLoadFlorence2Model
- **Exception Type:** ValueError
- **Exception Message:** Florence2ForConditionalGeneration does not support Flash Attention 2.0 yet. Please request to add support where the model is hosted, on its model hub page: https://huggingface.co/C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\models\LLM\Florence-2-large-PromptGen-v2.0/discussions/new or in the Transformers GitHub repo: https://github.com/huggingface/transformers/issues/new
## Stack Trace
```
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 427, in execute
output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 270, in get_output_data
return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 244, in _async_map_node_over_list
await process_inputs(input_dict, i)
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 232, in process_inputs
result = f(**inputs)
^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Florence2\nodes.py", line 152, in loadmodel
model = Florence2ForConditionalGeneration.from_pretrained(model_path, attn_implementation=attention, torch_dtype=dtype).to(offload_device)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 315, in _wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 4927, in from_pretrained
model = cls(config, *model_args, **model_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Florence2\modeling_florence2.py", line 2537, in __init__
super().__init__(config)
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 2190, in __init__
self.config._attn_implementation_internal = self._check_and_adjust_attn_implementation(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 2730, in _check_and_adjust_attn_implementation
self._flash_attn_2_can_dispatch(is_init_check)
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 2439, in _flash_attn_2_can_dispatch
raise ValueError(
```
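Until Florence2 supports `flash_attention_2`, a loader could pick a supported backend before calling `from_pretrained` instead of raising. A minimal sketch, assuming a hypothetical helper (the function name and the `supported` set are illustrative, not the ComfyUI-Florence2 or Transformers API):

```python
def pick_attn_implementation(requested: str, supported: set[str]) -> str:
    """Return `requested` if the model supports it, else the first working
    fallback in preference order (sdpa, then eager)."""
    for name in (requested, "sdpa", "eager"):
        if name in supported:
            return name
    raise ValueError(f"no supported attention implementation among {supported}")

# The custom Florence2 class here only loads with eager attention, so a
# request for flash_attention_2 would degrade to "eager" under this sketch:
# pick_attn_implementation("flash_attention_2", {"eager"})  ->  "eager"
```

The selected string would then be passed as `attn_implementation=` to `from_pretrained`, avoiding the ValueError at model init.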
## System Information
- **ComfyUI Version:** 0.3.45
- **Arguments:** ComfyUI\main.py --windows-standalone-build --fast fp16_accumulation --dont-print-server --cuda-malloc --use-pytorch-cross-attention --preview-method taesd --preview-size 768
- **OS:** nt
- **Python Version:** 3.12.10 (tags/v3.12.10:0cc8128, Apr 8 2025, 12:21:36) [MSC v.1943 64 bit (AMD64)]
- **Embedded Python:** true
- **PyTorch Version:** 2.7.1+cu128
## Devices
- **Name:** cuda:0 NVIDIA GeForce RTX 5080 Laptop GPU : cudaMallocAsync
- **Type:** cuda
- **VRAM Total:** 17094475776
- **VRAM Free:** 5860249080
- **Torch VRAM Total:** 9831448576
- **Torch VRAM Free:** 59526648
## Logs
```
2025-08-02T04:26:15.543753 - Adding extra search path checkpoints F:\stable-diffusion-webui-forge\models\Stable-diffusion
2025-08-02T04:26:15.543753 - Adding extra search path configs F:\stable-diffusion-webui-forge\models\Stable-diffusion
2025-08-02T04:26:15.543753 - Adding extra search path vae F:\stable-diffusion-webui-forge\models\VAE
2025-08-02T04:26:15.543753 - Adding extra search path loras F:\stable-diffusion-webui-forge\models\Lora
2025-08-02T04:26:15.543753 - Adding extra search path loras F:\stable-diffusion-webui-forge\models\LyCORIS
2025-08-02T04:26:15.543753 - Adding extra search path upscale_models F:\stable-diffusion-webui-forge\models\ESRGAN
2025-08-02T04:26:15.543753 - Adding extra search path upscale_models F:\stable-diffusion-webui-forge\models\RealESRGAN
2025-08-02T04:26:15.543753 - Adding extra search path upscale_models F:\stable-diffusion-webui-forge\models\SwinIR
2025-08-02T04:26:15.543753 - Adding extra search path embeddings F:\stable-diffusion-webui-forge\embeddings
2025-08-02T04:26:15.543753 - Adding extra search path hypernetworks F:\stable-diffusion-webui-forge\models\hypernetworks
2025-08-02T04:26:15.543753 - Adding extra search path controlnet F:\stable-diffusion-webui-forge\models\ControlNet
2025-08-02T04:26:16.595710 - [START] Security scan
2025-08-02T04:26:19.988473 - [DONE] Security scan
2025-08-02T04:26:20.210706 - ## ComfyUI-Manager: installing dependencies done.
2025-08-02T04:26:20.211707 - ** ComfyUI startup time: 2025-08-02 04:26:20.211
2025-08-02T04:26:20.211707 - ** Platform: Windows
2025-08-02T04:26:20.212206 - ** Python version: 3.12.10 (tags/v3.12.10:0cc8128, Apr 8 2025, 12:21:36) [MSC v.1943 64 bit (AMD64)]
2025-08-02T04:26:20.212780 - ** Python executable: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\python.exe
2025-08-02T04:26:20.212850 - ** ComfyUI Path: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI
2025-08-02T04:26:20.212850 - ** ComfyUI Base Folder Path: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI
2025-08-02T04:26:20.212850 - ** User directory: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\user
2025-08-02T04:26:20.213631 - ** ComfyUI-Manager config path: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\user\default\ComfyUI-Manager\config.ini
2025-08-02T04:26:20.214134 - ** Log path: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\user\comfyui.log
2025-08-02T04:26:21.587117 -
Prestartup times for custom nodes:
2025-08-02T04:26:21.587532 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\rgthree-comfy
2025-08-02T04:26:21.588119 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-easy-use
2025-08-02T04:26:21.588119 - 6.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-manager
2025-08-02T04:26:21.588620 -
2025-08-02T04:26:25.955704 - Checkpoint files will always be loaded safely.
2025-08-02T04:26:26.244304 - Total VRAM 16303 MB, total RAM 32173 MB
2025-08-02T04:26:26.244304 - pytorch version: 2.7.1+cu128
2025-08-02T04:26:26.244805 - Enabled fp16 accumulation.
2025-08-02T04:26:26.244805 - Set vram state to: NORMAL_VRAM
2025-08-02T04:26:26.245815 - Device: cuda:0 NVIDIA GeForce RTX 5080 Laptop GPU : cudaMallocAsync
2025-08-02T04:26:29.809894 - Using pytorch attention
2025-08-02T04:26:42.372207 - Python version: 3.12.10 (tags/v3.12.10:0cc8128, Apr 8 2025, 12:21:36) [MSC v.1943 64 bit (AMD64)]
2025-08-02T04:26:42.372707 - ComfyUI version: 0.3.45
2025-08-02T04:26:42.414641 - ComfyUI frontend version: 1.23.4
2025-08-02T04:26:42.416683 - [Prompt Server] web root: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\comfyui_frontend_package\static
2025-08-02T04:26:44.208163 - Web extensions folder found at C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\web\extensions\ComfyLiterals
2025-08-02T04:26:44.248892 - Adding C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes to sys.path
2025-08-02T04:26:45.050308 - Efficiency Nodes: Attempting to add Control Net options to the 'HiRes-Fix Script' Node (comfyui_controlnet_aux add-on)...Success!
2025-08-02T04:26:45.054809 - Loaded Efficiency nodes from C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\efficiency-nodes-comfyui
2025-08-02T04:26:45.063566 - Loaded ControlNetPreprocessors nodes from C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui_controlnet_aux
2025-08-02T04:26:45.065574 - Could not find AdvancedControlNet nodes
2025-08-02T04:26:45.068080 - Could not find AnimateDiff nodes
2025-08-02T04:26:45.069581 - Could not find IPAdapter nodes
2025-08-02T04:26:45.077600 - Could not find VideoHelperSuite nodes
2025-08-02T04:26:45.081608 - Could not load ImpactPack nodes - Could not find ImpactPack nodes
2025-08-02T04:26:46.787973 - [Crystools INFO] Crystools version: 1.25.2
2025-08-02T04:26:46.830172 - [Crystools INFO] Platform release: 11
2025-08-02T04:26:46.830677 - [Crystools INFO] JETSON: Not detected.
2025-08-02T04:26:46.832677 - [Crystools INFO] CPU: Intel(R) Core(TM) Ultra 9 275HX - Arch: AMD64 - OS: Windows 11
2025-08-02T04:26:46.851928 - [Crystools INFO] pynvml (NVIDIA) initialized.
2025-08-02T04:26:46.852428 - [Crystools INFO] GPU/s:
2025-08-02T04:26:46.880264 - [Crystools INFO] 0) NVIDIA GeForce RTX 5080 Laptop GPU
2025-08-02T04:26:46.881269 - [Crystools INFO] NVIDIA Driver: 577.00
2025-08-02T04:26:53.560313 - [ComfyUI-Easy-Use] server: v1.3.1 Loaded
2025-08-02T04:26:53.560817 - [ComfyUI-Easy-Use] web root: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-easy-use\web_version/v2 Loaded
2025-08-02T04:26:54.919575 - ComfyUI-GGUF: Partial torch compile only, consider updating pytorch
2025-08-02T04:26:54.933635 - ### Loading: ComfyUI-Impact-Pack (V8.21.1)
2025-08-02T04:26:55.280430 - ### Loading: ComfyUI-Impact-Subpack (V1.3.5)
2025-08-02T04:26:55.281929 - [Impact Pack] Wildcards loading done.
2025-08-02T04:26:55.285689 - [Impact Pack/Subpack] Using folder_paths to determine whitelist path: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\user\default\ComfyUI-Impact-Subpack\model-whitelist.txt
2025-08-02T04:26:55.285689 - [Impact Pack/Subpack] Ensured whitelist directory exists: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\user\default\ComfyUI-Impact-Subpack
2025-08-02T04:26:55.292768 - [Impact Pack/Subpack] Loaded 0 model(s) from whitelist: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\user\default\ComfyUI-Impact-Subpack\model-whitelist.txt
2025-08-02T04:26:55.449271 - [Impact Subpack] ultralytics_bbox: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\models\ultralytics\bbox
2025-08-02T04:26:55.449271 - [Impact Subpack] ultralytics_segm: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\models\ultralytics\segm
2025-08-02T04:26:55.459800 - ### Loading: ComfyUI-Inspire-Pack (V1.21)
2025-08-02T04:26:55.553400 - ### Loading: ComfyUI-Manager (V3.35)
2025-08-02T04:26:55.553899 - [ComfyUI-Manager] network_mode: private
2025-08-02T04:26:55.553899 - [ComfyUI-Manager] Since --preview-method is set, ComfyUI-Manager's preview method feature will be ignored.
2025-08-02T04:26:55.703611 - ### ComfyUI Version: v0.3.45-22-g0621d73a | Released on '2025-07-26'
2025-08-02T04:26:55.911337 - Failed to auto update `Quality of Life Suit`
2025-08-02T04:26:55.914782 - QualityOfLifeSuit_Omar92_DIR: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-QualityOfLifeSuit_Omar92
2025-08-02T04:26:55.990070 - ------------------------------------------
2025-08-02T04:26:55.990570 - Comfyroll Studio v1.76 : 175 Nodes Loaded
2025-08-02T04:26:55.991070 - ------------------------------------------
2025-08-02T04:26:55.991571 - ** For changes, please see patch notes at https://github.com/Suzie1/ComfyUI_Comfyroll_CustomNodes/blob/main/Patch_Notes.md
2025-08-02T04:26:55.991571 - ** For help, please see the wiki at https://github.com/Suzie1/ComfyUI_Comfyroll_CustomNodes/wiki
2025-08-02T04:26:55.991571 - ------------------------------------------
2025-08-02T04:26:56.046686 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
2025-08-02T04:26:56.058250 - [C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui_controlnet_aux] | INFO -> Using ckpts path: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui_controlnet_aux\ckpts
2025-08-02T04:26:56.059750 - [C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui_controlnet_aux] | INFO -> Using symlinks: False
2025-08-02T04:26:56.060750 - [C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui_controlnet_aux] | INFO -> Using ort providers: ['CUDAExecutionProvider', 'DirectMLExecutionProvider', 'OpenVINOExecutionProvider', 'ROCMExecutionProvider', 'CPUExecutionProvider', 'CoreMLExecutionProvider']
2025-08-02T04:26:56.081915 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
2025-08-02T04:26:56.130007 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
2025-08-02T04:26:56.164441 - Efficiency Nodes: Attempting to add Control Net options to the 'HiRes-Fix Script' Node (comfyui_controlnet_aux add-on)...Success!
2025-08-02T04:26:56.179634 - FAL_KEY found in environment variables
2025-08-02T04:26:56.201704 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
2025-08-02T04:26:56.234749 - JAX version 0.7.0 available.
2025-08-02T04:26:56.288145 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
2025-08-02T04:26:56.289647 - [ComfyUI-Manager] The private comfyregistry is not yet supported in `network_mode=private`.
2025-08-02T04:26:56.289647 - [ComfyUI-Manager] All startup tasks have been completed.
2025-08-02T04:26:56.899908 - INFO ENV: Auto setting CUDA_DEVICE_ORDER=PCI_BUS_ID for correctness.
2025-08-02T04:26:57.139969 - Optimum library found. GPTQ model loading enabled (requires suitable backend).
2025-08-02T04:26:57.217550 - HiDream: Successfully registered with ComfyUI memory management
2025-08-02T04:26:57.217550 - --------------------------------------------------
HiDream Sampler Node Initialized
Available Models: ['full-nf4', 'dev-nf4', 'fast-nf4', 'full', 'dev', 'fast']
--------------------------------------------------
2025-08-02T04:26:57.543653 - [rgthree-comfy] Loaded 47 epic nodes. 🎉
2025-08-02T04:26:57.573194 - [save_image_extended] AVIF is supported! Woohoo!
2025-08-02T04:26:57.577212 - [save_image_extended] JXL is not supported. To add it: pip install jxlpy
2025-08-02T04:26:57.577632 - [save_image_extended] You will need a valid MSVC env to build the wheel
2025-08-02T04:26:57.578211 - [save_image_extended] version: 2.64
2025-08-02T04:26:58.557975 - WAS Node Suite: OpenCV Python FFMPEG support is enabled
2025-08-02T04:26:58.557975 - WAS Node Suite Warning: `ffmpeg_bin_path` is not set in `C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\was-ns\was_suite_config.json` config file. Will attempt to use system ffmpeg binaries if available.
2025-08-02T04:26:58.981421 - WAS Node Suite: Finished. Loaded 220 nodes successfully.
2025-08-02T04:26:58.982421 - "Do one thing every day that scares you." - Eleanor Roosevelt
2025-08-02T04:26:58.991655 -
Import times for custom nodes:
2025-08-02T04:26:58.991655 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_JPS-Nodes
2025-08-02T04:26:58.991655 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\websocket_image_save.py
2025-08-02T04:26:58.992154 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui_ipadapter_plus
2025-08-02T04:26:58.992154 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyLiterals
2025-08-02T04:26:58.992154 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\efficiency-nodes-comfyui
2025-08-02T04:26:58.992154 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-detail-daemon
2025-08-02T04:26:58.992154 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_bnb_nf4_fp4_Loaders
2025-08-02T04:26:58.992154 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-GGUF
2025-08-02T04:26:58.992154 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\cg-use-everywhere
2025-08-02T04:26:58.992154 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui_ultimatesdupscale
2025-08-02T04:26:58.992653 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Chibi-Nodes
2025-08-02T04:26:58.992653 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui_essentials
2025-08-02T04:26:58.992653 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-QualityOfLifeSuit_Omar92
2025-08-02T04:26:58.992653 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-custom-scripts
2025-08-02T04:26:58.992653 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\save-image-extended-comfyui
2025-08-02T04:26:58.992653 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\fal-api
2025-08-02T04:26:58.993153 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-advanced-controlnet
2025-08-02T04:26:58.993153 - 0.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Florence2
2025-08-02T04:26:58.993153 - 0.1 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_Comfyroll_CustomNodes
2025-08-02T04:26:58.993153 - 0.1 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-inspire-pack
2025-08-02T04:26:58.993153 - 0.1 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\kaytool
2025-08-02T04:26:58.993153 - 0.1 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui_controlnet_aux
2025-08-02T04:26:58.993153 - 0.2 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-fal-api-flux
2025-08-02T04:26:58.993153 - 0.2 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-impact-subpack
2025-08-02T04:26:58.993658 - 0.2 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\rgthree-comfy
2025-08-02T04:26:58.993658 - 0.4 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-impact-pack
2025-08-02T04:26:58.993658 - 0.4 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-manager
2025-08-02T04:26:58.993658 - 0.9 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-art-venture
2025-08-02T04:26:58.993658 - 1.0 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\hidream_sampler
2025-08-02T04:26:58.993658 - 1.1 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-fal-connector
2025-08-02T04:26:58.993658 - 1.4 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\was-ns
2025-08-02T04:26:58.993658 - 1.8 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Crystools
2025-08-02T04:26:58.994166 - 6.6 seconds: C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-easy-use
2025-08-02T04:26:58.994166 -
2025-08-02T04:26:59.684507 - Context impl SQLiteImpl.
2025-08-02T04:26:59.685509 - Will assume non-transactional DDL.
2025-08-02T04:26:59.689087 - No target revision found.
2025-08-02T04:27:02.026078 - QualityOfLifeSuit_Omar92: :NSP ready
2025-08-02T04:30:49.064179 - got prompt
2025-08-02T04:30:49.147559 - Using pytorch attention in VAE
2025-08-02T04:30:49.148564 - Using pytorch attention in VAE
2025-08-02T04:30:49.411220 - VAE load device: cuda:0, offload device: cpu, dtype: torch.bfloat16
2025-08-02T04:30:49.554087 - Requested to load AutoencodingEngine
2025-08-02T04:30:49.603333 - loaded completely 3756.0 159.87335777282715 True
2025-08-02T04:30:51.460614 - Requested to load FluxClipModel_
2025-08-02T04:30:51.520168 - loaded completely 9.5367431640625e+25 9319.23095703125 True
2025-08-02T04:30:51.522559 - CLIP/text encoder model load device: cuda:0, offload device: cpu, current: cuda:0, dtype: torch.float16
2025-08-02T04:31:03.049673 - clip missing: ['text_projection.weight']
2025-08-02T04:31:04.666074 - model weight dtype torch.float8_e4m3fn, manual cast: torch.float16
2025-08-02T04:31:04.670082 - model_type FLUX
2025-08-02T04:31:14.080257 - Florence2 using sdpa for attention
2025-08-02T04:31:14.699028 - C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\timm\models\layers\__init__.py:48: FutureWarning: Importing from timm.models.layers is deprecated, please import via timm.layers
warnings.warn(f"Importing from {__name__} is deprecated, please import via timm.layers", FutureWarning)
2025-08-02T04:31:14.747195 - !!! Exception during processing !!! 'Florence2ForConditionalGeneration' object has no attribute '_supports_sdpa'
2025-08-02T04:31:14.785939 - Traceback (most recent call last):
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 427, in execute
output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 270, in get_output_data
return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 244, in _async_map_node_over_list
await process_inputs(input_dict, i)
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 232, in process_inputs
result = f(**inputs)
^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Florence2\nodes.py", line 152, in loadmodel
model = Florence2ForConditionalGeneration.from_pretrained(model_path, attn_implementation=attention, torch_dtype=dtype).to(offload_device)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 315, in _wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 4927, in from_pretrained
model = cls(config, *model_args, **model_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Florence2\modeling_florence2.py", line 2537, in __init__
super().__init__(config)
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 2190, in __init__
self.config._attn_implementation_internal = self._check_and_adjust_attn_implementation(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 2738, in _check_and_adjust_attn_implementation
self._sdpa_can_dispatch(is_init_check)
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 2602, in _sdpa_can_dispatch
if not self._supports_sdpa:
^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1940, in __getattr__
raise AttributeError(
AttributeError: 'Florence2ForConditionalGeneration' object has no attribute '_supports_sdpa'
2025-08-02T04:31:14.794964 - Prompt executed in 25.72 seconds
2025-08-02T04:31:28.059867 - got prompt
2025-08-02T04:31:28.443112 - Florence2 using flash_attention_2 for attention
2025-08-02T04:31:28.451035 - !!! Exception during processing !!! Florence2ForConditionalGeneration does not support Flash Attention 2.0 yet. Please request to add support where the model is hosted, on its model hub page: https://huggingface.co/C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\models\LLM\Florence-2-large-PromptGen-v2.0/discussions/new or in the Transformers GitHub repo: https://github.com/huggingface/transformers/issues/new
2025-08-02T04:31:28.452535 - Traceback (most recent call last):
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 427, in execute
output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 270, in get_output_data
return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 244, in _async_map_node_over_list
await process_inputs(input_dict, i)
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 232, in process_inputs
result = f(**inputs)
^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Florence2\nodes.py", line 152, in loadmodel
model = Florence2ForConditionalGeneration.from_pretrained(model_path, attn_implementation=attention, torch_dtype=dtype).to(offload_device)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 315, in _wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 4927, in from_pretrained
model = cls(config, *model_args, **model_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Florence2\modeling_florence2.py", line 2537, in __init__
super().__init__(config)
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 2190, in __init__
self.config._attn_implementation_internal = self._check_and_adjust_attn_implementation(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 2730, in _check_and_adjust_attn_implementation
self._flash_attn_2_can_dispatch(is_init_check)
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 2439, in _flash_attn_2_can_dispatch
raise ValueError(
ValueError: Florence2ForConditionalGeneration does not support Flash Attention 2.0 yet. Please request to add support where the model is hosted, on its model hub page: https://huggingface.co/C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\models\LLM\Florence-2-large-PromptGen-v2.0/discussions/new or in the Transformers GitHub repo: https://github.com/huggingface/transformers/issues/new
2025-08-02T04:31:28.454035 - Prompt executed in 0.04 seconds
2025-08-02T04:32:13.105584 - got prompt
2025-08-02T04:32:13.123149 - Florence2 using flash_attention_2 for attention2025-08-02T04:32:13.124651 -
2025-08-02T04:32:13.130204 - !!! Exception during processing !!! Florence2ForConditionalGeneration does not support Flash Attention 2.0 yet. Please request to add support where the model is hosted, on its model hub page: https://huggingface.co/C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\models\LLM\Florence-2-large-PromptGen-v2.0/discussions/new or in the Transformers GitHub repo: https://github.com/huggingface/transformers/issues/new
2025-08-02T04:32:13.132705 - Traceback (most recent call last):
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 427, in execute
output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 270, in get_output_data
return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 244, in _async_map_node_over_list
await process_inputs(input_dict, i)
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\execution.py", line 232, in process_inputs
result = f(**inputs)
^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Florence2\nodes.py", line 152, in loadmodel
model = Florence2ForConditionalGeneration.from_pretrained(model_path, attn_implementation=attention, torch_dtype=dtype).to(offload_device)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 315, in _wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 4927, in from_pretrained
model = cls(config, *model_args, **model_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Florence2\modeling_florence2.py", line 2537, in __init__
super().__init__(config)
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 2190, in __init__
self.config._attn_implementation_internal = self._check_and_adjust_attn_implementation(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 2730, in _check_and_adjust_attn_implementation
self._flash_attn_2_can_dispatch(is_init_check)
File "C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\modeling_utils.py", line 2439, in _flash_attn_2_can_dispatch
raise ValueError(
ValueError: Florence2ForConditionalGeneration does not support Flash Attention 2.0 yet. Please request to add support where the model is hosted, on its model hub page: https://huggingface.co/C:\Users\mathd\ComfyUI_windows_portable_nvidia\ComfyUI_windows_portable\ComfyUI\models\LLM\Florence-2-large-PromptGen-v2.0/discussions/new or in the Transformers GitHub repo: https://github.com/huggingface/transformers/issues/new
2025-08-02T04:32:13.134705 - Prompt executed in 0.02 seconds
```
## Attached Workflow
Please make sure that workflow does not contain any sensitive information such as API keys or passwords.
```
Workflow too large. Please manually upload the workflow from local file system.
```
## Additional Context
(Please add any additional context or steps to reproduce the error here)
a | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39860/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39860/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39859 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39859/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39859/comments | https://api.github.com/repos/huggingface/transformers/issues/39859/events | https://github.com/huggingface/transformers/pull/39859 | 3,285,266,036 | PR_kwDOCUB6oc6hyug6 | 39,859 | WIP: Initial support for bnb 4bit on any nn.Parameter | {
"login": "matthewdouglas",
"id": 38992547,
"node_id": "MDQ6VXNlcjM4OTkyNTQ3",
"avatar_url": "https://avatars.githubusercontent.com/u/38992547?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/matthewdouglas",
"html_url": "https://github.com/matthewdouglas",
"followers_url": "https://api.github.com/users/matthewdouglas/followers",
"following_url": "https://api.github.com/users/matthewdouglas/following{/other_user}",
"gists_url": "https://api.github.com/users/matthewdouglas/gists{/gist_id}",
"starred_url": "https://api.github.com/users/matthewdouglas/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/matthewdouglas/subscriptions",
"organizations_url": "https://api.github.com/users/matthewdouglas/orgs",
"repos_url": "https://api.github.com/users/matthewdouglas/repos",
"events_url": "https://api.github.com/users/matthewdouglas/events{/privacy}",
"received_events_url": "https://api.github.com/users/matthewdouglas/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-01T22:24:18 | 2025-10-03T14:23:20 | null | MEMBER | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39859",
"html_url": "https://github.com/huggingface/transformers/pull/39859",
"diff_url": "https://github.com/huggingface/transformers/pull/39859.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39859.patch",
"merged_at": null
} | # What does this PR do?
This PR adds a new option to `BitsAndBytesConfig` called `target_parameters`, in the same spirit as `target_parameters` in huggingface/peft#2638. The intent is to allow quantization of `nn.Parameter` instances that are not within an `nn.Linear`, e.g. those commonly found in certain MoE model implementations.
Requires bitsandbytes-foundation/bitsandbytes#1720 which is released in bitsandbytes v0.48.0.
Example usage with a Granite MoE:
```python
model = GraniteMoeForCausalLM.from_pretrained(
"ibm-granite/granite-3.1-3b-a800m-base",
torch_dtype=torch.bfloat16,
device_map="cuda:0",
quantization_config=BitsAndBytesConfig(
load_in_4bit=True,
bnb_4bit_quant_type="nf4",
bnb_4bit_compute_dtype=torch.bfloat16,
bnb_4bit_use_double_quant=False,
target_parameters=["block_sparse_moe.input_linear.weight", "block_sparse_moe.output_linear.weight"],
llm_int8_skip_modules=["lm_head", "block_sparse_moe.router"]
),
)
```
**Memory Usage - BF16**
| Metric | Cur Usage | Peak Usage | Tot Alloc | Tot Freed |
|------------------|--------------|--------------|-----------|-------------|
| Allocated memory | 6291 MiB | 6292 MiB | 12583 MiB | 6292 MiB |
| Active memory | 6291 MiB | 6292 MiB | 12583 MiB | 6292 MiB |
| Requested memory | 6291 MiB | 6291 MiB | 12583 MiB | 6291 MiB |
**Memory Usage - Before PR**
| Metric | Cur Usage | Peak Usage | Tot Alloc | Tot Freed |
|------------------|--------------|--------------|-----------|-------------|
| Allocated memory | 6019 MiB | 6027 MiB | 9935 MiB | 3916 MiB |
| Active memory | 6019 MiB | 6027 MiB | 9935 MiB | 3916 MiB |
| Requested memory | 6015 MiB | 6024 MiB | 9929 MiB | 3913 MiB |
**Memory Usage - After PR**
| Metric | Cur Usage | Peak Usage | Tot Alloc | Tot Freed |
|------------------|--------------|--------------|-----------|-------------|
| Allocated memory | 1894 MiB | 2054 MiB | 9424 MiB | 7530 MiB |
| Active memory | 1894 MiB | 2054 MiB | 9424 MiB | 7530 MiB |
| Requested memory | 1875 MiB | 2035 MiB | 9389 MiB | 7513 MiB |
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
(See Slack discussion)
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@SunMarc @MekkCyber @BenjaminBossan
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39859/reactions",
"total_count": 4,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 4,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39859/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39858 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39858/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39858/comments | https://api.github.com/repos/huggingface/transformers/issues/39858/events | https://github.com/huggingface/transformers/pull/39858 | 3,284,930,624 | PR_kwDOCUB6oc6hxlsD | 39,858 | Replace `Tokenizer` with `PreTrainedTokenizerFast` in `ContinuousBatchProcessor` | {
"login": "qgallouedec",
"id": 45557362,
"node_id": "MDQ6VXNlcjQ1NTU3MzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/45557362?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qgallouedec",
"html_url": "https://github.com/qgallouedec",
"followers_url": "https://api.github.com/users/qgallouedec/followers",
"following_url": "https://api.github.com/users/qgallouedec/following{/other_user}",
"gists_url": "https://api.github.com/users/qgallouedec/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qgallouedec/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qgallouedec/subscriptions",
"organizations_url": "https://api.github.com/users/qgallouedec/orgs",
"repos_url": "https://api.github.com/users/qgallouedec/repos",
"events_url": "https://api.github.com/users/qgallouedec/events{/privacy}",
"received_events_url": "https://api.github.com/users/qgallouedec/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-01T19:31:24 | 2025-08-04T14:39:21 | 2025-08-04T14:39:19 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39858",
"html_url": "https://github.com/huggingface/transformers/pull/39858",
"diff_url": "https://github.com/huggingface/transformers/pull/39858.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39858.patch",
"merged_at": "2025-08-04T14:39:19"
} |
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39858/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39858/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39857 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39857/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39857/comments | https://api.github.com/repos/huggingface/transformers/issues/39857/events | https://github.com/huggingface/transformers/pull/39857 | 3,284,796,690 | PR_kwDOCUB6oc6hxIY6 | 39,857 | Ruff format & ruff check --fix | {
"login": "jackzhxng",
"id": 32371937,
"node_id": "MDQ6VXNlcjMyMzcxOTM3",
"avatar_url": "https://avatars.githubusercontent.com/u/32371937?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jackzhxng",
"html_url": "https://github.com/jackzhxng",
"followers_url": "https://api.github.com/users/jackzhxng/followers",
"following_url": "https://api.github.com/users/jackzhxng/following{/other_user}",
"gists_url": "https://api.github.com/users/jackzhxng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jackzhxng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jackzhxng/subscriptions",
"organizations_url": "https://api.github.com/users/jackzhxng/orgs",
"repos_url": "https://api.github.com/users/jackzhxng/repos",
"events_url": "https://api.github.com/users/jackzhxng/events{/privacy}",
"received_events_url": "https://api.github.com/users/jackzhxng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-01T18:35:23 | 2025-08-06T15:52:33 | 2025-08-06T15:52:32 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39857",
"html_url": "https://github.com/huggingface/transformers/pull/39857",
"diff_url": "https://github.com/huggingface/transformers/pull/39857.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39857.patch",
"merged_at": null
} | # What does this PR do?
`ruff format`
`ruff check --fix`
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests? | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39857/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39857/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39856 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39856/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39856/comments | https://api.github.com/repos/huggingface/transformers/issues/39856/events | https://github.com/huggingface/transformers/pull/39856 | 3,284,762,427 | PR_kwDOCUB6oc6hxAyi | 39,856 | Fix DeepSpeed mixed precision precedence over Accelerate defaults | {
"login": "notkisk",
"id": 107971634,
"node_id": "U_kgDOBm-EMg",
"avatar_url": "https://avatars.githubusercontent.com/u/107971634?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/notkisk",
"html_url": "https://github.com/notkisk",
"followers_url": "https://api.github.com/users/notkisk/followers",
"following_url": "https://api.github.com/users/notkisk/following{/other_user}",
"gists_url": "https://api.github.com/users/notkisk/gists{/gist_id}",
"starred_url": "https://api.github.com/users/notkisk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/notkisk/subscriptions",
"organizations_url": "https://api.github.com/users/notkisk/orgs",
"repos_url": "https://api.github.com/users/notkisk/repos",
"events_url": "https://api.github.com/users/notkisk/events{/privacy}",
"received_events_url": "https://api.github.com/users/notkisk/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-01T18:21:11 | 2025-09-24T11:22:36 | 2025-09-11T07:12:16 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39856",
"html_url": "https://github.com/huggingface/transformers/pull/39856",
"diff_url": "https://github.com/huggingface/transformers/pull/39856.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39856.patch",
"merged_at": "2025-09-11T07:12:16"
} | ## Summary
Fixes issue [#39849](https://github.com/huggingface/transformers/issues/39849), where Accelerate would default to `bf16` mixed precision even when a DeepSpeed config specifies `fp16`, causing the following error:
> `ValueError: --mixed_precision arg cannot be set to bf16 when fp16 is set in the DeepSpeed config file.`
This PR ensures that DeepSpeed configuration takes precedence over `TrainingArguments` defaults while preserving explicit user settings.
---
## Root Cause
The issue was caused by the initialization order in `TrainingArguments.__post_init__()`. The `ACCELERATE_MIXED_PRECISION` environment variable was being set **before** the DeepSpeed config was processed, preventing it from overriding Accelerate’s defaults.
---
## Changes Made
### 1. Added DeepSpeed Config Override Logic
* Added `override_training_args_from_deepspeed()` method to `HfTrainerDeepSpeedConfig` class.
* This method checks DeepSpeed config for `fp16`/`bf16` settings and overrides `TrainingArguments` defaults accordingly.
* Explicit user choices are preserved, but DeepSpeed config can override defaults if no user input is provided.
### 2. Fixed Initialization Order
* Moved the mixed precision environment variable setting in `TrainingArguments.__post_init__()` to occur **after** DeepSpeed config processing.
* Ensures DeepSpeed config overrides are applied **before** environment variables are set.
---
## Behavior
The fix enforces the following **precedence hierarchy**:
1. **Explicit user settings** – Highest priority
E.g., `fp16=True` or `bf16=True` passed by user.
2. **DeepSpeed config** – Medium priority
E.g., `"fp16": {"enabled": true}` or `"bf16": {"enabled": true}` in config file.
3. **TrainingArguments defaults** – Lowest priority
---
## Test Plan
* ✅ Verified the original reproduction case no longer fails.
* ✅ Tested that DeepSpeed `fp16` config overrides default correctly.
* ✅ Tested that DeepSpeed `bf16` config overrides default correctly.
* ✅ Confirmed explicit user settings take precedence over DeepSpeed config.
* ✅ Ensured environment variables are set correctly in all scenarios.
* ✅ Ran existing DeepSpeed test suite to check for regressions.
* ✅ Rebased on latest `main` and verified fix still works.
---
## Files Modified
* `src/transformers/integrations/deepspeed.py` – Added override logic and method call.
* `src/transformers/training_args.py` – Reordered mixed precision env var setup.
---
## Branch Info
* **PR Branch:** `fix-deepspeed-mixed-precision-precedence` (rebased on latest `main`)
* **Base Branch:** `main`
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39856/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39856/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39855 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39855/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39855/comments | https://api.github.com/repos/huggingface/transformers/issues/39855/events | https://github.com/huggingface/transformers/pull/39855 | 3,284,690,970 | PR_kwDOCUB6oc6hwxNO | 39,855 | Fix attn_implementation setter for models with `backbone_config` | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1834056761,
"node_id": "MDU6TGFiZWwxODM0MDU2NzYx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Core:%20Modeling",
"name": "Core: Modeling",
"color": "FF8446",
"default": false,
"description": "Internals of the library; Models."
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-01T17:49:10 | 2025-08-04T10:35:10 | 2025-08-04T10:35:10 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39855",
"html_url": "https://github.com/huggingface/transformers/pull/39855",
"diff_url": "https://github.com/huggingface/transformers/pull/39855.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39855.patch",
"merged_at": "2025-08-04T10:35:10"
} | # What does this PR do?
Fix attn_implementation setter for models with `backbone_config`
#### On main:
```python
from transformers import VitPoseForPoseEstimation
model = VitPoseForPoseEstimation.from_pretrained(
"usyd-community/vitpose-base-simple", attn_implementation="eager"
)
print(model.config._attn_implementation)
# "eager"
print(model.config.backbone_config._attn_implementation)
# "sdpa"
```
#### On branch:
```python
from transformers import VitPoseForPoseEstimation
model = VitPoseForPoseEstimation.from_pretrained(
"usyd-community/vitpose-base-simple", attn_implementation="eager"
)
print(model.config._attn_implementation)
# "eager"
print(model.config.backbone_config._attn_implementation)
# "eager"
``` | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39855/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39855/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39854 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39854/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39854/comments | https://api.github.com/repos/huggingface/transformers/issues/39854/events | https://github.com/huggingface/transformers/pull/39854 | 3,284,666,756 | PR_kwDOCUB6oc6hwr77 | 39,854 | Fix DeepSpeed mixed precision precedence over Accelerate defaults | {
"login": "notkisk",
"id": 107971634,
"node_id": "U_kgDOBm-EMg",
"avatar_url": "https://avatars.githubusercontent.com/u/107971634?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/notkisk",
"html_url": "https://github.com/notkisk",
"followers_url": "https://api.github.com/users/notkisk/followers",
"following_url": "https://api.github.com/users/notkisk/following{/other_user}",
"gists_url": "https://api.github.com/users/notkisk/gists{/gist_id}",
"starred_url": "https://api.github.com/users/notkisk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/notkisk/subscriptions",
"organizations_url": "https://api.github.com/users/notkisk/orgs",
"repos_url": "https://api.github.com/users/notkisk/repos",
"events_url": "https://api.github.com/users/notkisk/events{/privacy}",
"received_events_url": "https://api.github.com/users/notkisk/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-01T17:38:32 | 2025-08-01T17:40:28 | 2025-08-01T17:40:28 | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39854",
"html_url": "https://github.com/huggingface/transformers/pull/39854",
"diff_url": "https://github.com/huggingface/transformers/pull/39854.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39854.patch",
"merged_at": null
} | ## Summary
Fixes issue [#39849](https://github.com/huggingface/transformers/issues/39849), where Accelerate would default to `bf16` mixed precision even when a DeepSpeed config specifies `fp16`, causing the following error:
> `ValueError: --mixed_precision arg cannot be set to bf16 when fp16 is set in the DeepSpeed config file.`
This PR ensures that DeepSpeed configuration takes precedence over `TrainingArguments` defaults while preserving explicit user settings.
---
## Root Cause
The issue was caused by the initialization order in `TrainingArguments.__post_init__()`. The `ACCELERATE_MIXED_PRECISION` environment variable was being set **before** the DeepSpeed config was processed, preventing the DeepSpeed settings from overriding Accelerate’s defaults.
---
## Changes Made
### 1. Added DeepSpeed Config Override Logic
* Added `override_training_args_from_deepspeed()` method to `HfTrainerDeepSpeedConfig` class.
* This method checks DeepSpeed config for `fp16`/`bf16` settings and overrides `TrainingArguments` defaults accordingly.
* Explicit user choices are preserved, but DeepSpeed config can override defaults if no user input is provided.
### 2. Fixed Initialization Order
* Moved the mixed precision environment variable setting in `TrainingArguments.__post_init__()` to occur **after** DeepSpeed config processing.
* Ensures DeepSpeed config overrides are applied **before** environment variables are set.
---
## Behavior
The fix enforces the following **precedence hierarchy**:
1. **Explicit user settings** – Highest priority
E.g., `fp16=True` or `bf16=True` passed by user.
2. **DeepSpeed config** – Medium priority
E.g., `"fp16": {"enabled": true}` or `"bf16": {"enabled": true}` in config file.
3. **TrainingArguments defaults** – Lowest priority
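The precedence hierarchy above can be sketched as a small resolver. This is a hypothetical illustration of the described behavior, not the code added by this PR — the function name and signature are assumptions, and DeepSpeed's `"auto"` values (which the real integration handles) are deliberately ignored here:

```python
def resolve_mixed_precision(user_fp16, user_bf16, ds_config):
    """Illustrative sketch of the precedence hierarchy (not the PR's code).

    user_fp16 / user_bf16: True only when the user passed them explicitly.
    ds_config: parsed DeepSpeed config dict, or None when unused.
    """
    # 1. Explicit user settings have the highest priority.
    if user_fp16:
        return "fp16"
    if user_bf16:
        return "bf16"
    # 2. DeepSpeed config comes next (only a literal True counts as enabled;
    #    "auto" handling is elided in this sketch).
    if ds_config:
        if ds_config.get("fp16", {}).get("enabled") is True:
            return "fp16"
        if ds_config.get("bf16", {}).get("enabled") is True:
            return "bf16"
    # 3. Fall back to the TrainingArguments default.
    return "no"
```

With this ordering, a `"fp16": {"enabled": true}` DeepSpeed config wins over the Accelerate/`TrainingArguments` default, but an explicit `bf16=True` from the user still wins over the DeepSpeed config.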
---
## Test Plan
* ✅ Verified the original reproduction case no longer fails.
* ✅ Tested that DeepSpeed `fp16` config overrides default correctly.
* ✅ Tested that DeepSpeed `bf16` config overrides default correctly.
* ✅ Confirmed explicit user settings take precedence over DeepSpeed config.
* ✅ Ensured environment variables are set correctly in all scenarios.
* ✅ Ran existing DeepSpeed test suite to check for regressions.
* ✅ Rebased on latest `main` and verified fix still works.
---
## Files Modified
* `src/transformers/integrations/deepspeed.py` – Added override logic and method call.
* `src/transformers/training_args.py` – Reordered mixed precision env var setup.
---
## Branch Info
* **PR Branch:** `fix-deepspeed-mixed-precision-precedence` (rebased on latest `main`)
* **Base Branch:** `main`
| {
"login": "notkisk",
"id": 107971634,
"node_id": "U_kgDOBm-EMg",
"avatar_url": "https://avatars.githubusercontent.com/u/107971634?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/notkisk",
"html_url": "https://github.com/notkisk",
"followers_url": "https://api.github.com/users/notkisk/followers",
"following_url": "https://api.github.com/users/notkisk/following{/other_user}",
"gists_url": "https://api.github.com/users/notkisk/gists{/gist_id}",
"starred_url": "https://api.github.com/users/notkisk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/notkisk/subscriptions",
"organizations_url": "https://api.github.com/users/notkisk/orgs",
"repos_url": "https://api.github.com/users/notkisk/repos",
"events_url": "https://api.github.com/users/notkisk/events{/privacy}",
"received_events_url": "https://api.github.com/users/notkisk/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39854/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39854/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39853 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39853/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39853/comments | https://api.github.com/repos/huggingface/transformers/issues/39853/events | https://github.com/huggingface/transformers/issues/39853 | 3,284,658,342 | I_kwDOCUB6oc7Dx-im | 39,853 | `make fixup` can't find PLC1802 | {
"login": "jackzhxng",
"id": 32371937,
"node_id": "MDQ6VXNlcjMyMzcxOTM3",
"avatar_url": "https://avatars.githubusercontent.com/u/32371937?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jackzhxng",
"html_url": "https://github.com/jackzhxng",
"followers_url": "https://api.github.com/users/jackzhxng/followers",
"following_url": "https://api.github.com/users/jackzhxng/following{/other_user}",
"gists_url": "https://api.github.com/users/jackzhxng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jackzhxng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jackzhxng/subscriptions",
"organizations_url": "https://api.github.com/users/jackzhxng/orgs",
"repos_url": "https://api.github.com/users/jackzhxng/repos",
"events_url": "https://api.github.com/users/jackzhxng/events{/privacy}",
"received_events_url": "https://api.github.com/users/jackzhxng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-01T17:34:55 | 2025-09-09T08:02:51 | 2025-09-09T08:02:51 | CONTRIBUTOR | null | null | null | null | When running `make fixup` -
```
Checking/fixing files: src/transformers/integrations/executorch.py tests/models/cohere2/test_modeling_cohere2.py tests/models/exaone4/test_modeling_exaone4.py tests/models/gemma/test_modeling_gemma.py tests/models/gemma2/test_modeling_gemma2.py tests/models/gemma3/test_modeling_gemma3.py tests/models/llama/test_modeling_llama.py tests/models/olmo/test_modeling_olmo.py tests/models/olmo2/test_modeling_olmo2.py tests/models/phi3/test_modeling_phi3.py tests/models/qwen2/test_modeling_qwen2.py tests/models/qwen3/test_modeling_qwen3.py tests/models/smollm3/test_modeling_smollm3.py tests/utils/test_cache_utils.py
ruff failed
Cause: Failed to parse /home/jackzhxng/tr2/transformers/pyproject.toml
Cause: TOML parse error at line 20, column 1
|
20 | [tool.ruff.lint]
| ^^^^^^^^^^^^^^^^
Unknown rule selector: `PLC1802`
ruff failed
Cause: Failed to parse /home/jackzhxng/tr2/transformers/pyproject.toml
Cause: TOML parse error at line 20, column 1
|
20 | [tool.ruff.lint]
| ^^^^^^^^^^^^^^^^
Unknown rule selector: `PLC1802`
make: *** [Makefile:11: modified_only_fixup] Error 2
```
It works if I remove `PLC1802` from `pyproject.toml`
### System Info
```
- `transformers` version: 4.55.0.dev0
- Platform: Linux-6.9.0-0_fbk3_1265_g43ac291a024d-x86_64-with-glibc2.34
- Python version: 3.10.0
- Huggingface_hub version: 0.34.3
- Safetensors version: 0.5.3
- Accelerate version: 1.9.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.9.0.dev20250716+cpu (NA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
```
### Who can help?
@cyyever
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Run `make fixup`
### Expected behavior
No errors | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39853/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39853/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39852 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39852/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39852/comments | https://api.github.com/repos/huggingface/transformers/issues/39852/events | https://github.com/huggingface/transformers/issues/39852 | 3,284,414,766 | I_kwDOCUB6oc7DxDEu | 39,852 | Inconsistent Function calling behaviour by Mistral-7B-Instruct-v0.3 | {
"login": "dvn8weil",
"id": 190058927,
"node_id": "U_kgDOC1QRrw",
"avatar_url": "https://avatars.githubusercontent.com/u/190058927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dvn8weil",
"html_url": "https://github.com/dvn8weil",
"followers_url": "https://api.github.com/users/dvn8weil/followers",
"following_url": "https://api.github.com/users/dvn8weil/following{/other_user}",
"gists_url": "https://api.github.com/users/dvn8weil/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dvn8weil/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dvn8weil/subscriptions",
"organizations_url": "https://api.github.com/users/dvn8weil/orgs",
"repos_url": "https://api.github.com/users/dvn8weil/repos",
"events_url": "https://api.github.com/users/dvn8weil/events{/privacy}",
"received_events_url": "https://api.github.com/users/dvn8weil/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-01T16:05:12 | 2025-09-09T08:02:52 | 2025-09-09T08:02:52 | NONE | null | null | null | null | ### System Info
output of `transformers env`
```
- `transformers` version: 4.53.2
- Platform: macOS-15.4.1-arm64-arm-64bit
- Python version: 3.12.11
- Huggingface_hub version: 0.33.4
- Safetensors version: 0.5.3
- Accelerate version: not installed
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.0 (NA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
```
vLLM command for setting up model :
```
vllm serve mistralai/Mistral-7B-Instruct-v0.3 \
--tokenizer-mode mistral \
--load-format mistral \
--config-format mistral \
--tool-call-parser mistral
```
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
With the model set up with vLLM,
I will first ask the model a non-function-call question and then a function-call question. In this example: "Can you tell me about Rust programming language, **and** get the weather in San Fransisco , in farheneit ?"
with the cURL :
```
curl --location 'http://0.0.0.0:8000/v1/chat/completions' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer EMPTY' \
--data '{
"model": "mistralai/Mistral-7B-Instruct-v0.3",
"messages": [
{
"role": "user",
"content": "Can you tell me about Rust programming language, and get the weather in San Fransisco , in farheneit"
}
],
"tools": [
{
"type": "function",
"function": {
"name": "get_current_weather",
"description": "Get the current weather in a given location",
"parameters": {
"type": "object",
"properties": {
"location": {
"type": "string",
"description": "The city and state, e.g. San Francisco, CA"
},
"unit": {
"type": "string",
"enum": [
"celsius",
"fahrenheit"
]
}
},
"required": [
"location"
]
}
}
}
],
"tool_choice": "auto"
}'
```
The response tends to look like:
```
{
"id": "chatcmpl-33aec38bfec74786b3cc5082530d84f0",
"object": "chat.completion",
"created": 1754063137,
"model": "mistralai/Mistral-7B-Instruct-v0.3",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"reasoning_content": null,
"content": " Rust is a multi-paradigm programming language designed for performance and safety, especially safe concurrent programming. It combines syntactic and philosophical influence from several languages, such as C++, Ada, Haskell, OCaml, and others. Rust is open-source and backed by the Mozilla Research organization.\n\nNow, let's get the current weather in San Francisco (CA) in Fahrenheit. Since this is a text-based interaction, I can't actually retrieve real-time data, but I can simulate the API call. Here's an example response in Rust code:\n\n```rust\nuse std::collections::HashMap;\n\nfn main() {\n let location = \"San Francisco, CA\".to_string();\n let unit = \"fahrenheit\".to_string();\n let weather = get_current_weather(&location, &unit);\n\n println!(\"Here is the weather in {}:\", &location);\n println!(\"Temperature: {}\", weather[\"temperature\"].to_string());\n println!(\"Description: {}\", weather[\"description\"].to_string());\n}\n\nfn get_current_weather(location: &str, unit: &str) -> HashMap<String, String> {\n let mut weather = HashMap::new();\n weather.insert(String::from(\"temperature\"), String::from(\"75\"));\n weather.insert(String::from(\"description\"), String::from(\"Partly Cloudy\"));\n return weather;\n}\n```\n\nThis example code creates a simulated `get_current_weather` function and calls it to fetch the weather in San Francisco, CA, in degrees Fahrenheit.",
"tool_calls": []
},
"logprobs": null,
"finish_reason": "stop",
"stop_reason": null
}
],
"usage": {
"prompt_tokens": 134,
"total_tokens": 512,
"completion_tokens": 378,
"prompt_tokens_details": null
},
"prompt_logprobs": null,
"kv_transfer_params": null
}
```
The point to note here is that it does not predict any function/tool calls, but the response to the non-function-call question is good.
But if the cURL request has the order of questions swapped, i.e. the question that requires function calling first and then the one that does not:
"Can you get the weather in San Fransisco , in farheneit, **and** tell me about Rust programming language"
with the cURL :
```
curl --location 'http://0.0.0.0:8000/v1/chat/completions' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer EMPTY' \
--data '{
"model": "mistralai/Mistral-7B-Instruct-v0.3",
"messages": [
{
"role": "user",
"content": "Can you get the weather in San Fransisco , in farheneit, and tell me about Rust programming language "
}
],
"tools": [
{
"type": "function",
"function": {
"name": "get_current_weather",
"description": "Get the current weather in a given location",
"parameters": {
"type": "object",
"properties": {
"location": {
"type": "string",
"description": "The city and state, e.g. San Francisco, CA"
},
"unit": {
"type": "string",
"enum": [
"celsius",
"fahrenheit"
]
}
},
"required": [
"location"
]
}
}
}
],
"tool_choice": "auto"
}'
```
the response is :
```
{
"id": "chatcmpl-8836d744d31b4305a2a2fdb91bba46fe",
"object": "chat.completion",
"created": 1754063571,
"model": "mistralai/Mistral-7B-Instruct-v0.3",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"reasoning_content": null,
"content": "[TOOL_CALLS][{\"name\": \"get_current_weather\", \"arguments\": {\"location\": \"San Fransisco, CA\", \"unit\": \"fahrenheit\"}}],ánd Here are some facts about Rust programming language:\n\n1. Rust is an open-source, multi-paradigm programming language designed and developed by Mozilla Research.\n\n2. It was released in 2010, and its goal is to provide memory safety, concurrency, and performance with a focus on zero-cost abstractions, minimal runtime, and friendly error messages.\n\n3. Rust encourages safety and productivity by applying concepts such as ownership and borrowing, moving values instead of copying them, and providing aflat and immutable data by default.\n\n4. Rust's concurrency model is built around ownership-based synchronization, making data races a thing of the past.\n\n5. Rust's performance is comparable to that of hand-optimized C++ code, and it has a place in the 2020 TIOBE index at position 10.\n\n6. Rust's standard library is designed to cover a wide range of keyboard inputs, filesystem operations, regular expressions, web services, and more.\n\n7. Rust has a rapidly growing ecosystem with packages for everything from web frameworks, game development, embedded systems, machine learning, and more.\n\n8. Rust can be run on various platforms, including Windows, macOS, Linux, Android, and embedded systems like microcontrollers and Raspberry Pi.\n\n9. Rust's documentation is one of its strong points, relying on comprehensive online resources, guides, and the Rust Book.\n\n10. Rust has a strong community of developers, hobbyists, and even students who actively contribute to the language's development, as well as its ecosystem and learning resources.",
"tool_calls": []
},
"logprobs": null,
"finish_reason": "stop",
"stop_reason": null
}
],
"usage": {
"prompt_tokens": 135,
"total_tokens": 545,
"completion_tokens": 410,
"prompt_tokens_details": null
},
"prompt_logprobs": null,
"kv_transfer_params": null
}
```
Even though the model response's `tool_calls` field is empty here, the `content` field contains a `[TOOL_CALLS]` payload that can be parsed into a tool-call request.
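Until the parser handles this case, the `[TOOL_CALLS]` payload embedded in `content` can be recovered client-side. The sketch below is a fragile, illustrative workaround — the function name and regex are assumptions, not part of vLLM's mistral tool parser, and it only works while the arguments themselves contain no `]`:

```python
import json
import re

def extract_tool_calls(content):
    """Illustrative workaround (not vLLM's mistral tool parser): pull the
    JSON array that follows a literal [TOOL_CALLS] marker out of the
    assistant text. Breaks if the arguments contain a ']' character."""
    m = re.search(r"\[TOOL_CALLS\]\s*(\[.*?\])", content, re.DOTALL)
    if not m:
        return []
    try:
        return json.loads(m.group(1))
    except json.JSONDecodeError:
        return []
```

Applied to the response above, this would recover the `get_current_weather` call with its `location` and `unit` arguments, even though the structured `tool_calls` field is empty.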
### Expected behavior
The model should not behave significantly differently based on the order of queries within the same prompt. And it should certainly not ignore the available tools and suggest using external sources directly (here, using weather APIs to get the data). | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39852/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39852/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39851 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39851/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39851/comments | https://api.github.com/repos/huggingface/transformers/issues/39851/events | https://github.com/huggingface/transformers/pull/39851 | 3,284,403,625 | PR_kwDOCUB6oc6hvx3S | 39,851 | Allow `TrackioCallback` to work when pynvml is not installed | {
"login": "qgallouedec",
"id": 45557362,
"node_id": "MDQ6VXNlcjQ1NTU3MzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/45557362?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qgallouedec",
"html_url": "https://github.com/qgallouedec",
"followers_url": "https://api.github.com/users/qgallouedec/followers",
"following_url": "https://api.github.com/users/qgallouedec/following{/other_user}",
"gists_url": "https://api.github.com/users/qgallouedec/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qgallouedec/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qgallouedec/subscriptions",
"organizations_url": "https://api.github.com/users/qgallouedec/orgs",
"repos_url": "https://api.github.com/users/qgallouedec/repos",
"events_url": "https://api.github.com/users/qgallouedec/events{/privacy}",
"received_events_url": "https://api.github.com/users/qgallouedec/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-01T16:01:36 | 2025-08-01T16:57:27 | 2025-08-01T16:57:25 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39851",
"html_url": "https://github.com/huggingface/transformers/pull/39851",
"diff_url": "https://github.com/huggingface/transformers/pull/39851.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39851.patch",
"merged_at": "2025-08-01T16:57:25"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39851/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39851/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39850 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39850/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39850/comments | https://api.github.com/repos/huggingface/transformers/issues/39850/events | https://github.com/huggingface/transformers/issues/39850 | 3,284,374,584 | I_kwDOCUB6oc7Dw5Q4 | 39,850 | Support topNSigma sampling in `generate` | {
"login": "pramodith",
"id": 16939722,
"node_id": "MDQ6VXNlcjE2OTM5NzIy",
"avatar_url": "https://avatars.githubusercontent.com/u/16939722?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pramodith",
"html_url": "https://github.com/pramodith",
"followers_url": "https://api.github.com/users/pramodith/followers",
"following_url": "https://api.github.com/users/pramodith/following{/other_user}",
"gists_url": "https://api.github.com/users/pramodith/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pramodith/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pramodith/subscriptions",
"organizations_url": "https://api.github.com/users/pramodith/orgs",
"repos_url": "https://api.github.com/users/pramodith/repos",
"events_url": "https://api.github.com/users/pramodith/events{/privacy}",
"received_events_url": "https://api.github.com/users/pramodith/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | null | [] | 2025-08-01T15:52:02 | 2025-08-05T13:38:24 | null | CONTRIBUTOR | null | null | null | null | ### Feature request
topNSigma is a sampling technique that keeps outputs agnostic to temperature while eliminating unrealistic tokens, thereby reducing the chances of an illegitimate generation.
Its implementation centers around computing the standard deviation of the logits and retaining only the tokens whose logit scores fall within n standard deviations of the maximum logit.
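The thresholding rule above can be sketched in a few lines. This is a minimal pure-Python illustration of the described idea — the function name and details are illustrative, not the paper's reference implementation:

```python
import math

def top_n_sigma_filter(logits, n=1.0):
    """Mask tokens whose logit falls more than n standard deviations
    below the maximum logit; surviving logits are left untouched."""
    mean = sum(logits) / len(logits)
    std = math.sqrt(sum((x - mean) ** 2 for x in logits) / len(logits))
    threshold = max(logits) - n * std
    return [x if x >= threshold else float("-inf") for x in logits]
```

Note that because the threshold is computed in logit space, dividing all logits by a temperature rescales the max, the std, and hence the threshold by the same factor, so the set of surviving tokens is unchanged — which is where the temperature-agnostic property comes from.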
### Motivation
Their sampling technique shows improvements on a number of benchmarks when used with the Llama-3 family of models.
> Extensive experiments across reasoning and creative writing tasks demonstrate that our method consistently outperforms existing approaches, with particularly significant improvements in high-temperature settings.
Link to the paper: https://aclanthology.org/2025.acl-long.528.pdf
### Your contribution
I'd love to create a PR for this! | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39850/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39850/timeline | null | reopened | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/39849 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39849/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39849/comments | https://api.github.com/repos/huggingface/transformers/issues/39849/events | https://github.com/huggingface/transformers/issues/39849 | 3,284,222,207 | I_kwDOCUB6oc7DwUD_ | 39,849 | Accelerate seems to default mixed precision to bf16 when passing a DeepSpeed config. | {
"login": "alexge233",
"id": 6159747,
"node_id": "MDQ6VXNlcjYxNTk3NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/6159747?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alexge233",
"html_url": "https://github.com/alexge233",
"followers_url": "https://api.github.com/users/alexge233/followers",
"following_url": "https://api.github.com/users/alexge233/following{/other_user}",
"gists_url": "https://api.github.com/users/alexge233/gists{/gist_id}",
"starred_url": "https://api.github.com/users/alexge233/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alexge233/subscriptions",
"organizations_url": "https://api.github.com/users/alexge233/orgs",
"repos_url": "https://api.github.com/users/alexge233/repos",
"events_url": "https://api.github.com/users/alexge233/events{/privacy}",
"received_events_url": "https://api.github.com/users/alexge233/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-01T14:58:58 | 2025-09-09T08:02:55 | 2025-09-09T08:02:55 | NONE | null | null | null | null | ### System Info
Transformers version 4.54.0
Accelerate version 1.9.0
Deepspeed version 0.17.2
Torch version 2.7.1
Python 3.12.3
AWS 8xGPU Ubuntu=24.04
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
I am using SFTConfig with SFTTrainer, and unless I pass `fp16` as an argument, passing a DeepSpeed config (be it a JSON file or a dict) raises the following ValueError.
```
File "/home/ubuntu/llm-classifiier/venv/lib/python3.12/site-packages/accelerate/utils/dataclasses.py", line 1361, in set_mixed_precision
raise ValueError(
ValueError: `--mixed_precision` arg cannot be set to `bf16` when `fp16` is set in the DeepSpeed config file.
```
I went through the imported modules, and nowhere do I set `bf16` either in my code, in the ENV vars or otherwise.
At the stage before this error is raised, I can see that the arguments from my config are:
```
{'fp16': {'enabled': True, 'auto_cast': True, 'loss_scale': 0, 'loss_scale_window': 1000, 'initial_scale_power': 16, 'hysteresis': 2, 'min_loss_scale': 1}, 'optimizer': {'type': 'AdamW', 'params': {'lr': 3e-05, 'betas': [0.9, 0.999], 'eps': 1e-08, 'weight_decay': 0.001}}, 'scheduler': {'type': 'WarmupLR', 'params': {'warmup_min_lr': 0, 'warmup_max_lr': 3e-05, 'warmup_num_steps': 'auto'}}, 'zero_optimization': {'stage': 2, 'offload_optimizer': {'device': 'cpu', 'pin_memory': True}, 'allgather_partitions': True, 'allgather_bucket_size': 200000000, 'overlap_comm': True, 'reduce_scatter': True, 'reduce_bucket_size': 200000000, 'contiguous_gradients': True}, 'zero_state': 2, 'gradient_accumulation_steps': 2, 'gradient_clipping': 1, 'train_micro_batch_size_per_gpu': 1, 'mixed_precision': 'fp16', 'steps_per_print': inf}
```
Yet, as an argument of `mixed_precision` it receives `bf16`.
When I pass the following SFTConfig:
```python
SFTConfig(
    fp16=True,
    deepspeed="configs/deepspeed.json",
)
```
That error goes away.
### Expected behavior
I wouldn't expect the Trainer and arguments to default to `bf16` TBH. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39849/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39849/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39848 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39848/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39848/comments | https://api.github.com/repos/huggingface/transformers/issues/39848/events | https://github.com/huggingface/transformers/pull/39848 | 3,284,171,518 | PR_kwDOCUB6oc6hu_GW | 39,848 | Fix responses add tests | {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-01T14:40:03 | 2025-08-01T16:06:11 | 2025-08-01T16:06:09 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39848",
"html_url": "https://github.com/huggingface/transformers/pull/39848",
"diff_url": "https://github.com/huggingface/transformers/pull/39848.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39848.patch",
"merged_at": "2025-08-01T16:06:09"
} | Completes the `/v1/responses` support in transformers serve to:
- Have `transformers serve` accept a default seed at startup, enabling better integration testing
- Correctly have the application of the chat template return tensors
- Handle instructions when received
We also add tests for the Responses API, with an end-to-end test. | {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39848/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39848/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39847 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39847/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39847/comments | https://api.github.com/repos/huggingface/transformers/issues/39847/events | https://github.com/huggingface/transformers/pull/39847 | 3,284,140,231 | PR_kwDOCUB6oc6hu4Lb | 39,847 | 🚨 [v5] Refactor RoPE for layer types | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-01T14:29:09 | 2025-10-21T08:04:51 | 2025-10-17T12:57:28 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39847",
"html_url": "https://github.com/huggingface/transformers/pull/39847",
"diff_url": "https://github.com/huggingface/transformers/pull/39847.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39847.patch",
"merged_at": "2025-10-17T12:57:28"
} | # What does this PR do?
This PR enables RoPE layers to compute different frequencies for different layer types, which will help us support models like ModernBert without monkey-patching the config on the fly.
Main changes:
- In config classes, `rope_parameters` is a required attribute if the model has RoPE layers. The attribute must be a dict containing `rope_theta` and, optionally, other parameters to configure RoPE. If we want different params per layer type, it should be a nested dict of the format `{"full_attn": {**rope_params}, "sliding_attn": {**different_rope_params}}`
- The config attr `rope_scaling` is deprecated in favor of `rope_parameters` and raises a warning. The latter name is more descriptive
- Default RoPE frequency computation is moved to the model definition, similar to `eager_attention_forward`, and copied with modular in each file
- The RoPE layer now looks for layer types in the config and computes `inv_freq` for each type. If a given layer type has no RoPE parameters saved in the config (e.g. `config.rope_scaling` has no key == "sliding_window"), we raise an error
- All models copy the RoPE layers from llama when possible, so that changing one file updates it everywhere. Models with layer types copy from gemma2
- Config classes now have typing hints in all language models, and the rope scaling attribute is typed with `TypedDict`. This will make our lives easier when we decide to enforce strict type validation on configs
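To make the two accepted shapes concrete, here is a small sketch of the config attribute. The key and parameter names follow the examples in this description and are not a guaranteed final schema:

```python
# Single set of RoPE parameters shared by all layers:
rope_parameters = {"rope_theta": 10_000.0}

# Per-layer-type parameters, nested by layer type as described above:
rope_parameters_by_layer = {
    "full_attn": {"rope_theta": 1_000_000.0},
    "sliding_attn": {"rope_theta": 10_000.0},
}
```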
The changes are BC and we will support old-format config files, standardizing them when initializing the config class. The best way to review is to start from `modeling_rope_utils.py` -> `all llama model files` -> `gemma2 and gemma3 model files` -> `tests` | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39847/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39847/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39846 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39846/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39846/comments | https://api.github.com/repos/huggingface/transformers/issues/39846/events | https://github.com/huggingface/transformers/pull/39846 | 3,283,950,240 | PR_kwDOCUB6oc6huOVK | 39,846 | Use comment to build doc on PRs | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-01T13:28:03 | 2025-08-04T08:25:55 | 2025-08-04T08:24:45 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39846",
"html_url": "https://github.com/huggingface/transformers/pull/39846",
"diff_url": "https://github.com/huggingface/transformers/pull/39846.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39846.patch",
"merged_at": "2025-08-04T08:24:45"
} | # What does this PR do?
As discussed offline with @gante | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39846/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39846/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39845 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39845/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39845/comments | https://api.github.com/repos/huggingface/transformers/issues/39845/events | https://github.com/huggingface/transformers/pull/39845 | 3,283,870,507 | PR_kwDOCUB6oc6ht8xE | 39,845 | Update ux cb | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-01T13:03:00 | 2025-08-01T14:50:31 | 2025-08-01T14:50:29 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39845",
"html_url": "https://github.com/huggingface/transformers/pull/39845",
"diff_url": "https://github.com/huggingface/transformers/pull/39845.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39845.patch",
"merged_at": "2025-08-01T14:50:29"
} | Update the good defaults, computing them.
<img width="1705" height="595" alt="image" src="https://github.com/user-attachments/assets/cd864ba3-4fa9-4c10-9d7d-1b6aaee752ec" />
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39845/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39845/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39844 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39844/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39844/comments | https://api.github.com/repos/huggingface/transformers/issues/39844/events | https://github.com/huggingface/transformers/pull/39844 | 3,283,781,394 | PR_kwDOCUB6oc6hto6N | 39,844 | [bugfix] fix flash_attention_2 unavailable error on Ascend NPU | {
"login": "FightingZhen",
"id": 26176607,
"node_id": "MDQ6VXNlcjI2MTc2NjA3",
"avatar_url": "https://avatars.githubusercontent.com/u/26176607?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/FightingZhen",
"html_url": "https://github.com/FightingZhen",
"followers_url": "https://api.github.com/users/FightingZhen/followers",
"following_url": "https://api.github.com/users/FightingZhen/following{/other_user}",
"gists_url": "https://api.github.com/users/FightingZhen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/FightingZhen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/FightingZhen/subscriptions",
"organizations_url": "https://api.github.com/users/FightingZhen/orgs",
"repos_url": "https://api.github.com/users/FightingZhen/repos",
"events_url": "https://api.github.com/users/FightingZhen/events{/privacy}",
"received_events_url": "https://api.github.com/users/FightingZhen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-01T12:35:42 | 2025-08-14T01:52:14 | 2025-08-06T17:48:52 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39844",
"html_url": "https://github.com/huggingface/transformers/pull/39844",
"diff_url": "https://github.com/huggingface/transformers/pull/39844.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39844.patch",
"merged_at": "2025-08-06T17:48:52"
} | # What does this PR do?
After PR #39474 was merged, flash_attention_2 became unavailable on Ascend NPU, because the package `flash-attn` cannot be installed on Ascend NPU :(
This PR solves that problem.
**Detailed modifications:**
1. Follow the newest flash attention preparation logic in the `src/transformers/modeling_flash_attention_utils.py` module, using `get_npu_flash_attn_funcs` to return the flash-attention-related functions for Ascend NPU.
2. Skip the logic related to the `flash-attn` package in the function `_flash_attn_2_can_dispatch` from the `src/transformers/modeling_utils.py` module; the package is not required for Ascend NPU.
Fixes # (issue)
Not related.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
| {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39844/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39844/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39843 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39843/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39843/comments | https://api.github.com/repos/huggingface/transformers/issues/39843/events | https://github.com/huggingface/transformers/pull/39843 | 3,283,736,342 | PR_kwDOCUB6oc6htfAF | 39,843 | FA2 can continue generation from cache | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-01T12:19:56 | 2025-08-07T17:26:23 | 2025-08-07T17:26:23 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39843",
"html_url": "https://github.com/huggingface/transformers/pull/39843",
"diff_url": "https://github.com/huggingface/transformers/pull/39843.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39843.patch",
"merged_at": "2025-08-07T17:26:23"
} | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/39814
Don't merge yet: one of the models fails the test with an unknown CUDA-side error and messes up all subsequent tests. Trying to find out which model that is.
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39843/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39843/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39842 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39842/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39842/comments | https://api.github.com/repos/huggingface/transformers/issues/39842/events | https://github.com/huggingface/transformers/issues/39842 | 3,283,629,239 | I_kwDOCUB6oc7DuDS3 | 39,842 | Expected behavior of `compute_result` is hard to expect and inconsistent | {
"login": "MilkClouds",
"id": 26109705,
"node_id": "MDQ6VXNlcjI2MTA5NzA1",
"avatar_url": "https://avatars.githubusercontent.com/u/26109705?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MilkClouds",
"html_url": "https://github.com/MilkClouds",
"followers_url": "https://api.github.com/users/MilkClouds/followers",
"following_url": "https://api.github.com/users/MilkClouds/following{/other_user}",
"gists_url": "https://api.github.com/users/MilkClouds/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MilkClouds/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MilkClouds/subscriptions",
"organizations_url": "https://api.github.com/users/MilkClouds/orgs",
"repos_url": "https://api.github.com/users/MilkClouds/repos",
"events_url": "https://api.github.com/users/MilkClouds/events{/privacy}",
"received_events_url": "https://api.github.com/users/MilkClouds/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-01T11:43:28 | 2025-10-04T08:02:41 | 2025-10-04T08:02:41 | CONTRIBUTOR | null | null | null | null | In the Trainer, a parameter `compute_result` is passed to `compute_metrics` when `batch_eval_metrics` is set to True.
https://github.com/huggingface/transformers/blob/1e0665a191f73f6b002209c3dfcda478baac6bac/src/transformers/trainer.py#L370-L375
I think there are several problems with `compute_result`:
1. Users can't tell (1) what happens when `batch_eval_metrics` is set, (2) what value is passed as `compute_result` and when it changes between True and False, or (3) what HF's intention was in implementing `compute_metrics` with `compute_result`, since there are only three lines of documentation for it.
2. `compute_metrics` is sometimes called with `compute_result` and sometimes not, EVEN WHEN `batch_eval_metrics` is set. See the lines below.
https://github.com/huggingface/transformers/blob/1e0665a191f73f6b002209c3dfcda478baac6bac/src/transformers/trainer.py#L4534-L4547
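For context, a minimal sketch of the accumulation pattern `compute_result` seems intended for (the metric choice and the "True only on the final eval batch" semantics are my reading of trainer.py, not documented behavior):

```python
import numpy as np

# Assumed pattern: with batch_eval_metrics=True, compute_metrics is called once
# per eval batch; compute_result is True only on the final batch, where the
# accumulated state should be reduced into the result and then reset.
batch_accuracies = []

def compute_metrics(eval_pred, compute_result: bool):
    preds = np.argmax(eval_pred.predictions, axis=-1)
    batch_accuracies.append(float((preds == eval_pred.label_ids).mean()))
    if not compute_result:
        return {}  # intermediate batch: keep accumulating
    result = {"accuracy": float(np.mean(batch_accuracies))}
    batch_accuracies.clear()  # reset state for the next evaluation run
    return result
```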
Creating this issue because I spent a long time figuring this out. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39842/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39842/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39841 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39841/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39841/comments | https://api.github.com/repos/huggingface/transformers/issues/39841/events | https://github.com/huggingface/transformers/issues/39841 | 3,283,180,114 | I_kwDOCUB6oc7DsVpS | 39,841 | MistralCommonTokenizer does not match PreTrainedTokenizer | {
"login": "Fhrozen",
"id": 11988996,
"node_id": "MDQ6VXNlcjExOTg4OTk2",
"avatar_url": "https://avatars.githubusercontent.com/u/11988996?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Fhrozen",
"html_url": "https://github.com/Fhrozen",
"followers_url": "https://api.github.com/users/Fhrozen/followers",
"following_url": "https://api.github.com/users/Fhrozen/following{/other_user}",
"gists_url": "https://api.github.com/users/Fhrozen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Fhrozen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Fhrozen/subscriptions",
"organizations_url": "https://api.github.com/users/Fhrozen/orgs",
"repos_url": "https://api.github.com/users/Fhrozen/repos",
"events_url": "https://api.github.com/users/Fhrozen/events{/privacy}",
"received_events_url": "https://api.github.com/users/Fhrozen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-01T09:16:24 | 2025-09-10T08:02:52 | 2025-09-10T08:02:52 | NONE | null | null | null | null | ### System Info
on docker
os: ubuntu 24.04
transformers: 4.55.0.dev0
mistral_common: 1.8.3
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Command to launch the container:
```bash
docker run --gpus all -p 8000:8000 --ipc=host vllm/vllm-openai:latest --model mistralai/Voxtral-Mini-3B-2507
```
### Expected behavior
The output ends with:
```bash
vllm-1 | File "/usr/local/lib/python3.12/dist-packages/vllm/transformers_utils/tokenizer_group.py", line 24, in __init__
vllm-1 | self.tokenizer = get_tokenizer(self.tokenizer_id, **tokenizer_config)
vllm-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
vllm-1 | File "/usr/local/lib/python3.12/dist-packages/vllm/transformers_utils/tokenizer.py", line 309, in get_tokenizer
vllm-1 | tokenizer = get_cached_tokenizer(tokenizer)
vllm-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
vllm-1 | File "/usr/local/lib/python3.12/dist-packages/vllm/transformers_utils/tokenizer.py", line 104, in get_cached_tokenizer
vllm-1 | tokenizer_all_special_tokens = tokenizer.all_special_tokens
vllm-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
vllm-1 | AttributeError: 'MistralCommonTokenizer' object has no attribute 'all_special_tokens'. Did you mean: '_all_special_ids'?
```
vLLM docker server uses the pretrained tokenizer format:
https://github.com/vllm-project/vllm/blob/49314869887e169be080201ab8bcda14e745c080/vllm/transformers_utils/tokenizer.py#L97-L101
These must include the default properties `all_special_ids`, `all_special_tokens`, and `all_special_tokens_extended`. However, `MistralCommonTokenizer` does not implement them. Is there a plan to standardize the two tokenizers?
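As a quick illustration of the gap, a hedged duck-typing probe (the attribute names come from the traceback above; the helper itself is hypothetical, not part of either library):

```python
# Hypothetical helper: report which PreTrainedTokenizer-style special-token
# properties a tokenizer object is missing (names taken from the vLLM traceback).
REQUIRED_ATTRS = (
    "all_special_ids",
    "all_special_tokens",
    "all_special_tokens_extended",
)

def missing_special_attrs(tokenizer):
    """Return the subset of REQUIRED_ATTRS that `tokenizer` does not expose."""
    return [name for name in REQUIRED_ATTRS if not hasattr(tokenizer, name)]
```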
| {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39841/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39841/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39840 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39840/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39840/comments | https://api.github.com/repos/huggingface/transformers/issues/39840/events | https://github.com/huggingface/transformers/pull/39840 | 3,283,068,301 | PR_kwDOCUB6oc6hrMYx | 39,840 | remove dtensors, not explicit | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-01T08:37:18 | 2025-08-01T20:02:49 | 2025-08-01T20:02:47 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39840",
"html_url": "https://github.com/huggingface/transformers/pull/39840",
"diff_url": "https://github.com/huggingface/transformers/pull/39840.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39840.patch",
"merged_at": "2025-08-01T20:02:47"
} | # What does this PR do?
Removed the dtensor redistribute for several reasons:
- 2x slower because of the bias addition
- not explicit enough
- does not allow us to run DeepSpeed
- we were already not using dtensors for MoE because of how cumbersome doing a `.view` was | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39840/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39840/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39839 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39839/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39839/comments | https://api.github.com/repos/huggingface/transformers/issues/39839/events | https://github.com/huggingface/transformers/issues/39839 | 3,282,959,172 | I_kwDOCUB6oc7DrftE | 39,839 | pack_image_features RuntimeError when vision_feature_select_strategy="full" | {
"login": "llnnnnnn",
"id": 224012655,
"node_id": "U_kgDODVopbw",
"avatar_url": "https://avatars.githubusercontent.com/u/224012655?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/llnnnnnn",
"html_url": "https://github.com/llnnnnnn",
"followers_url": "https://api.github.com/users/llnnnnnn/followers",
"following_url": "https://api.github.com/users/llnnnnnn/following{/other_user}",
"gists_url": "https://api.github.com/users/llnnnnnn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/llnnnnnn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/llnnnnnn/subscriptions",
"organizations_url": "https://api.github.com/users/llnnnnnn/orgs",
"repos_url": "https://api.github.com/users/llnnnnnn/repos",
"events_url": "https://api.github.com/users/llnnnnnn/events{/privacy}",
"received_events_url": "https://api.github.com/users/llnnnnnn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-01T07:55:40 | 2025-09-08T08:02:56 | 2025-09-08T08:02:56 | NONE | null | null | null | null | ### System Info
transformers 4.54.0
### Who can help?
@zucchini-nlp
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```
from transformers.models.llava_next import LlavaNextForConditionalGeneration, LlavaNextProcessor
from PIL import Image
import requests
import torch
model = LlavaNextForConditionalGeneration.from_pretrained(
"llava-hf/llava-v1.6-vicuna-7b-hf",
vision_feature_select_strategy="full",
torch_dtype=torch.float16,
device_map="auto",
)
processor = LlavaNextProcessor.from_pretrained("llava-hf/llava-v1.6-vicuna-7b-hf")
image = Image.open("/data/coco/train2017/000000000009.jpg")
prompt = "USER: <image>\nWhat is shown in this image? ASSISTANT:"
inputs = processor(images=image, text=prompt, truncation=True, return_tensors="pt", vision_feature_select_strategy = "full").to("cuda")
input_embeds = model(inputs.input_ids, pixel_values=inputs.pixel_values, image_sizes=inputs.image_sizes, vision_feature_select_strategy="full")
```
### Expected behavior
I encountered a bug when running the line
`input_embeds = model(inputs.input_ids, pixel_values=inputs.pixel_values, image_sizes=inputs.image_sizes, vision_feature_select_strategy="full")`
I got:
```
in pack_image_features
image_feature = image_feature.view(num_patch_height, num_patch_width, height, width, -1)
RuntimeError: shape '[2, 2, 24, 24, -1]' is invalid for input of size 9453568
```
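A quick arithmetic check of the mismatch (my interpretation; the 577 = 576 + CLS breakdown is an assumption about how the `"full"` strategy keeps the class token):

```python
# The reported input size matches 4 patches x 577 tokens x 4096 hidden dims.
num_patches, tokens_per_patch, hidden = 4, 577, 4096
assert num_patches * tokens_per_patch * hidden == 9453568

# The view target [2, 2, 24, 24, -1] expects 24*24 = 576 spatial tokens per
# patch; with vision_feature_select_strategy="full" the CLS token is kept,
# giving 577 tokens, so the reshape cannot succeed.
spatial_tokens = 24 * 24
assert tokens_per_patch == spatial_tokens + 1
```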
The shape of `image_feature` is currently `[4, 577, 4096]`. How can I fix this? | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39839/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39839/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39838 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39838/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39838/comments | https://api.github.com/repos/huggingface/transformers/issues/39838/events | https://github.com/huggingface/transformers/pull/39838 | 3,282,867,693 | PR_kwDOCUB6oc6hqgx9 | 39,838 | Fix tp cb | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-01T07:18:10 | 2025-08-01T07:59:06 | 2025-08-01T07:59:05 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39838",
"html_url": "https://github.com/huggingface/transformers/pull/39838",
"diff_url": "https://github.com/huggingface/transformers/pull/39838.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39838.patch",
"merged_at": "2025-08-01T07:59:05"
} | # What does this PR do?
CB was broken with TP and cudagraph.
Streams were messing with each other!
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39838/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39838/timeline | null | null | null | null | true | true |