url string | repository_url string | labels_url string | comments_url string | events_url string | html_url string | id int64 | node_id string | number int64 | title string | user dict | labels list | state string | locked bool | assignee dict | assignees list | milestone null | comments list | created_at timestamp[ms] | updated_at timestamp[ms] | closed_at timestamp[ms] | author_association string | type dict | active_lock_reason null | draft bool | pull_request dict | body string | closed_by dict | reactions dict | timeline_url string | performed_via_github_app null | state_reason string | sub_issues_summary dict | issue_dependencies_summary dict | is_pull_request bool | is_closed bool |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/huggingface/transformers/issues/38123 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38123/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38123/comments | https://api.github.com/repos/huggingface/transformers/issues/38123/events | https://github.com/huggingface/transformers/pull/38123 | 3,062,338,916 | PR_kwDOCUB6oc6WHzTb | 38,123 | Minor llama4 fixes | {
"login": "molbap",
"id": 39954772,
"node_id": "MDQ6VXNlcjM5OTU0Nzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/molbap",
"html_url": "https://github.com/molbap",
"followers_url": "https://api.github.com/users/molbap/followers",
"following_url": "https://api.github.com/users/molbap/following{/other_user}",
"gists_url": "https://api.github.com/users/molbap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/molbap/subscriptions",
"organizations_url": "https://api.github.com/users/molbap/orgs",
"repos_url": "https://api.github.com/users/molbap/repos",
"events_url": "https://api.github.com/users/molbap/events{/privacy}",
"received_events_url": "https://api.github.com/users/molbap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-14T08:50:17 | 2025-05-20T13:15:55 | 2025-05-20T13:15:55 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38123",
"html_url": "https://github.com/huggingface/transformers/pull/38123",
"diff_url": "https://github.com/huggingface/transformers/pull/38123.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38123.patch",
"merged_at": "2025-05-20T13:15:55"
} | # What does this PR do?
Fixes a couple nits/wrong defaults in Llama4 code for DynamicCache init - not sure we should even default to it, I think.
The scaling however has to be passed else `eager_attention_forward" will fail. | {
"login": "molbap",
"id": 39954772,
"node_id": "MDQ6VXNlcjM5OTU0Nzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/molbap",
"html_url": "https://github.com/molbap",
"followers_url": "https://api.github.com/users/molbap/followers",
"following_url": "https://api.github.com/users/molbap/following{/other_user}",
"gists_url": "https://api.github.com/users/molbap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/molbap/subscriptions",
"organizations_url": "https://api.github.com/users/molbap/orgs",
"repos_url": "https://api.github.com/users/molbap/repos",
"events_url": "https://api.github.com/users/molbap/events{/privacy}",
"received_events_url": "https://api.github.com/users/molbap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38123/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/38123/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38122 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38122/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38122/comments | https://api.github.com/repos/huggingface/transformers/issues/38122/events | https://github.com/huggingface/transformers/issues/38122 | 3,062,197,481 | I_kwDOCUB6oc62hWzp | 38,122 | Tensor Parallelism with Quantized Models | {
"login": "HuangBugWei",
"id": 67520151,
"node_id": "MDQ6VXNlcjY3NTIwMTUx",
"avatar_url": "https://avatars.githubusercontent.com/u/67520151?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HuangBugWei",
"html_url": "https://github.com/HuangBugWei",
"followers_url": "https://api.github.com/users/HuangBugWei/followers",
"following_url": "https://api.github.com/users/HuangBugWei/following{/other_user}",
"gists_url": "https://api.github.com/users/HuangBugWei/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HuangBugWei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HuangBugWei/subscriptions",
"organizations_url": "https://api.github.com/users/HuangBugWei/orgs",
"repos_url": "https://api.github.com/users/HuangBugWei/repos",
"events_url": "https://api.github.com/users/HuangBugWei/events{/privacy}",
"received_events_url": "https://api.github.com/users/HuangBugWei/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-14T07:57:42 | 2025-05-14T15:27:03 | 2025-05-14T15:27:03 | CONTRIBUTOR | null | null | null | null | ### System Info
- `transformers` version: 4.52.0.dev0
- Platform: Linux-5.4.17-2136.339.5.el8uek.x86_64-x86_64-with-glibc2.35
- Python version: 3.10.13
- Huggingface_hub version: 0.30.1
- Safetensors version: 0.5.2
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: 0.16.3
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: yes
- Using GPU in script?: yes
- GPU type: NVIDIA GeForce RTX 3090
### Who can help?
@zucchini-nlp @gante
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```python
import torch
from transformers import AutoTokenizer, Gemma3ForCausalLM
model_name = "google/gemma-3-12b-it-qat-int4-unquantized"
bnb_config = BitsAndBytesConfig(
load_in_4bit=True,
bnb_4bit_use_double_quant=True,
bnb_4bit_quant_type="nf4",
bnb_4bit_compute_dtype=torch.bfloat16
)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = Gemma3ForCausalLM.from_pretrained(
model_name,
torch_dtype=torch.bfloat16,
# device_map="auto",
quantization_config=bnb_config,
tp_plan="auto",
).eval()
print(model.model.layers[0].mlp.down_proj)
# Linear4bit(in_features=15360, out_features=3840, bias=False)
print(model.model.layers[0].mlp.down_proj.weight)
# DTensor(local_tensor=tensor([[-0.0032, -0.0145, 0.0042, ..., 0.0006, -0.0073, 0.0051],
# [ 0.0015, 0.0056, -0.0059, ..., 0.0033, -0.0056, -0.0012],
# [ 0.0074, 0.0083, -0.0046, ..., 0.0042, 0.0012, -0.0051],
# ...,
# [ 0.0076, -0.0005, -0.0044, ..., -0.0013, 0.0123, 0.0025],
# [-0.0014, -0.0096, 0.0026, ..., -0.0032, -0.0060, -0.0024],
# [ 0.0034, 0.0026, -0.0089, ..., -0.0012, -0.0013, 0.0035]],
# device='cuda:0', dtype=torch.bfloat16), device_mesh=DeviceMesh('cuda', [0, 1]), placements=(Shard(dim=1),))
```
then ```torchrun --nproc-per-node 2 script.py```
This takes about 20GB of GPU memory on each of the two, while the 4-bit model only takes 12GB in total.
### Expected behavior
I'm trying to use tensor parallelism with int4 quantized models.
However, I've noticed that the model weights are being loaded in bf16 format instead of int4 as expected.
I understand that representing int4 can be complex, potentially involving concatenation and storage in int8.
However, I also came across this PR: [https://github.com/huggingface/transformers/pull/37790](https://github.com/huggingface/transformers/pull/37790) which mentions "Fix tensor parallel with non-floating dtypes."
This leads me to wonder:
Is tensor parallelism currently supported for int4 quantized models in Transformers, and the bf16 loading indicates a problem in my script?
Or, is using tensor parallelism with int4 quantized models not currently a supported feature in Transformers due to the complexities of int4 representation?
Any clarification on this would be greatly appreciated.
Thanks!
| {
"login": "HuangBugWei",
"id": 67520151,
"node_id": "MDQ6VXNlcjY3NTIwMTUx",
"avatar_url": "https://avatars.githubusercontent.com/u/67520151?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HuangBugWei",
"html_url": "https://github.com/HuangBugWei",
"followers_url": "https://api.github.com/users/HuangBugWei/followers",
"following_url": "https://api.github.com/users/HuangBugWei/following{/other_user}",
"gists_url": "https://api.github.com/users/HuangBugWei/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HuangBugWei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HuangBugWei/subscriptions",
"organizations_url": "https://api.github.com/users/HuangBugWei/orgs",
"repos_url": "https://api.github.com/users/HuangBugWei/repos",
"events_url": "https://api.github.com/users/HuangBugWei/events{/privacy}",
"received_events_url": "https://api.github.com/users/HuangBugWei/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38122/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38122/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38121 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38121/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38121/comments | https://api.github.com/repos/huggingface/transformers/issues/38121/events | https://github.com/huggingface/transformers/issues/38121 | 3,062,139,707 | I_kwDOCUB6oc62hIs7 | 38,121 | Emu3 precision regression | {
"login": "jiqing-feng",
"id": 107918818,
"node_id": "U_kgDOBm614g",
"avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jiqing-feng",
"html_url": "https://github.com/jiqing-feng",
"followers_url": "https://api.github.com/users/jiqing-feng/followers",
"following_url": "https://api.github.com/users/jiqing-feng/following{/other_user}",
"gists_url": "https://api.github.com/users/jiqing-feng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jiqing-feng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jiqing-feng/subscriptions",
"organizations_url": "https://api.github.com/users/jiqing-feng/orgs",
"repos_url": "https://api.github.com/users/jiqing-feng/repos",
"events_url": "https://api.github.com/users/jiqing-feng/events{/privacy}",
"received_events_url": "https://api.github.com/users/jiqing-feng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-14T07:38:04 | 2025-06-03T05:40:36 | 2025-05-23T07:49:57 | CONTRIBUTOR | null | null | null | null | ### System Info
I run the tests `RUN_SLOW=1 pytest tests/models/emu3/test_modeling_emu3.py::Emu3IntegrationTest::tes
t_model_generate_images` on A100.
The groud truth image is like:

In the latest main branch. The output images are very different
4bit output image:

fp32 output image:

Before this [commit](https://github.com/huggingface/transformers/pull/37033)
4bit output image:

fp32 output image

We can see that the 4bit output is the same as the ground truth before the regression PR. After the regression PR, the output is significantly different.
Hi @SunMarc, could you confirm if something is wrong with emu3 or if we just need to update the test with the correct ground truth?
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
run the tests `RUN_SLOW=1 pytest tests/models/emu3/test_modeling_emu3.py::Emu3IntegrationTest::tes
t_model_generate_images` on A100.
### Expected behavior
The test should pass. | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38121/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38121/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38120 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38120/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38120/comments | https://api.github.com/repos/huggingface/transformers/issues/38120/events | https://github.com/huggingface/transformers/issues/38120 | 3,062,091,561 | I_kwDOCUB6oc62g88p | 38,120 | phi-4-mm HF format | {
"login": "lifuhuang",
"id": 3427331,
"node_id": "MDQ6VXNlcjM0MjczMzE=",
"avatar_url": "https://avatars.githubusercontent.com/u/3427331?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lifuhuang",
"html_url": "https://github.com/lifuhuang",
"followers_url": "https://api.github.com/users/lifuhuang/followers",
"following_url": "https://api.github.com/users/lifuhuang/following{/other_user}",
"gists_url": "https://api.github.com/users/lifuhuang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lifuhuang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lifuhuang/subscriptions",
"organizations_url": "https://api.github.com/users/lifuhuang/orgs",
"repos_url": "https://api.github.com/users/lifuhuang/repos",
"events_url": "https://api.github.com/users/lifuhuang/events{/privacy}",
"received_events_url": "https://api.github.com/users/lifuhuang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-05-14T07:19:41 | 2025-05-29T07:09:39 | 2025-05-29T07:09:39 | NONE | null | null | null | null | ### Model description
I noticed from https://github.com/huggingface/transformers/issues/37849#issuecomment-2837928929 that huggingface plans to convert phi4 MM model to HF format. However, I noticed the PR has been inactive for a while: https://huggingface.co/microsoft/Phi-4-multimodal-instruct/discussions/70. I wonder if you this is still in HF plan? If so, when it would be supported? Thanks!
(cc @zucchini-nlp)
### Open source status
- [ ] The model implementation is available
- [ ] The model weights are available
### Provide useful links for the implementation
_No response_ | {
"login": "lifuhuang",
"id": 3427331,
"node_id": "MDQ6VXNlcjM0MjczMzE=",
"avatar_url": "https://avatars.githubusercontent.com/u/3427331?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lifuhuang",
"html_url": "https://github.com/lifuhuang",
"followers_url": "https://api.github.com/users/lifuhuang/followers",
"following_url": "https://api.github.com/users/lifuhuang/following{/other_user}",
"gists_url": "https://api.github.com/users/lifuhuang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lifuhuang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lifuhuang/subscriptions",
"organizations_url": "https://api.github.com/users/lifuhuang/orgs",
"repos_url": "https://api.github.com/users/lifuhuang/repos",
"events_url": "https://api.github.com/users/lifuhuang/events{/privacy}",
"received_events_url": "https://api.github.com/users/lifuhuang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38120/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38120/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38119 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38119/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38119/comments | https://api.github.com/repos/huggingface/transformers/issues/38119/events | https://github.com/huggingface/transformers/issues/38119 | 3,061,857,038 | I_kwDOCUB6oc62gDsO | 38,119 | Unable to quantize a pretrained SegFormer-B0 to int8 using Quanto | {
"login": "shubham-beri",
"id": 29382306,
"node_id": "MDQ6VXNlcjI5MzgyMzA2",
"avatar_url": "https://avatars.githubusercontent.com/u/29382306?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/shubham-beri",
"html_url": "https://github.com/shubham-beri",
"followers_url": "https://api.github.com/users/shubham-beri/followers",
"following_url": "https://api.github.com/users/shubham-beri/following{/other_user}",
"gists_url": "https://api.github.com/users/shubham-beri/gists{/gist_id}",
"starred_url": "https://api.github.com/users/shubham-beri/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shubham-beri/subscriptions",
"organizations_url": "https://api.github.com/users/shubham-beri/orgs",
"repos_url": "https://api.github.com/users/shubham-beri/repos",
"events_url": "https://api.github.com/users/shubham-beri/events{/privacy}",
"received_events_url": "https://api.github.com/users/shubham-beri/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-14T05:27:27 | 2025-06-22T08:02:40 | 2025-06-22T08:02:40 | NONE | null | null | null | null | Does it support Conv2d? Here's my complete script:
```
import os
import argparse
import torch
from torch.utils.data import Dataset, DataLoader
from torchvision import transforms
from PIL import Image
from transformers import SegformerConfig, SegformerForSemanticSegmentation
from optimum.quanto import quantize, qint8, Calibration, freeze
class PngCalibrationDataset(Dataset):
"""Loads all .png files in a directory as single-channel tensors."""
def __init__(self, image_dir: str, transform=None):
self.image_paths = [
os.path.join(image_dir, fname)
for fname in sorted(os.listdir(image_dir))
if fname.lower().endswith(".png")
]
self.transform = transform
def __len__(self):
return len(self.image_paths)
def __getitem__(self, idx):
path = self.image_paths[idx]
img = Image.open(path).convert("L") # grayscale
if self.transform:
img = self.transform(img)
return img
def build_calib_loader(image_dir: str, batch_size: int):
"""Creates a DataLoader for calibration images."""
transform = transforms.Compose([
transforms.Resize((288, 512)), # (height, width)
transforms.ToTensor(), # → shape [1, 288, 512]
# If your training used normalized inputs, uncomment and set mean/std:
# transforms.Normalize(mean=[0.5], std=[0.5]),
])
dataset = PngCalibrationDataset(image_dir, transform=transform)
return DataLoader(dataset, batch_size=batch_size, shuffle=True)
def load_and_prepare_model(
checkpoint_path: str,
num_input_channels: int = 1,
num_labels: int = 2
) -> torch.nn.Module:
"""Loads a SegFormer model and populates it with your pretrained weights."""
# Base config & model
config = SegformerConfig.from_pretrained(
"nvidia/segformer-b0-finetuned-ade-512-512"
)
config.num_channels = num_input_channels
config.num_labels = num_labels
model = SegformerForSemanticSegmentation(config)
# Load checkpoint
ckpt = torch.load(checkpoint_path, map_location="cpu")
state_dict = ckpt.get("state_dict", ckpt.get("model", ckpt))
cleaned = {}
# for k, v in state_dict.items():
# name = k.replace("module.", "").replace("model.", "")
# if name in model.state_dict() and v.size() == model.state_dict()[name].size():
# cleaned[name] = v
model.load_state_dict(state_dict, strict=False)
return model
def main():
parser = argparse.ArgumentParser(
description="Quantize SegFormer to int8 (weights+activations) with calibration"
)
parser.add_argument(
"--repdata", "-r", type=str, required=True,
help="Directory of PNGs for calibration"
)
parser.add_argument(
"--checkpoint", "-c", type=str, default="best_model_SegFormerB0.pth",
help="Path to your pretrained .pth checkpoint"
)
parser.add_argument(
"--batch_size", "-b", type=int, default=1,
help="Batch size for calibration loader"
)
args = parser.parse_args()
# 1) Device setup
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Using device: {device}")
# 2) Build calibration DataLoader
calib_loader = build_calib_loader(args.repdata, batch_size=args.batch_size)
# 3) Load model & weights
model = load_and_prepare_model(args.checkpoint)
model.to(device)
model.eval()
# 4) Insert dynamic quantization (weights + activation stubs)
quantize(model, weights=qint8, activations=qint8)
# 5) Calibrate activation ranges
print("Calibrating activation ranges...")
with Calibration(momentum=0.9):
batch_count = 0
for batch in calib_loader:
print(f"Calibrating batch {batch_count + 1}/{len(calib_loader)}")
# batch: [B, 1, 288, 512]
batch = batch.to(device)
_ = model(batch)
batch_count += 1
# 6) Freeze quantized weights into actual int8 tensors
freeze(model)
from torch.nn.quantized import Linear, Conv2d
# 7) Verify parameter dtypes
print("Parameter dtypes after quantization & freezing:")
for name, mod in model.named_modules():
if isinstance(mod, (Linear, Conv2d)):
# packed int-8 weights live here
print(f"{name:55s} → {mod.weight().dtype}")
if __name__ == "__main__":
main()
``` | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38119/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38119/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38118 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38118/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38118/comments | https://api.github.com/repos/huggingface/transformers/issues/38118/events | https://github.com/huggingface/transformers/issues/38118 | 3,061,739,840 | I_kwDOCUB6oc62fnFA | 38,118 | Llama4 inference encounter unsupported op in dynamo ? | {
"login": "HuangChiEn",
"id": 52521165,
"node_id": "MDQ6VXNlcjUyNTIxMTY1",
"avatar_url": "https://avatars.githubusercontent.com/u/52521165?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HuangChiEn",
"html_url": "https://github.com/HuangChiEn",
"followers_url": "https://api.github.com/users/HuangChiEn/followers",
"following_url": "https://api.github.com/users/HuangChiEn/following{/other_user}",
"gists_url": "https://api.github.com/users/HuangChiEn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HuangChiEn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HuangChiEn/subscriptions",
"organizations_url": "https://api.github.com/users/HuangChiEn/orgs",
"repos_url": "https://api.github.com/users/HuangChiEn/repos",
"events_url": "https://api.github.com/users/HuangChiEn/events{/privacy}",
"received_events_url": "https://api.github.com/users/HuangChiEn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-14T04:00:01 | 2025-08-06T00:06:14 | 2025-08-06T00:06:14 | NONE | null | null | null | null | ### System Info
transformers==4.51.2
torch==2.5.0
torchvision==0.20.0
Miscellnous pkg version
Package Version
------------------------- -----------
accelerate 1.6.0
aiofiles 23.2.1
aiohappyeyeballs 2.6.1
aiohttp 3.11.16
aiosignal 1.3.2
altair 5.5.0
annotated-types 0.7.0
antlr4-python3-runtime 4.9.3
anyio 4.9.0
async-timeout 5.0.1
attrs 25.3.0
bitsandbytes 0.45.4
cachetools 5.5.2
certifi 2025.1.31
charset-normalizer 3.4.1
click 8.1.8
contourpy 1.3.1
cycler 0.12.1
datasets 3.5.0
decord 0.6.0
deepspeed 0.14.4
descartes 1.1.0
dill 0.3.8
distro 1.9.0
docker-pycreds 0.4.0
einops 0.6.1
einops-exts 0.0.4
et_xmlfile 2.0.0
exceptiongroup 1.2.2
fastapi 0.115.12
ffmpy 0.5.0
filelock 3.18.0
fire 0.7.0
flash-attn 2.7.3
fonttools 4.56.0
frozenlist 1.5.0
fsspec 2024.12.0
gitdb 4.0.12
GitPython 3.1.44
gradio_client 0.8.1
h11 0.14.0
hf-xet 1.0.3
hjson 3.1.0
httpcore 0.17.3
httpx 0.24.0
huggingface-hub 0.30.2
idna 3.10
imageio 2.37.0
importlib_resources 6.5.2
inquirerpy 0.3.4
Jinja2 3.1.6
jiter 0.9.0
joblib 1.4.2
jsonschema 4.23.0
jsonschema-specifications 2024.10.1
kagglehub 0.3.11
kiwisolver 1.4.8
latex2mathml 3.77.0
markdown-it-py 3.0.0
markdown2 2.5.3
MarkupSafe 3.0.2
matplotlib 3.5.3
mdurl 0.1.2
mpmath 1.3.0
multidict 6.4.3
multiprocess 0.70.16
narwhals 1.32.0
networkx 3.4.2
ninja 1.11.1.4
numpy 1.26.4
nuscenes-devkit 1.1.11
nvidia-cublas-cu12 12.4.5.8
nvidia-cuda-cupti-cu12 12.4.127
nvidia-cuda-nvrtc-cu12 12.4.127
nvidia-cuda-runtime-cu12 12.4.127
nvidia-cudnn-cu12 9.1.0.70
nvidia-cufft-cu12 11.2.1.3
nvidia-curand-cu12 10.3.5.147
nvidia-cusolver-cu12 11.6.1.9
nvidia-cusparse-cu12 12.3.1.170
nvidia-ml-py 12.570.86
nvidia-nccl-cu12 2.21.5
nvidia-nvjitlink-cu12 12.4.127
nvidia-nvtx-cu12 12.4.127
omegaconf 2.3.0
openai 1.74.0
opencv-python 4.11.0.86
opencv-python-headless 4.11.0.86
openpyxl 3.1.5
orjson 3.10.16
packaging 24.2
pandas 2.2.3
peft 0.10.0
pfzy 0.3.4
pillow 11.1.0
pip 25.0
platformdirs 4.3.7
portalocker 3.1.1
prompt_toolkit 3.0.51
propcache 0.3.1
protobuf 5.29.4
psutil 7.0.0
py-cpuinfo 9.0.0
pyarrow 19.0.1
pycocotools 2.0.8
pydantic 2.10.6
pydantic_core 2.27.2
pydub 0.25.1
Pygments 2.19.1
pynvml 12.0.0
pyparsing 3.2.3
pyquaternion 0.9.9
python-dateutil 2.9.0.post0
python-dotenv 1.1.0
python-multipart 0.0.20
pytz 2025.2
PyYAML 6.0.2
referencing 0.36.2
regex 2024.11.6
requests 2.32.3
rich 13.9.4
rpds-py 0.24.0
ruff 0.11.2
safetensors 0.5.3
scikit-learn 1.2.2
scipy 1.15.2
semantic-version 2.10.0
sentencepiece 0.1.99
sentry-sdk 2.24.1
setproctitle 1.3.5
setuptools 75.8.0
Shapely 1.8.5.post1
shellingham 1.5.4
shortuuid 1.0.13
six 1.17.0
smmap 5.0.2
sniffio 1.3.1
starlette 0.46.1
sty 1.0.6
svgwrite 1.4.3
sympy 1.13.1
tabulate 0.9.0
termcolor 3.0.1
threadpoolctl 3.6.0
tiktoken 0.8.0
timeout-decorator 0.5.0
timm 0.6.13
tokenizers 0.21.1
tomlkit 0.12.0
torch 2.5.0
torchvision 0.20.0
tqdm 4.67.1
transformers 4.51.2
triton 3.1.0
trl 0.16.1
typer 0.15.2
typing_extensions 4.12.2
tzdata 2025.2
urllib3 2.3.0
uvicorn 0.34.0
validators 0.34.0
wandb 0.19.8
wavedrom 2.0.3.post3
wcwidth 0.2.13
websockets 11.0.3
wheel 0.45.1
XlsxWriter 3.2.2
xxhash 3.5.0
yarl 1.19.0
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```
from transformers import AutoProcessor, Llama4ForConditionalGeneration
import torch
import torch._dynamo
torch._dynamo.config.suppress_errors = True
torch._dynamo.config.capture_scalar_outputs = False
def get_modules():
model_id = "/data/joseph/.cache/hub/models--meta-llama--Llama-4-Scout-17B-16E-Instruct/snapshots/7dab2f5f854fe665b6b2f1eccbd3c48e5f627ad8"#"meta-llama/Llama-4-Scout-17B-16E-Instruct"
# official setup args
processor = AutoProcessor.from_pretrained(model_id)
model = Llama4ForConditionalGeneration.from_pretrained(
model_id,
attn_implementation="flex_attention",
device_map="auto",
torch_dtype=torch.bfloat16,
)
return model, processor
if __name__ == '__main__':
from huggingface_hub import login
import os
# login to HF to load llama4
#login(os.environ.get("HF_TOKEN"))
model, processor = get_modules()
url1 = "/data/joseph/llama4_inference/playground/rabbit.jpg"
url2 = "/data/joseph/llama4_inference/playground/cat_style_layout.png"
messages = [
{
"role": "user",
"content": [
{"type": "image", "path": url1},
{"type": "image", "path": url2},
{"type": "text", "text": "Can you describe how these two images are similar, and how they differ?"},
]
},
]
inputs = processor.apply_chat_template(
messages,
add_generation_prompt=True,
tokenize=True,
return_dict=True,
return_tensors="pt",
).to(model.device)
# bug happens in here!
outputs = model.generate(
**inputs,
max_new_tokens=256,
)
response = processor.batch_decode(outputs[:, inputs["input_ids"].shape[-1]:])[0]
print(response)
print(outputs[0])
```
For the dynamo config, i had tried several combinations, but no one works :
1. `torch._dynamo.config.suppress_errors = True` and `torch._dynamo.config.capture_scalar_outputs = False`

2. Both True (this one seems raise the error in very low level...orz)

### Expected behavior
the official code snippet can run... | {
"login": "HuangChiEn",
"id": 52521165,
"node_id": "MDQ6VXNlcjUyNTIxMTY1",
"avatar_url": "https://avatars.githubusercontent.com/u/52521165?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HuangChiEn",
"html_url": "https://github.com/HuangChiEn",
"followers_url": "https://api.github.com/users/HuangChiEn/followers",
"following_url": "https://api.github.com/users/HuangChiEn/following{/other_user}",
"gists_url": "https://api.github.com/users/HuangChiEn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HuangChiEn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HuangChiEn/subscriptions",
"organizations_url": "https://api.github.com/users/HuangChiEn/orgs",
"repos_url": "https://api.github.com/users/HuangChiEn/repos",
"events_url": "https://api.github.com/users/HuangChiEn/events{/privacy}",
"received_events_url": "https://api.github.com/users/HuangChiEn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38118/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38118/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38117 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38117/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38117/comments | https://api.github.com/repos/huggingface/transformers/issues/38117/events | https://github.com/huggingface/transformers/issues/38117 | 3,061,470,592 | I_kwDOCUB6oc62elWA | 38,117 | OLMo and OLMo 2 models do not match original models for low precisions | {
"login": "2015aroras",
"id": 19700980,
"node_id": "MDQ6VXNlcjE5NzAwOTgw",
"avatar_url": "https://avatars.githubusercontent.com/u/19700980?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/2015aroras",
"html_url": "https://github.com/2015aroras",
"followers_url": "https://api.github.com/users/2015aroras/followers",
"following_url": "https://api.github.com/users/2015aroras/following{/other_user}",
"gists_url": "https://api.github.com/users/2015aroras/gists{/gist_id}",
"starred_url": "https://api.github.com/users/2015aroras/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/2015aroras/subscriptions",
"organizations_url": "https://api.github.com/users/2015aroras/orgs",
"repos_url": "https://api.github.com/users/2015aroras/repos",
"events_url": "https://api.github.com/users/2015aroras/events{/privacy}",
"received_events_url": "https://api.github.com/users/2015aroras/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-14T00:04:27 | 2025-05-19T13:35:24 | 2025-05-19T13:35:24 | CONTRIBUTOR | null | null | null | null | ### System Info
- `transformers` version: 4.52.0.dev0
- Platform: macOS-14.5-arm64-arm-64bit
- Python version: 3.12.9
- Huggingface_hub version: 0.31.2
- Safetensors version: 0.5.2
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: No
### Who can help?
@ArthurZucker
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Steps to reproduce behavior:
1. Obtain a copy of an OLMo or OLMo 2 model and its corresponding HF model.
2. Run the original and HF model in a lower precision like float16.
Giving code for a repro is particularly complicated due to running the models in the original codebase, but I intend to fix this bug myself so hopefully it is fine to omit this.
### Expected behavior
Expected behavior is that the output logits match or are close (say, all within 1e-4). In practice, this is not the case for bfloat16 and float16.
```
...
[2025-05-13 16:54:03] INFO [__main__:246, rank=0] blocks.11.feed_forward_norm|output, model.layers.11.post_feedforward_layernorm|output element diff abs mean: 0.0077401818707585335
[2025-05-13 16:54:03] INFO [__main__:246, rank=0] lm_head.norm|input, model.norm|input element diff abs mean: 0.015348772518336773
[2025-05-13 16:54:03] INFO [__main__:246, rank=0] lm_head.norm|output, model.norm|output element diff abs mean: 0.01983717642724514
[2025-05-13 16:54:03] INFO [__main__:246, rank=0] lm_head.w_out|input, lm_head|input element diff abs mean: 0.01983717642724514
[2025-05-13 16:54:03] INFO [__main__:234, rank=0] lm_head.w_out|output, lm_head|output shape mismatch: torch.Size([1, 120, 100352]) torch.Size([1, 120, 100278])
[2025-05-13 16:54:03] INFO [__main__:246, rank=0] lm_head.w_out|output, lm_head|output element diff abs mean: 0.03660527244210243
``` | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38117/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38117/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38116 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38116/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38116/comments | https://api.github.com/repos/huggingface/transformers/issues/38116/events | https://github.com/huggingface/transformers/issues/38116 | 3,061,407,094 | I_kwDOCUB6oc62eV12 | 38,116 | [Bug in Generate] 4.51.2 vs 4.46 Beam search results are sometimes different, not sure if beam search or T5 model change is the reason? | {
"login": "Oxi84",
"id": 25420033,
"node_id": "MDQ6VXNlcjI1NDIwMDMz",
"avatar_url": "https://avatars.githubusercontent.com/u/25420033?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Oxi84",
"html_url": "https://github.com/Oxi84",
"followers_url": "https://api.github.com/users/Oxi84/followers",
"following_url": "https://api.github.com/users/Oxi84/following{/other_user}",
"gists_url": "https://api.github.com/users/Oxi84/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Oxi84/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Oxi84/subscriptions",
"organizations_url": "https://api.github.com/users/Oxi84/orgs",
"repos_url": "https://api.github.com/users/Oxi84/repos",
"events_url": "https://api.github.com/users/Oxi84/events{/privacy}",
"received_events_url": "https://api.github.com/users/Oxi84/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-13T23:01:12 | 2025-06-24T08:02:43 | 2025-06-24T08:02:43 | NONE | null | null | null | null | I get different results in version 4.51.2 compared to 4.46. Diverse beam works well, normal beam search does not, it sometimes just generates all the same sequences when generating multiple beams (4-8).
Some small bug just generate first 2 beams ok, and after just repeats the second one. It happens in around 5 percent of input senences but when it does it gives 2 instead of 8 different versions of the text which is pretty bad.
Or there is something different about loading the T5 model, i remember there was a problem about loading in .half vs when you mark dtype as fp16?
In around 10 percent of cases i get different results from 4.46 via 4.51. Inputs are exactly the same, i checked tokenized version id and they are the same.
Model is t5 and the params are just basic, beam search, num_beams and num_returned no other params and does not match.
Thanks!
### System Info
Ubuntu transformers, cuda
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Any t5 model basic usage:
for example model:
model_id=prithivida/parrot_paraphraser_on_T5
beam_outputs = model.generate(
input_ids=input_ids, attention_mask=attention_masks,
do_sample=False,
num_beams=num_beams,
max_length=max_len,
num_return_sequences=num_beams,
)
### Expected behavior
should show the same | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38116/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38116/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38115 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38115/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38115/comments | https://api.github.com/repos/huggingface/transformers/issues/38115/events | https://github.com/huggingface/transformers/issues/38115 | 3,061,345,978 | I_kwDOCUB6oc62eG66 | 38,115 | support static kv cache with torch.compile for qwen2vl | {
"login": "ChuyaoShen",
"id": 73958412,
"node_id": "MDQ6VXNlcjczOTU4NDEy",
"avatar_url": "https://avatars.githubusercontent.com/u/73958412?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ChuyaoShen",
"html_url": "https://github.com/ChuyaoShen",
"followers_url": "https://api.github.com/users/ChuyaoShen/followers",
"following_url": "https://api.github.com/users/ChuyaoShen/following{/other_user}",
"gists_url": "https://api.github.com/users/ChuyaoShen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ChuyaoShen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ChuyaoShen/subscriptions",
"organizations_url": "https://api.github.com/users/ChuyaoShen/orgs",
"repos_url": "https://api.github.com/users/ChuyaoShen/repos",
"events_url": "https://api.github.com/users/ChuyaoShen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ChuyaoShen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2392046359,
"node_id": "MDU6TGFiZWwyMzkyMDQ2MzU5",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20Second%20Issue",
"name": "Good Second Issue",
"color": "dd935a",
"default": false,
"description": "Issues that are more difficult to do than \"Good First\" issues - give it a try if you want!"
},
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | closed | false | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | [] | 2025-05-13T22:10:20 | 2025-05-21T09:50:41 | 2025-05-21T09:50:41 | NONE | null | null | null | null | ### Feature request
It is claimed that qwen2-vl supports static KV cache and `torch.compile` in this issue: https://github.com/huggingface/transformers/issues/28981. However, this is not true: the `_supports_static_cache` attribute is disabled in `modeling_qwen2_vl.py` (see: https://github.com/huggingface/transformers/blob/b311a3f50697c9602cc5d13a5faf7f6059c392ca/src/transformers/models/qwen2_vl/modeling_qwen2_vl.py#L927).
Is there a plan to fix it and enable static KV cache for qwen2-vl?
### Motivation
`qwen2_vl` is a widely used open source VLM. And we should have boosted inference speed when static KV cache and `torch.compile` (up to 4x according to https://huggingface.co/docs/transformers/main/en/llm_optims#static-kv-cache-and-torchcompile)
### Your contribution
provide examples | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38115/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/38115/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38114 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38114/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38114/comments | https://api.github.com/repos/huggingface/transformers/issues/38114/events | https://github.com/huggingface/transformers/pull/38114 | 3,061,161,121 | PR_kwDOCUB6oc6WD2rQ | 38,114 | Force real tensors and clone state_dict in src/transformers/modeling_utils.py | {
"login": "MutugiD",
"id": 61890059,
"node_id": "MDQ6VXNlcjYxODkwMDU5",
"avatar_url": "https://avatars.githubusercontent.com/u/61890059?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MutugiD",
"html_url": "https://github.com/MutugiD",
"followers_url": "https://api.github.com/users/MutugiD/followers",
"following_url": "https://api.github.com/users/MutugiD/following{/other_user}",
"gists_url": "https://api.github.com/users/MutugiD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MutugiD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MutugiD/subscriptions",
"organizations_url": "https://api.github.com/users/MutugiD/orgs",
"repos_url": "https://api.github.com/users/MutugiD/repos",
"events_url": "https://api.github.com/users/MutugiD/events{/privacy}",
"received_events_url": "https://api.github.com/users/MutugiD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2934977194,
"node_id": "MDU6TGFiZWwyOTM0OTc3MTk0",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Flax",
"name": "Flax",
"color": "4862AD",
"default": false,
"description": ""
}
] | open | false | null | [] | null | [] | 2025-05-13T20:31:09 | 2025-07-15T09:55:55 | null | NONE | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38114",
"html_url": "https://github.com/huggingface/transformers/pull/38114",
"diff_url": "https://github.com/huggingface/transformers/pull/38114.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38114.patch",
"merged_at": null
} | What does this PR do?
This PR changes the default behavior of Flax→PyTorch conversion to always allocate real CPU tensors (by setting ```low_cpu_mem_usage=False``` in ```from_pretrained)``` and replaces the old offload logic in ```save_pretrained``` with an up-front detach-and-clone of the entire state_dict. This guarantees no meta-tensor placeholders or shared-storage checks, so converted ViT models now save and reload without errors.
Fixes #37999
Who can review?
Flax integration: @gante
PyTorch save/load logic: @Rocketknight1 | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38114/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38114/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38113 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38113/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38113/comments | https://api.github.com/repos/huggingface/transformers/issues/38113/events | https://github.com/huggingface/transformers/pull/38113 | 3,061,007,168 | PR_kwDOCUB6oc6WDWdS | 38,113 | Update trainer.md | {
"login": "guspuffygit",
"id": 85897841,
"node_id": "MDQ6VXNlcjg1ODk3ODQx",
"avatar_url": "https://avatars.githubusercontent.com/u/85897841?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/guspuffygit",
"html_url": "https://github.com/guspuffygit",
"followers_url": "https://api.github.com/users/guspuffygit/followers",
"following_url": "https://api.github.com/users/guspuffygit/following{/other_user}",
"gists_url": "https://api.github.com/users/guspuffygit/gists{/gist_id}",
"starred_url": "https://api.github.com/users/guspuffygit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/guspuffygit/subscriptions",
"organizations_url": "https://api.github.com/users/guspuffygit/orgs",
"repos_url": "https://api.github.com/users/guspuffygit/repos",
"events_url": "https://api.github.com/users/guspuffygit/events{/privacy}",
"received_events_url": "https://api.github.com/users/guspuffygit/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T19:22:09 | 2025-05-14T13:33:00 | 2025-05-14T12:40:01 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38113",
"html_url": "https://github.com/huggingface/transformers/pull/38113",
"diff_url": "https://github.com/huggingface/transformers/pull/38113.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38113.patch",
"merged_at": "2025-05-14T12:40:01"
} | Fix typo in torch.compile method parameters
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38113/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38113/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38112 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38112/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38112/comments | https://api.github.com/repos/huggingface/transformers/issues/38112/events | https://github.com/huggingface/transformers/pull/38112 | 3,060,690,792 | PR_kwDOCUB6oc6WCR8d | 38,112 | Add CausalLM support for ModernBert | {
"login": "yijun-lee",
"id": 119404328,
"node_id": "U_kgDOBx33KA",
"avatar_url": "https://avatars.githubusercontent.com/u/119404328?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yijun-lee",
"html_url": "https://github.com/yijun-lee",
"followers_url": "https://api.github.com/users/yijun-lee/followers",
"following_url": "https://api.github.com/users/yijun-lee/following{/other_user}",
"gists_url": "https://api.github.com/users/yijun-lee/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yijun-lee/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yijun-lee/subscriptions",
"organizations_url": "https://api.github.com/users/yijun-lee/orgs",
"repos_url": "https://api.github.com/users/yijun-lee/repos",
"events_url": "https://api.github.com/users/yijun-lee/events{/privacy}",
"received_events_url": "https://api.github.com/users/yijun-lee/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T16:58:24 | 2025-05-14T12:52:21 | 2025-05-14T12:52:21 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38112",
"html_url": "https://github.com/huggingface/transformers/pull/38112",
"diff_url": "https://github.com/huggingface/transformers/pull/38112.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38112.patch",
"merged_at": null
} | # What does this PR do?
Added CausalLM support for ModernBert as part of #35385. If there are no issues, I plan to follow up with a commit enabling its use in EncoderDecoderModel.
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes #35385
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@ArthurZucker
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "yijun-lee",
"id": 119404328,
"node_id": "U_kgDOBx33KA",
"avatar_url": "https://avatars.githubusercontent.com/u/119404328?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yijun-lee",
"html_url": "https://github.com/yijun-lee",
"followers_url": "https://api.github.com/users/yijun-lee/followers",
"following_url": "https://api.github.com/users/yijun-lee/following{/other_user}",
"gists_url": "https://api.github.com/users/yijun-lee/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yijun-lee/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yijun-lee/subscriptions",
"organizations_url": "https://api.github.com/users/yijun-lee/orgs",
"repos_url": "https://api.github.com/users/yijun-lee/repos",
"events_url": "https://api.github.com/users/yijun-lee/events{/privacy}",
"received_events_url": "https://api.github.com/users/yijun-lee/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38112/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38112/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38111 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38111/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38111/comments | https://api.github.com/repos/huggingface/transformers/issues/38111/events | https://github.com/huggingface/transformers/pull/38111 | 3,060,664,888 | PR_kwDOCUB6oc6WCMWw | 38,111 | Support TP for save_pretrained() | {
"login": "amd-xiaoyu12",
"id": 188109516,
"node_id": "U_kgDOCzZSzA",
"avatar_url": "https://avatars.githubusercontent.com/u/188109516?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amd-xiaoyu12",
"html_url": "https://github.com/amd-xiaoyu12",
"followers_url": "https://api.github.com/users/amd-xiaoyu12/followers",
"following_url": "https://api.github.com/users/amd-xiaoyu12/following{/other_user}",
"gists_url": "https://api.github.com/users/amd-xiaoyu12/gists{/gist_id}",
"starred_url": "https://api.github.com/users/amd-xiaoyu12/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/amd-xiaoyu12/subscriptions",
"organizations_url": "https://api.github.com/users/amd-xiaoyu12/orgs",
"repos_url": "https://api.github.com/users/amd-xiaoyu12/repos",
"events_url": "https://api.github.com/users/amd-xiaoyu12/events{/privacy}",
"received_events_url": "https://api.github.com/users/amd-xiaoyu12/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-05-13T16:46:17 | 2025-05-15T07:33:26 | null | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38111",
"html_url": "https://github.com/huggingface/transformers/pull/38111",
"diff_url": "https://github.com/huggingface/transformers/pull/38111.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38111.patch",
"merged_at": null
} | # What does this PR do?
When a model is quantized using TP, the save_pretrained() need to support TP when quantization is done and export to model to safetensor file.
The modification will combine the DTensor to full tensor in process which rank = 0 and export the result, for process with other rank number, the exporting task will be skipped.
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38111/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38111/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38110 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38110/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38110/comments | https://api.github.com/repos/huggingface/transformers/issues/38110/events | https://github.com/huggingface/transformers/pull/38110 | 3,060,512,632 | PR_kwDOCUB6oc6WBrG7 | 38,110 | [CSM] update test for t4 runners | {
"login": "eustlb",
"id": 94853470,
"node_id": "U_kgDOBadZXg",
"avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eustlb",
"html_url": "https://github.com/eustlb",
"followers_url": "https://api.github.com/users/eustlb/followers",
"following_url": "https://api.github.com/users/eustlb/following{/other_user}",
"gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eustlb/subscriptions",
"organizations_url": "https://api.github.com/users/eustlb/orgs",
"repos_url": "https://api.github.com/users/eustlb/repos",
"events_url": "https://api.github.com/users/eustlb/events{/privacy}",
"received_events_url": "https://api.github.com/users/eustlb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T15:44:40 | 2025-05-13T15:59:26 | 2025-05-13T15:59:26 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38110",
"html_url": "https://github.com/huggingface/transformers/pull/38110",
"diff_url": "https://github.com/huggingface/transformers/pull/38110.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38110.patch",
"merged_at": "2025-05-13T15:59:26"
} | This updates the tests for our CI T4 runners.
I obtained the values using the original code reproducer scripts (see test descriptions), which I ran on T4 runners.
cc @ydshieh 😊
| {
"login": "eustlb",
"id": 94853470,
"node_id": "U_kgDOBadZXg",
"avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eustlb",
"html_url": "https://github.com/eustlb",
"followers_url": "https://api.github.com/users/eustlb/followers",
"following_url": "https://api.github.com/users/eustlb/following{/other_user}",
"gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eustlb/subscriptions",
"organizations_url": "https://api.github.com/users/eustlb/orgs",
"repos_url": "https://api.github.com/users/eustlb/repos",
"events_url": "https://api.github.com/users/eustlb/events{/privacy}",
"received_events_url": "https://api.github.com/users/eustlb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38110/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38110/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38109 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38109/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38109/comments | https://api.github.com/repos/huggingface/transformers/issues/38109/events | https://github.com/huggingface/transformers/pull/38109 | 3,060,493,836 | PR_kwDOCUB6oc6WBnCf | 38,109 | try | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T15:37:46 | 2025-05-13T17:27:23 | 2025-05-13T17:27:23 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38109",
"html_url": "https://github.com/huggingface/transformers/pull/38109",
"diff_url": "https://github.com/huggingface/transformers/pull/38109.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38109.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38109/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38109/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38108 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38108/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38108/comments | https://api.github.com/repos/huggingface/transformers/issues/38108/events | https://github.com/huggingface/transformers/pull/38108 | 3,060,411,315 | PR_kwDOCUB6oc6WBVBg | 38,108 | 🔴🔴🔴 [`Attention`] Refactor Attention Interface for Bart-based Models | {
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/followers",
"following_url": "https://api.github.com/users/vasqu/following{/other_user}",
"gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vasqu/subscriptions",
"organizations_url": "https://api.github.com/users/vasqu/orgs",
"repos_url": "https://api.github.com/users/vasqu/repos",
"events_url": "https://api.github.com/users/vasqu/events{/privacy}",
"received_events_url": "https://api.github.com/users/vasqu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T15:10:28 | 2025-05-22T15:13:01 | 2025-05-22T15:12:59 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38108",
"html_url": "https://github.com/huggingface/transformers/pull/38108",
"diff_url": "https://github.com/huggingface/transformers/pull/38108.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38108.patch",
"merged_at": "2025-05-22T15:12:58"
} | This PR is gonna tackle two things in general:
- [x] Flex Attention for all base attention types (encoder, decoder, encoder-decoder cross)
- [x] New Attention Interface for a bunch of models (mostly based on Bart's implementation)
- [x] As a bonus, some models have already been refactored into modular if seen fit, e.g. PL Bart.
Affected models (will be updated when I have enough time) - probably not 100% accurate
- [x] Bart
- [x] Biogpt
- [x] Mbart
- [x] Bigbird Pegasus
- [x] Blenderbot
- [x] Blenderbot Small
- [x] Wav2vec
- [x] Data2vec audio
- [x] Informer
- [x] Timeseries Transformer
- [x] Hubert
- [x] M2M 100
- [x] Marian
- [x] Musicgen
- [x] Music Melody
- [x] Nllb MoE
- [x] Patchts Mixer
- [x] Patchtst
- [x] Pegasus
- [x] Pegasus X
- [x] PL Bart
- [x] Sew
- [x] Speech to Text
- [x] Uni Speech
- [x] Uni Speech SAT
Possibly doable in this PR:
- [ ] Rename flex attn mask creation since the focus is currently only on decoder-only ones, the naming is not really fitting
Worth a discussion (?):
- Time Series Transformer and Speech to Text theoretically have the new attentions but the tests are unsuitable (as they generate inputs with ids). Imo, a rewrite/adjustment is too much effort. I disabled it for now but we could also enable them and give the users a warning (untested).
- Pegasus X has more flaky logits on sdpa - should it still be enabled?
- Nllb MoE has issues with flash attention as masks are prepared differently and used by the MoE leading to several issues + sdpa has also more flaky logits --> I think it's not worth it based on the size of the model + the usage.
Worth a discussion TL;DR:
- Timer Series Transformer + Speech to Text do not work with the current testing framework (need special attention to what they get as input)
- Pegasus X has more flaky logits on sdpa - should it still be enabled?
- Nllb MoE is too complicated to enable fastly for Flash Attention (attention mask <-> MoE interaction) + same as pegasus x on sdpa logits.
Future PRs will address:
- [ ] Other models such as Bert-based models.
- [ ] Other models such as Whisper-based models.
- [ ] More modular.
- [ ] Proper kwargs passing --> multiple attn types should get different kwargs?
- [ ] In combination with above point ^ optionally tp plans etc for attn backend support?
- [ ] Fa might be able to work without position ids but fa_kwargs only - this is a current limitation in the fa modeling utils...
- [ ] Mask refactor as iirc the prepare_for... are not really neat. However, the way it is written, refactoring should be easy ;)
- [ ] Flex Attn tests and fixes (compile issues)
- [ ] Better datasets versioning? Pipeline issues esp with audio | {
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/followers",
"following_url": "https://api.github.com/users/vasqu/following{/other_user}",
"gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vasqu/subscriptions",
"organizations_url": "https://api.github.com/users/vasqu/orgs",
"repos_url": "https://api.github.com/users/vasqu/repos",
"events_url": "https://api.github.com/users/vasqu/events{/privacy}",
"received_events_url": "https://api.github.com/users/vasqu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38108/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 2,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38108/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38107 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38107/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38107/comments | https://api.github.com/repos/huggingface/transformers/issues/38107/events | https://github.com/huggingface/transformers/pull/38107 | 3,060,328,466 | PR_kwDOCUB6oc6WBDHW | 38,107 | fix `check_bad commit.py` gives wrong results | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T14:42:24 | 2025-05-13T14:59:11 | 2025-05-13T14:58:22 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38107",
"html_url": "https://github.com/huggingface/transformers/pull/38107",
"diff_url": "https://github.com/huggingface/transformers/pull/38107.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38107.patch",
"merged_at": "2025-05-13T14:58:22"
} | # What does this PR do?
During bisecting, it uses
> elif f"{target_test} FAILED" in result.stdout:
to determine a commit fails a test or not.
However, since #35912, the output is kind altered with `live log call`, and the results are wrong (it think every commit passes).
This PR fixes the issue by checking
> elif f"FAILED {target_test}" in result.stdout:
(at `short test summary info`)
#### current output running on main
> tests/models/instructblipvideo/test_modeling_instructblipvideo.py::InstructBlipVideoModelIntegrationTest::test_expansion_in_processing
-------------------------------- live log call ---------------------------------
WARNING transformers.video_processing_utils:logging.py:[32](https://github.com/huggingface/transformers/actions/runs/14999267405/job/42141539393#step:5:33)8 You have video processor config saved in `preprocessor.json` file which is deprecated. Video processor configs should be saved in their own `video_preprocessor.json` file. You can rename the file or load and save the processor back which renames it automatically. Loading from `preprocessor.json` will be removed in v5.0.
FAILED | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38107/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38107/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38106 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38106/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38106/comments | https://api.github.com/repos/huggingface/transformers/issues/38106/events | https://github.com/huggingface/transformers/pull/38106 | 3,060,246,316 | PR_kwDOCUB6oc6WAxRo | 38,106 | More typing in src/transformers/training_args.py | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T14:16:50 | 2025-07-17T13:46:13 | 2025-05-22T11:14:33 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38106",
"html_url": "https://github.com/huggingface/transformers/pull/38106",
"diff_url": "https://github.com/huggingface/transformers/pull/38106.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38106.patch",
"merged_at": "2025-05-22T11:14:33"
} | # What does this PR do?
More typing in `TrainingArguments` | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38106/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38106/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38105 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38105/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38105/comments | https://api.github.com/repos/huggingface/transformers/issues/38105/events | https://github.com/huggingface/transformers/pull/38105 | 3,060,226,529 | PR_kwDOCUB6oc6WAs_o | 38,105 | [video processors] support frame sampling within processors | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T14:10:18 | 2025-06-12T09:34:31 | 2025-06-12T09:34:30 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38105",
"html_url": "https://github.com/huggingface/transformers/pull/38105",
"diff_url": "https://github.com/huggingface/transformers/pull/38105.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38105.patch",
"merged_at": "2025-06-12T09:34:30"
} | # What does this PR do?
Now that we have video processors separate from image processors, the next step is to keep all video-specific processing there. This PR refactors how we sample video frames
Before:
- Frames can be sampled only with `apply_chat_template` and require one to pass a callable `sampling_fn` for model-specific cases. Cannot handle non-common kwargs.
- Users have to sample frames themselves if they prefer to not use chat templates
Now:
- Video sampling is the first step in `self.video_processor`. Each model can define their own logic and use all kwargs defined in video processing config.
- Users can pass a whole video and expect the processor to sample it the best way, as the model expects
- Chat template code cleaned up. Now it only loads the whole video/image/audio and formats the text. Everything else is done by respective processors
Note: For SmolVLM this is quite difficult to implement without breaking because the model never had a `video_token` and treated videos as sequence of images. To keep BC we would have to update chat template for all models on the hub which is not possible. So an ugly workaround is to keep default chat template in code
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38105/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38105/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38104 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38104/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38104/comments | https://api.github.com/repos/huggingface/transformers/issues/38104/events | https://github.com/huggingface/transformers/pull/38104 | 3,060,054,047 | PR_kwDOCUB6oc6WAHkY | 38,104 | [video processor] fix tests | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T13:16:15 | 2025-05-14T10:24:08 | 2025-05-14T10:24:08 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38104",
"html_url": "https://github.com/huggingface/transformers/pull/38104",
"diff_url": "https://github.com/huggingface/transformers/pull/38104.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38104.patch",
"merged_at": "2025-05-14T10:24:08"
} | # What does this PR do?
Fixes a few tests that started failing after https://github.com/huggingface/transformers/pull/35206 | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38104/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38104/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38103 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38103/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38103/comments | https://api.github.com/repos/huggingface/transformers/issues/38103/events | https://github.com/huggingface/transformers/pull/38103 | 3,060,022,562 | PR_kwDOCUB6oc6WAAsJ | 38,103 | Fix incorrect batching audio index calculation for Phi-4-Multimodal | {
"login": "Isotr0py",
"id": 41363108,
"node_id": "MDQ6VXNlcjQxMzYzMTA4",
"avatar_url": "https://avatars.githubusercontent.com/u/41363108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Isotr0py",
"html_url": "https://github.com/Isotr0py",
"followers_url": "https://api.github.com/users/Isotr0py/followers",
"following_url": "https://api.github.com/users/Isotr0py/following{/other_user}",
"gists_url": "https://api.github.com/users/Isotr0py/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Isotr0py/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Isotr0py/subscriptions",
"organizations_url": "https://api.github.com/users/Isotr0py/orgs",
"repos_url": "https://api.github.com/users/Isotr0py/repos",
"events_url": "https://api.github.com/users/Isotr0py/events{/privacy}",
"received_events_url": "https://api.github.com/users/Isotr0py/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T13:06:05 | 2025-05-26T13:41:02 | 2025-05-26T12:41:31 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38103",
"html_url": "https://github.com/huggingface/transformers/pull/38103",
"diff_url": "https://github.com/huggingface/transformers/pull/38103.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38103.patch",
"merged_at": "2025-05-26T12:41:31"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes [#38098](https://github.com/huggingface/transformers/issues/38098) (issue)
- ~~Just notice there is no feature extractor test for Phi-4-MM, let me add one for it, so mark this as draft currently.~~
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Isotr0py",
"id": 41363108,
"node_id": "MDQ6VXNlcjQxMzYzMTA4",
"avatar_url": "https://avatars.githubusercontent.com/u/41363108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Isotr0py",
"html_url": "https://github.com/Isotr0py",
"followers_url": "https://api.github.com/users/Isotr0py/followers",
"following_url": "https://api.github.com/users/Isotr0py/following{/other_user}",
"gists_url": "https://api.github.com/users/Isotr0py/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Isotr0py/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Isotr0py/subscriptions",
"organizations_url": "https://api.github.com/users/Isotr0py/orgs",
"repos_url": "https://api.github.com/users/Isotr0py/repos",
"events_url": "https://api.github.com/users/Isotr0py/events{/privacy}",
"received_events_url": "https://api.github.com/users/Isotr0py/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38103/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38103/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38102 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38102/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38102/comments | https://api.github.com/repos/huggingface/transformers/issues/38102/events | https://github.com/huggingface/transformers/pull/38102 | 3,060,005,585 | PR_kwDOCUB6oc6V_862 | 38,102 | Add style bot | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T13:01:30 | 2025-05-13T17:07:19 | 2025-05-13T17:07:18 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38102",
"html_url": "https://github.com/huggingface/transformers/pull/38102",
"diff_url": "https://github.com/huggingface/transformers/pull/38102.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38102.patch",
"merged_at": "2025-05-13T17:07:17"
} | # What does this PR do ?
This PR adds the style bot from `huggingface_hub` (same as in accelerate / diffusers)
To run, in a PR call @bot /style
cc @hanouticelina @Wauplin @ydshieh @ArthurZucker | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38102/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38102/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38101 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38101/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38101/comments | https://api.github.com/repos/huggingface/transformers/issues/38101/events | https://github.com/huggingface/transformers/pull/38101 | 3,059,897,088 | PR_kwDOCUB6oc6V_lA9 | 38,101 | disable deepspeed when setting up fake trainer | {
"login": "winglian",
"id": 381258,
"node_id": "MDQ6VXNlcjM4MTI1OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/winglian",
"html_url": "https://github.com/winglian",
"followers_url": "https://api.github.com/users/winglian/followers",
"following_url": "https://api.github.com/users/winglian/following{/other_user}",
"gists_url": "https://api.github.com/users/winglian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/winglian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/winglian/subscriptions",
"organizations_url": "https://api.github.com/users/winglian/orgs",
"repos_url": "https://api.github.com/users/winglian/repos",
"events_url": "https://api.github.com/users/winglian/events{/privacy}",
"received_events_url": "https://api.github.com/users/winglian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T12:25:57 | 2025-05-15T13:34:05 | 2025-05-15T13:34:05 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38101",
"html_url": "https://github.com/huggingface/transformers/pull/38101",
"diff_url": "https://github.com/huggingface/transformers/pull/38101.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38101.patch",
"merged_at": "2025-05-15T13:34:05"
} | # What does this PR do?
When training with deepspeed and `WANDB_LOG_MODEL` is enabled, on train end, a fake trainer is created on the train end to create artifacts, but results in https://gist.github.com/winglian/e4640bb86860678ba2415cde54d986f4
This removes deepspeed from the args to workaround that issue.
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38101/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38101/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38100 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38100/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38100/comments | https://api.github.com/repos/huggingface/transformers/issues/38100/events | https://github.com/huggingface/transformers/pull/38100 | 3,059,722,986 | PR_kwDOCUB6oc6V--N9 | 38,100 | Fix amp deprecation issue | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T11:29:51 | 2025-06-02T14:15:43 | 2025-06-02T14:15:41 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38100",
"html_url": "https://github.com/huggingface/transformers/pull/38100",
"diff_url": "https://github.com/huggingface/transformers/pull/38100.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38100.patch",
"merged_at": "2025-06-02T14:15:41"
} | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/38095
`apex.amp` is deprecated, so we need to make sure the user have access to `amp` when they set `half_precision_backend = "apex"`. Also we remove import at the beginning of trainer.py file and import it when needed. | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38100/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38100/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38099 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38099/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38099/comments | https://api.github.com/repos/huggingface/transformers/issues/38099/events | https://github.com/huggingface/transformers/pull/38099 | 3,059,670,797 | PR_kwDOCUB6oc6V-yoQ | 38,099 | [smolvlm] skip the test | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T11:12:17 | 2025-05-13T12:50:44 | 2025-05-13T12:50:43 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38099",
"html_url": "https://github.com/huggingface/transformers/pull/38099",
"diff_url": "https://github.com/huggingface/transformers/pull/38099.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38099.patch",
"merged_at": "2025-05-13T12:50:43"
} | # What does this PR do?
As per title, we should skip the test and fix later | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38099/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38099/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38098 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38098/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38098/comments | https://api.github.com/repos/huggingface/transformers/issues/38098/events | https://github.com/huggingface/transformers/issues/38098 | 3,059,507,113 | I_kwDOCUB6oc62XF-p | 38,098 | [Bug] Phi-4-multimodal audio processor failed to process multiple audios with close length | {
"login": "Isotr0py",
"id": 41363108,
"node_id": "MDQ6VXNlcjQxMzYzMTA4",
"avatar_url": "https://avatars.githubusercontent.com/u/41363108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Isotr0py",
"html_url": "https://github.com/Isotr0py",
"followers_url": "https://api.github.com/users/Isotr0py/followers",
"following_url": "https://api.github.com/users/Isotr0py/following{/other_user}",
"gists_url": "https://api.github.com/users/Isotr0py/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Isotr0py/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Isotr0py/subscriptions",
"organizations_url": "https://api.github.com/users/Isotr0py/orgs",
"repos_url": "https://api.github.com/users/Isotr0py/repos",
"events_url": "https://api.github.com/users/Isotr0py/events{/privacy}",
"received_events_url": "https://api.github.com/users/Isotr0py/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-13T10:12:04 | 2025-05-26T12:41:32 | 2025-05-26T12:41:32 | COLLABORATOR | null | null | null | null | ### System Info
None
### Who can help?
@zucchini-nlp @eustlb
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
When migrating the Phi-4-MM implementation to HF-format in vLLM, I found the audio processor failed to process multiple audios in similar length, which is probably a bug in audio processor.
Reproducible example:
```python3
import numpy as np
from transformers import AutoProcessor
# Define model path
model_path = "microsoft/Phi-4-multimodal-instruct"
# Load model and processor
processor = AutoProcessor.from_pretrained(model_path, revision="refs/pr/70")
audio = [np.zeros(512), np.zeros(620)]
inputs = processor(text="<|audio|><|audio|>", audio=audio, sampling_rate=16000)
display(inputs)
```
This will raise error:
```text
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
/tmp/ipykernel_35/3211980912.py in <cell line: 0>()
9
10 audio = [np.zeros(512), np.zeros(620)]
---> 11 inputs = processor(text="<|audio|><|audio|>", audio=audio, sampling_rate=16000)
12
13 display(inputs)
/usr/local/lib/python3.11/dist-packages/transformers/models/phi4_multimodal/processing_phi4_multimodal.py in __call__(self, text, images, audio, **kwargs)
117
118 image_inputs = self.image_processor(images, **image_kwargs) if images is not None else {}
--> 119 audio_inputs = self.audio_processor(audio, **audio_kwargs) if audio is not None else {}
120
121 # We pop here for images as we don't need it later
/usr/local/lib/python3.11/dist-packages/transformers/models/phi4_multimodal/feature_extraction_phi4_multimodal.py in __call__(self, raw_speech, sampling_rate, pad_to_multiple_of, padding, max_length, truncation, return_tensors, return_attention_mask, device, **kwargs)
245 audio_lengths = padded_inputs.audio_lengths
246
--> 247 input_features = self._torch_extract_fbank_features(input_features, audio_lengths, device)
248
249 feature_lengths = (audio_lengths - self.win_length) // self.hop_length + 1
/usr/local/lib/python3.11/dist-packages/transformers/models/phi4_multimodal/feature_extraction_phi4_multimodal.py in _torch_extract_fbank_features(self, waveform, audio_lengths, device)
310 )
311 mask = mask.unsqueeze(-1).expand(-1, -1, self.win_length)
--> 312 masked_frames = frames[to_mask_batch_idxs, offset_idx:max_idx].masked_fill_(mask, 0)
313 frames[to_mask_batch_idxs, offset_idx:max_idx] = masked_frames
314 # ---
RuntimeError: output with shape [1, 1, 400] doesn't match the broadcast shape [1, 3, 400]
```
If we changed these two audio to `length=51000` and `length=51100` respectively, another error will be raised at the same line:
```python3
import numpy as np
from transformers import AutoProcessor
# Define model path
model_path = "microsoft/Phi-4-multimodal-instruct"
# Load model and processor
processor = AutoProcessor.from_pretrained(model_path, revision="refs/pr/70")
audio = [np.zeros(51000), np.zeros(51100)]
inputs = processor(text="<|audio|><|audio|>", audio=audio, sampling_rate=16000)
display(inputs)
```
Error:
```
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
/tmp/ipykernel_35/4075216193.py in <cell line: 0>()
9
10 audio = [np.zeros(51000), np.zeros(51100)]
---> 11 inputs = processor(text="<|audio|><|audio|>", audio=audio, sampling_rate=16000)
12
13 display(inputs)
/usr/local/lib/python3.11/dist-packages/transformers/models/phi4_multimodal/processing_phi4_multimodal.py in __call__(self, text, images, audio, **kwargs)
117
118 image_inputs = self.image_processor(images, **image_kwargs) if images is not None else {}
--> 119 audio_inputs = self.audio_processor(audio, **audio_kwargs) if audio is not None else {}
120
121 # We pop here for images as we don't need it later
/usr/local/lib/python3.11/dist-packages/transformers/models/phi4_multimodal/feature_extraction_phi4_multimodal.py in __call__(self, raw_speech, sampling_rate, pad_to_multiple_of, padding, max_length, truncation, return_tensors, return_attention_mask, device, **kwargs)
245 audio_lengths = padded_inputs.audio_lengths
246
--> 247 input_features = self._torch_extract_fbank_features(input_features, audio_lengths, device)
248
249 feature_lengths = (audio_lengths - self.win_length) // self.hop_length + 1
/usr/local/lib/python3.11/dist-packages/transformers/models/phi4_multimodal/feature_extraction_phi4_multimodal.py in _torch_extract_fbank_features(self, waveform, audio_lengths, device)
310 )
311 mask = mask.unsqueeze(-1).expand(-1, -1, self.win_length)
--> 312 masked_frames = frames[to_mask_batch_idxs, offset_idx:max_idx].masked_fill_(mask, 0)
313 frames[to_mask_batch_idxs, offset_idx:max_idx] = masked_frames
314 # ---
RuntimeError: The size of tensor a (0) must match the size of tensor b (2) at non-singleton dimension 1
```
### Expected behavior
However, for original processor, these audios can be processed expectedly:
```python3
import numpy as np
from transformers import AutoProcessor
# Define model path
model_path = "microsoft/Phi-4-multimodal-instruct"
# Load model and processor
processor = AutoProcessor.from_pretrained(model_path, trust_remote_code=True)
audio = [(np.zeros(512), 16000), (np.zeros(620), 16000)]
inputs = processor(text="<|audio_1|><|audio_2|>", audios=audio)
print("audio_embed_sizes:", inputs.audio_embed_sizes)
audio = [(np.zeros(51000), 16000), (np.zeros(51100), 16000)]
inputs = processor(text="<|audio_1|><|audio_2|>", audios=audio)
print("audio_embed_sizes:", inputs.audio_embed_sizes)
```
And the outputs are reasonable:
```
audio_embed_sizes: tensor([1, 1])
audio_embed_sizes: tensor([40, 40])
``` | {
"login": "Isotr0py",
"id": 41363108,
"node_id": "MDQ6VXNlcjQxMzYzMTA4",
"avatar_url": "https://avatars.githubusercontent.com/u/41363108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Isotr0py",
"html_url": "https://github.com/Isotr0py",
"followers_url": "https://api.github.com/users/Isotr0py/followers",
"following_url": "https://api.github.com/users/Isotr0py/following{/other_user}",
"gists_url": "https://api.github.com/users/Isotr0py/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Isotr0py/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Isotr0py/subscriptions",
"organizations_url": "https://api.github.com/users/Isotr0py/orgs",
"repos_url": "https://api.github.com/users/Isotr0py/repos",
"events_url": "https://api.github.com/users/Isotr0py/events{/privacy}",
"received_events_url": "https://api.github.com/users/Isotr0py/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38098/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/38098/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38097 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38097/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38097/comments | https://api.github.com/repos/huggingface/transformers/issues/38097/events | https://github.com/huggingface/transformers/pull/38097 | 3,058,834,826 | PR_kwDOCUB6oc6V78eY | 38,097 | BatchEncoding.to() get dtype | {
"login": "HERIUN",
"id": 25131767,
"node_id": "MDQ6VXNlcjI1MTMxNzY3",
"avatar_url": "https://avatars.githubusercontent.com/u/25131767?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HERIUN",
"html_url": "https://github.com/HERIUN",
"followers_url": "https://api.github.com/users/HERIUN/followers",
"following_url": "https://api.github.com/users/HERIUN/following{/other_user}",
"gists_url": "https://api.github.com/users/HERIUN/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HERIUN/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HERIUN/subscriptions",
"organizations_url": "https://api.github.com/users/HERIUN/orgs",
"repos_url": "https://api.github.com/users/HERIUN/repos",
"events_url": "https://api.github.com/users/HERIUN/events{/privacy}",
"received_events_url": "https://api.github.com/users/HERIUN/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T05:48:39 | 2025-05-13T06:09:50 | 2025-05-13T06:07:39 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38097",
"html_url": "https://github.com/huggingface/transformers/pull/38097",
"diff_url": "https://github.com/huggingface/transformers/pull/38097.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38097.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
https://github.com/huggingface/transformers/issues/38096
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "HERIUN",
"id": 25131767,
"node_id": "MDQ6VXNlcjI1MTMxNzY3",
"avatar_url": "https://avatars.githubusercontent.com/u/25131767?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HERIUN",
"html_url": "https://github.com/HERIUN",
"followers_url": "https://api.github.com/users/HERIUN/followers",
"following_url": "https://api.github.com/users/HERIUN/following{/other_user}",
"gists_url": "https://api.github.com/users/HERIUN/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HERIUN/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HERIUN/subscriptions",
"organizations_url": "https://api.github.com/users/HERIUN/orgs",
"repos_url": "https://api.github.com/users/HERIUN/repos",
"events_url": "https://api.github.com/users/HERIUN/events{/privacy}",
"received_events_url": "https://api.github.com/users/HERIUN/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38097/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38097/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38096 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38096/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38096/comments | https://api.github.com/repos/huggingface/transformers/issues/38096/events | https://github.com/huggingface/transformers/issues/38096 | 3,058,805,797 | I_kwDOCUB6oc62Uawl | 38,096 | BatchEncoding.to(device, dtype) could be worked!! | {
"login": "HERIUN",
"id": 25131767,
"node_id": "MDQ6VXNlcjI1MTMxNzY3",
"avatar_url": "https://avatars.githubusercontent.com/u/25131767?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HERIUN",
"html_url": "https://github.com/HERIUN",
"followers_url": "https://api.github.com/users/HERIUN/followers",
"following_url": "https://api.github.com/users/HERIUN/following{/other_user}",
"gists_url": "https://api.github.com/users/HERIUN/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HERIUN/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HERIUN/subscriptions",
"organizations_url": "https://api.github.com/users/HERIUN/orgs",
"repos_url": "https://api.github.com/users/HERIUN/repos",
"events_url": "https://api.github.com/users/HERIUN/events{/privacy}",
"received_events_url": "https://api.github.com/users/HERIUN/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | closed | false | null | [] | null | [] | 2025-05-13T05:31:03 | 2025-05-15T02:55:14 | 2025-05-15T02:06:11 | CONTRIBUTOR | null | null | null | null | ### Feature request
BatchEncoding.to() only takes ```device``` and ```non_blocking```. but need to take ```dtype``` also
### Motivation
I tested qwen2.5-3B-instruct model.
but
```python
inputs = processor(**input, return_tensors='pt').to(device=model.device, dtype=model.dtype)
```
can't work. because processor retrun BatchEncoding which doesn't support dtype convert!!
### Your contribution | {
"login": "HERIUN",
"id": 25131767,
"node_id": "MDQ6VXNlcjI1MTMxNzY3",
"avatar_url": "https://avatars.githubusercontent.com/u/25131767?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HERIUN",
"html_url": "https://github.com/HERIUN",
"followers_url": "https://api.github.com/users/HERIUN/followers",
"following_url": "https://api.github.com/users/HERIUN/following{/other_user}",
"gists_url": "https://api.github.com/users/HERIUN/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HERIUN/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HERIUN/subscriptions",
"organizations_url": "https://api.github.com/users/HERIUN/orgs",
"repos_url": "https://api.github.com/users/HERIUN/repos",
"events_url": "https://api.github.com/users/HERIUN/events{/privacy}",
"received_events_url": "https://api.github.com/users/HERIUN/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38096/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38096/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38095 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38095/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38095/comments | https://api.github.com/repos/huggingface/transformers/issues/38095/events | https://github.com/huggingface/transformers/issues/38095 | 3,058,649,883 | I_kwDOCUB6oc62T0sb | 38,095 | ImportError: cannot import name 'amp' from 'apex' | {
"login": "Jintao-Huang",
"id": 45290347,
"node_id": "MDQ6VXNlcjQ1MjkwMzQ3",
"avatar_url": "https://avatars.githubusercontent.com/u/45290347?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Jintao-Huang",
"html_url": "https://github.com/Jintao-Huang",
"followers_url": "https://api.github.com/users/Jintao-Huang/followers",
"following_url": "https://api.github.com/users/Jintao-Huang/following{/other_user}",
"gists_url": "https://api.github.com/users/Jintao-Huang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Jintao-Huang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Jintao-Huang/subscriptions",
"organizations_url": "https://api.github.com/users/Jintao-Huang/orgs",
"repos_url": "https://api.github.com/users/Jintao-Huang/repos",
"events_url": "https://api.github.com/users/Jintao-Huang/events{/privacy}",
"received_events_url": "https://api.github.com/users/Jintao-Huang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-13T03:15:56 | 2025-06-02T14:15:42 | 2025-06-02T14:15:42 | CONTRIBUTOR | null | null | null | null | hello!
https://github.com/NVIDIA/apex/issues/1902 | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38095/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38095/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38094 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38094/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38094/comments | https://api.github.com/repos/huggingface/transformers/issues/38094/events | https://github.com/huggingface/transformers/pull/38094 | 3,058,143,660 | PR_kwDOCUB6oc6V5oXC | 38,094 | In Llama4 fix wrongly inverted causal attention mask when using SDPA implementation | {
"login": "sogartar",
"id": 3050579,
"node_id": "MDQ6VXNlcjMwNTA1Nzk=",
"avatar_url": "https://avatars.githubusercontent.com/u/3050579?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sogartar",
"html_url": "https://github.com/sogartar",
"followers_url": "https://api.github.com/users/sogartar/followers",
"following_url": "https://api.github.com/users/sogartar/following{/other_user}",
"gists_url": "https://api.github.com/users/sogartar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sogartar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sogartar/subscriptions",
"organizations_url": "https://api.github.com/users/sogartar/orgs",
"repos_url": "https://api.github.com/users/sogartar/repos",
"events_url": "https://api.github.com/users/sogartar/events{/privacy}",
"received_events_url": "https://api.github.com/users/sogartar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T21:05:58 | 2025-05-20T12:47:59 | 2025-05-20T12:47:59 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38094",
"html_url": "https://github.com/huggingface/transformers/pull/38094",
"diff_url": "https://github.com/huggingface/transformers/pull/38094.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38094.patch",
"merged_at": "2025-05-20T12:47:59"
} | When preparing the causal attention mask at this point the mask comes in as a float tensor with min value as a masked value.
It is not correct to convert it to bool and treat it as a bool mask as this inverts the mask. `torch.nn.functional.scaled_dot_product_attention` expects that a bool masked value is `False`.
I suspect that the `sdpa` implementation variant may not have been thoroughly tested and that is why this error was not caught earlier.
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38094/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38094/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38093 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38093/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38093/comments | https://api.github.com/repos/huggingface/transformers/issues/38093/events | https://github.com/huggingface/transformers/pull/38093 | 3,058,002,323 | PR_kwDOCUB6oc6V5Jsz | 38,093 | update `require_read_token` | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T20:06:55 | 2025-05-13T10:07:09 | 2025-05-13T10:07:07 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38093",
"html_url": "https://github.com/huggingface/transformers/pull/38093",
"diff_url": "https://github.com/huggingface/transformers/pull/38093.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38093.patch",
"merged_at": "2025-05-13T10:07:07"
} | # What does this PR do?
update `require_read_token`.
Currently, that decorator only works with function/method, but not for a class. The usage of it on classes will skip all the tests in those classes.
This PR fixes this issue. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38093/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38093/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38092 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38092/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38092/comments | https://api.github.com/repos/huggingface/transformers/issues/38092/events | https://github.com/huggingface/transformers/pull/38092 | 3,057,849,726 | PR_kwDOCUB6oc6V4oeG | 38,092 | Fix InternVL interpolate_pos_encoding and add to video_processing_auto | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T18:52:47 | 2025-05-13T15:18:41 | 2025-05-13T15:18:40 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38092",
"html_url": "https://github.com/huggingface/transformers/pull/38092",
"diff_url": "https://github.com/huggingface/transformers/pull/38092.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38092.patch",
"merged_at": "2025-05-13T15:18:40"
} | # What does this PR do?
Fix small issue with InternVL interpolate_pos_encoding function, and add InternVL to video_processing_auto (currently all integration tests are crashing on main).
Cc @zucchini-nlp for video processor | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38092/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38092/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38091 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38091/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38091/comments | https://api.github.com/repos/huggingface/transformers/issues/38091/events | https://github.com/huggingface/transformers/pull/38091 | 3,057,830,018 | PR_kwDOCUB6oc6V4keD | 38,091 | Don't drop dataset columns for custom collate functions | {
"login": "yaswanth19",
"id": 82788246,
"node_id": "MDQ6VXNlcjgyNzg4MjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/82788246?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yaswanth19",
"html_url": "https://github.com/yaswanth19",
"followers_url": "https://api.github.com/users/yaswanth19/followers",
"following_url": "https://api.github.com/users/yaswanth19/following{/other_user}",
"gists_url": "https://api.github.com/users/yaswanth19/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yaswanth19/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yaswanth19/subscriptions",
"organizations_url": "https://api.github.com/users/yaswanth19/orgs",
"repos_url": "https://api.github.com/users/yaswanth19/repos",
"events_url": "https://api.github.com/users/yaswanth19/events{/privacy}",
"received_events_url": "https://api.github.com/users/yaswanth19/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T18:44:09 | 2025-05-13T12:49:58 | 2025-05-13T12:49:58 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38091",
"html_url": "https://github.com/huggingface/transformers/pull/38091",
"diff_url": "https://github.com/huggingface/transformers/pull/38091.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38091.patch",
"merged_at": null
} | # What does this PR do?
The existing logic is we inspect model's forward and dataset's return dict keys() and drop unused keys. A simplest scenario where this logic will fail is if I return raw text and label from dataset class and then tokenize in collate_fn. Since model is not expecting the raw text as an input, we drop it but collate func code will break as there is no text to tokenize.
So, don't drop unused cols if a custom collate func is passed as the user might have other intentions and we can safely assume the user can handle any errors caused then on.
Actually the existing logic won't break for the above case if a tuple is passed from dataset to the data collator :sweat_smile: coz we don't know the columns names to drop in that case. So, better to standardize this when we also return dict
CC: @muellerzr for trainer and @ArthurZucker for core stuff | {
"login": "yaswanth19",
"id": 82788246,
"node_id": "MDQ6VXNlcjgyNzg4MjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/82788246?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yaswanth19",
"html_url": "https://github.com/yaswanth19",
"followers_url": "https://api.github.com/users/yaswanth19/followers",
"following_url": "https://api.github.com/users/yaswanth19/following{/other_user}",
"gists_url": "https://api.github.com/users/yaswanth19/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yaswanth19/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yaswanth19/subscriptions",
"organizations_url": "https://api.github.com/users/yaswanth19/orgs",
"repos_url": "https://api.github.com/users/yaswanth19/repos",
"events_url": "https://api.github.com/users/yaswanth19/events{/privacy}",
"received_events_url": "https://api.github.com/users/yaswanth19/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38091/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38091/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38090 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38090/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38090/comments | https://api.github.com/repos/huggingface/transformers/issues/38090/events | https://github.com/huggingface/transformers/pull/38090 | 3,057,683,592 | PR_kwDOCUB6oc6V4E5s | 38,090 | Refactor `get_XXX_dataloader` from Trainer | {
"login": "yaswanth19",
"id": 82788246,
"node_id": "MDQ6VXNlcjgyNzg4MjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/82788246?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yaswanth19",
"html_url": "https://github.com/yaswanth19",
"followers_url": "https://api.github.com/users/yaswanth19/followers",
"following_url": "https://api.github.com/users/yaswanth19/following{/other_user}",
"gists_url": "https://api.github.com/users/yaswanth19/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yaswanth19/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yaswanth19/subscriptions",
"organizations_url": "https://api.github.com/users/yaswanth19/orgs",
"repos_url": "https://api.github.com/users/yaswanth19/repos",
"events_url": "https://api.github.com/users/yaswanth19/events{/privacy}",
"received_events_url": "https://api.github.com/users/yaswanth19/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T17:42:45 | 2025-05-19T08:43:27 | 2025-05-19T08:43:27 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38090",
"html_url": "https://github.com/huggingface/transformers/pull/38090",
"diff_url": "https://github.com/huggingface/transformers/pull/38090.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38090.patch",
"merged_at": "2025-05-19T08:43:27"
} | # What does this PR do?
Refactors dataloader functions in Trainer | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38090/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38090/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38089 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38089/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38089/comments | https://api.github.com/repos/huggingface/transformers/issues/38089/events | https://github.com/huggingface/transformers/pull/38089 | 3,057,574,329 | PR_kwDOCUB6oc6V3tN6 | 38,089 | Omit creation of positional IDs within ESM if applicable | {
"login": "simonlevine",
"id": 50503513,
"node_id": "MDQ6VXNlcjUwNTAzNTEz",
"avatar_url": "https://avatars.githubusercontent.com/u/50503513?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/simonlevine",
"html_url": "https://github.com/simonlevine",
"followers_url": "https://api.github.com/users/simonlevine/followers",
"following_url": "https://api.github.com/users/simonlevine/following{/other_user}",
"gists_url": "https://api.github.com/users/simonlevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/simonlevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/simonlevine/subscriptions",
"organizations_url": "https://api.github.com/users/simonlevine/orgs",
"repos_url": "https://api.github.com/users/simonlevine/repos",
"events_url": "https://api.github.com/users/simonlevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/simonlevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T16:55:26 | 2025-05-15T14:10:18 | 2025-05-15T14:09:21 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38089",
"html_url": "https://github.com/huggingface/transformers/pull/38089",
"diff_url": "https://github.com/huggingface/transformers/pull/38089.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38089.patch",
"merged_at": "2025-05-15T14:09:21"
} | null | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38089/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38089/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38088 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38088/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38088/comments | https://api.github.com/repos/huggingface/transformers/issues/38088/events | https://github.com/huggingface/transformers/pull/38088 | 3,057,561,140 | PR_kwDOCUB6oc6V3qY6 | 38,088 | Disable report callbacks for certain training tests | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T16:49:29 | 2025-05-13T12:49:58 | 2025-05-13T12:49:56 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38088",
"html_url": "https://github.com/huggingface/transformers/pull/38088",
"diff_url": "https://github.com/huggingface/transformers/pull/38088.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38088.patch",
"merged_at": "2025-05-13T12:49:56"
} | This fixes the `No module named 'pynvml'` error in the affected tests on non-cuda devices.
~It does not address `test_auto_batch_size_finder` however, as that test runs `run_glue.py` directly.~
I tried adding `--report_to none` as an argument to the `run_glue.py` call. I'll check if it actually works. | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38088/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38088/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38087 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38087/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38087/comments | https://api.github.com/repos/huggingface/transformers/issues/38087/events | https://github.com/huggingface/transformers/pull/38087 | 3,057,325,406 | PR_kwDOCUB6oc6V23qH | 38,087 | Add optional RMSNorm support to BitNet quantization (config + layers) | {
"login": "Codys12",
"id": 41397239,
"node_id": "MDQ6VXNlcjQxMzk3MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/41397239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Codys12",
"html_url": "https://github.com/Codys12",
"followers_url": "https://api.github.com/users/Codys12/followers",
"following_url": "https://api.github.com/users/Codys12/following{/other_user}",
"gists_url": "https://api.github.com/users/Codys12/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Codys12/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Codys12/subscriptions",
"organizations_url": "https://api.github.com/users/Codys12/orgs",
"repos_url": "https://api.github.com/users/Codys12/repos",
"events_url": "https://api.github.com/users/Codys12/events{/privacy}",
"received_events_url": "https://api.github.com/users/Codys12/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T15:15:43 | 2025-05-16T10:38:21 | 2025-05-16T10:38:06 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38087",
"html_url": "https://github.com/huggingface/transformers/pull/38087",
"diff_url": "https://github.com/huggingface/transformers/pull/38087.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38087.patch",
"merged_at": "2025-05-16T10:38:06"
} | # What does this PR do?
**Adds optional RMSNorm support to BitNet-style quantisation.**
* Introduces `use_rms_norm` (bool, default False) and `rms_norm_eps` (float, default `1e-6`) to **`BitNetQuantConfig`** so the flag is serialisable via `save_pretrained / from_pretrained`.
* Updates **`BitLinear`** and **`AutoBitLinear`** to accept `use_rms_norm` and apply the reference `BitNetRMSNorm` to activations *before* quantisation.
## Before submitting
- [x] I read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), PR section.
- [x] I’ve added the new config fields to `to_dict`, docstrings, and the model card.
- [ ] New unit tests
- [x] Ran `make style && make quality && make test` locally.
- [ ] Documentation build passes (`make docs`) – pushed logs to CI.
## Motivation and context
RMSNorm stabilises the activations of low-bit networks; the BitNet paper shows a consistent perplexity drop when normalising pre-quant activations. This PR brings parity with the reference implementation while keeping the previous behaviour as default.
No new external dependencies.
## Who can review?
Quantization / Accelerate folks for the code:
@SunMarc @MekkCyber
Docstrings & config: @stevhliu
Feel free to jump in with any feedback!
| {
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38087/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38087/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38086 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38086/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38086/comments | https://api.github.com/repos/huggingface/transformers/issues/38086/events | https://github.com/huggingface/transformers/pull/38086 | 3,057,322,778 | PR_kwDOCUB6oc6V23GR | 38,086 | Refactor `MambaCache` to `modeling_mamba.py` | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T15:14:46 | 2025-07-21T12:59:36 | 2025-07-21T12:59:36 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38086",
"html_url": "https://github.com/huggingface/transformers/pull/38086",
"diff_url": "https://github.com/huggingface/transformers/pull/38086.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38086.patch",
"merged_at": "2025-07-21T12:59:36"
} | This PR moves the specialized `MambaCache` class from `cache_utils.py` to `src/transformers/models/mamba/modeling_mamba.py`. This is preliminary work for #38077.
**Changes:**
- Moved `MambaCache` to its own file, aligning with Zamba, Bamba, etc.
- Removed unnecessary Mamba-specific code from `generate`.
- Added Mamba cache init to `preparing_inits_from_generation` from `forward()`. See [this comment](https://github.com/huggingface/transformers/pull/38086#discussion_r2107581297).
- Why? Bamba, Jamba, GraniteMoeHybrid, Zamba, and Zamba2 had settled into initializing custom caches in `preparing_inits_from_generation` and only Mamba, Mamba2, and FalconMamba were doing it in `forward`, which is bad for torch.compile.
- We dont break BC with any import (thanks joao for the idea!)
- Cleaned up some Mamba and FalconMamba slow tests, which had been failing on main for a long time.
- Removed DDP Mamba tests. I had a DDP implementation for Mamba in 66b7162c7c5c5ce8a73ccf48cffc8a96343ebb33 so tests passed but removed it since DDP is not needed in Mamba, as per Joaos's instructions. | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38086/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38086/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38085 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38085/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38085/comments | https://api.github.com/repos/huggingface/transformers/issues/38085/events | https://github.com/huggingface/transformers/pull/38085 | 3,057,274,957 | PR_kwDOCUB6oc6V2sot | 38,085 | Add CB | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T14:59:10 | 2025-05-23T15:25:14 | 2025-05-22T15:43:49 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38085",
"html_url": "https://github.com/huggingface/transformers/pull/38085",
"diff_url": "https://github.com/huggingface/transformers/pull/38085.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38085.patch",
"merged_at": "2025-05-22T15:43:48"
} | # What does this PR do?
Snippet:
```python
import time
import datasets
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation import GenerationConfig
torch.set_float32_matmul_precision("high")
model_id = "meta-llama/Llama-3.2-3b-Instruct"
model = AutoModelForCausalLM.from_pretrained(
model_id, attn_implementation="paged_attention", torch_dtype=torch.bfloat16, device_map="auto"
).eval()
tokenizer = AutoTokenizer.from_pretrained(model_id, padding_side="left")
generation_config = GenerationConfig(
max_new_tokens=512,
eos_token_id=tokenizer.eos_token_id,
pad_token_id=tokenizer.pad_token_id,
use_cache=False,
num_blocks=2048,
block_size=128,
do_sample=True,
max_batch_tokens=1024, # Maximum number of tokens to process in a single batch
)
train_dataset = datasets.load_dataset("openai/gsm8k", "socratic", split="test")
# --- Example 1: Simple Version using generate_batch ---
print("--- Running CB Generation Example ---")
def tokenize_function(examples):
return tokenizer(examples["question"])
tokenized_datasets = train_dataset.map(tokenize_function, batched=True)
simple_batch_inputs = [item["input_ids"] for item in tokenized_datasets]
start_time_simple = time.time()
# model.forward = torch.compile(model.forward, mode="max-autotune-no-cudagraphs", fullgraph=True)
batch_outputs = model.generate_batch(
inputs=simple_batch_inputs,
generation_config=generation_config,
)
end_time_simple = time.time()
for request in batch_outputs:
input_text = tokenizer.decode(batch_outputs[request].prompt_ids, skip_special_tokens=False)
try:
output_text = tokenizer.decode(batch_outputs[request].generated_tokens, skip_special_tokens=False)
except Exception as e:
print(f"Decoding failed for request {request}: {e}")
output_text = tokenizer.decode(batch_outputs[request].generated_tokens[1:], skip_special_tokens=False)
if len(output_text) > 0:
print("-" * 20)
print(f"{request} Input: {input_text}")
print(f"{request} Output: {output_text}")
else:
print("", end="\r\r\r\r")
print("-" * 20)
print("--- Finished CB Generation Example ---\n\n")
print(f"CB generation took: {end_time_simple - start_time_simple:.2f} seconds")
``` | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38085/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 2,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38085/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38084 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38084/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38084/comments | https://api.github.com/repos/huggingface/transformers/issues/38084/events | https://github.com/huggingface/transformers/pull/38084 | 3,057,253,233 | PR_kwDOCUB6oc6V2n5o | 38,084 | fix multi-image case for llava-onevision | {
"login": "cyr0930",
"id": 14088169,
"node_id": "MDQ6VXNlcjE0MDg4MTY5",
"avatar_url": "https://avatars.githubusercontent.com/u/14088169?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyr0930",
"html_url": "https://github.com/cyr0930",
"followers_url": "https://api.github.com/users/cyr0930/followers",
"following_url": "https://api.github.com/users/cyr0930/following{/other_user}",
"gists_url": "https://api.github.com/users/cyr0930/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyr0930/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyr0930/subscriptions",
"organizations_url": "https://api.github.com/users/cyr0930/orgs",
"repos_url": "https://api.github.com/users/cyr0930/repos",
"events_url": "https://api.github.com/users/cyr0930/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyr0930/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T14:51:41 | 2025-05-21T09:52:54 | 2025-05-21T09:50:46 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38084",
"html_url": "https://github.com/huggingface/transformers/pull/38084",
"diff_url": "https://github.com/huggingface/transformers/pull/38084.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38084.patch",
"merged_at": "2025-05-21T09:50:46"
} | # What does this PR do?
llava-onevision should not use anyres patching for multi-image case
Fixes https://github.com/huggingface/transformers/issues/34585
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
@zucchini-nlp
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38084/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38084/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38083 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38083/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38083/comments | https://api.github.com/repos/huggingface/transformers/issues/38083/events | https://github.com/huggingface/transformers/pull/38083 | 3,056,985,228 | PR_kwDOCUB6oc6V1tJw | 38,083 | uninstall `kernels` from docker images | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T13:28:44 | 2025-05-12T16:03:48 | 2025-05-12T16:03:47 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38083",
"html_url": "https://github.com/huggingface/transformers/pull/38083",
"diff_url": "https://github.com/huggingface/transformers/pull/38083.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38083.patch",
"merged_at": "2025-05-12T16:03:47"
} | # What does this PR do?
`kernels` may give different outputs (within 1e-5 range) even with the same model (weights) and the same inputs.
https://huggingface.slack.com/archives/C01NE71C4F7/p1747055897244909 | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38083/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38083/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38081 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38081/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38081/comments | https://api.github.com/repos/huggingface/transformers/issues/38081/events | https://github.com/huggingface/transformers/pull/38081 | 3,056,632,055 | PR_kwDOCUB6oc6V0f-H | 38,081 | Fix mt5 test on AMD devices | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T11:27:11 | 2025-05-12T14:59:02 | 2025-05-12T14:59:00 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38081",
"html_url": "https://github.com/huggingface/transformers/pull/38081",
"diff_url": "https://github.com/huggingface/transformers/pull/38081.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38081.patch",
"merged_at": "2025-05-12T14:59:00"
} | The result of this test on AMD is 0.000103, which the test correctly catches as being larger than 0.0001, but it's close enough for the purposes of this test so I simply increased the limit to 0.0002. | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38081/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38081/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38080 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38080/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38080/comments | https://api.github.com/repos/huggingface/transformers/issues/38080/events | https://github.com/huggingface/transformers/pull/38080 | 3,056,575,600 | PR_kwDOCUB6oc6V0TpD | 38,080 | [gemma3] fix bidirectional attention mask | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T11:05:06 | 2025-05-20T15:35:05 | 2025-05-20T15:35:05 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38080",
"html_url": "https://github.com/huggingface/transformers/pull/38080",
"diff_url": "https://github.com/huggingface/transformers/pull/38080.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38080.patch",
"merged_at": "2025-05-20T15:35:05"
} | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/38053. Images cannot attend to future images but in current implementation we unmask attention within all images
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38080/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38080/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38079 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38079/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38079/comments | https://api.github.com/repos/huggingface/transformers/issues/38079/events | https://github.com/huggingface/transformers/pull/38079 | 3,056,566,834 | PR_kwDOCUB6oc6V0Rsb | 38,079 | Add AMD expectation to test_gpt2_sample | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T11:01:52 | 2025-05-12T14:51:24 | 2025-05-12T14:51:22 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38079",
"html_url": "https://github.com/huggingface/transformers/pull/38079",
"diff_url": "https://github.com/huggingface/transformers/pull/38079.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38079.patch",
"merged_at": "2025-05-12T14:51:22"
} | Fixes failing test via expectations util | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38079/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38079/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38078 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38078/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38078/comments | https://api.github.com/repos/huggingface/transformers/issues/38078/events | https://github.com/huggingface/transformers/issues/38078 | 3,056,155,847 | I_kwDOCUB6oc62KTzH | 38,078 | autoawq has been deprecated. Is it possible to support the use of llm-compresser as an alternative to autoawq | {
"login": "0O0OwO0O0",
"id": 169051867,
"node_id": "U_kgDOChOG2w",
"avatar_url": "https://avatars.githubusercontent.com/u/169051867?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/0O0OwO0O0",
"html_url": "https://github.com/0O0OwO0O0",
"followers_url": "https://api.github.com/users/0O0OwO0O0/followers",
"following_url": "https://api.github.com/users/0O0OwO0O0/following{/other_user}",
"gists_url": "https://api.github.com/users/0O0OwO0O0/gists{/gist_id}",
"starred_url": "https://api.github.com/users/0O0OwO0O0/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/0O0OwO0O0/subscriptions",
"organizations_url": "https://api.github.com/users/0O0OwO0O0/orgs",
"repos_url": "https://api.github.com/users/0O0OwO0O0/repos",
"events_url": "https://api.github.com/users/0O0OwO0O0/events{/privacy}",
"received_events_url": "https://api.github.com/users/0O0OwO0O0/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T08:45:53 | 2025-07-30T11:53:30 | 2025-06-12T12:04:36 | NONE | null | null | null | null | autoawq has been deprecated. Is it possible to support the use of llm-compresser as an alternative to autoawq, or can another alternative be provided
</br></br>
from [Autoawq](https://github.com/casper-hansen/AutoAWQ)
```markdown
Important Notice:
- AutoAWQ is officially deprecated and will no longer be maintained.
- The last tested configuration used Torch 2.6.0 and Transformers 4.51.3.
- If future versions of Transformers break AutoAWQ compatibility, please report the issue to the Transformers project.
Alternative:
- AutoAWQ has been adopted by the vLLM Project: https://github.com/vllm-project/llm-compressor
- MLX-LM now supports AWQ for Mac devices: http://github.com/ml-explore/mlx-lm
For further inquiries, feel free to reach out:
- X: https://x.com/casper_hansen_
- LinkedIn: https://www.linkedin.com/in/casper-hansen-804005170/
``` | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38078/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38078/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38077 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38077/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38077/comments | https://api.github.com/repos/huggingface/transformers/issues/38077/events | https://github.com/huggingface/transformers/pull/38077 | 3,056,010,977 | PR_kwDOCUB6oc6VyXuC | 38,077 | Cache System Refactor: Layered Architecture | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T08:00:44 | 2025-09-18T13:46:53 | 2025-09-18T13:46:53 | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38077",
"html_url": "https://github.com/huggingface/transformers/pull/38077",
"diff_url": "https://github.com/huggingface/transformers/pull/38077.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38077.patch",
"merged_at": null
} | ## Overview
This PR introduces a layered architecture for the cache system, allowing for better composition of caches.
## Current class structure
```mermaid
graph TD
B[Cache]
B --> F[StaticCache]
B --> I[HybridCache]
B --> G[DynamicCache]
F --> H[SlidingWindowCache]
F --> P[OffloadedStaticCache]
B --> M[EncoderDecoderCache]
B --> N[HybridChunkedCache]
N --> O[OffloadedHybridCache]
G --> J[QuantizedCache]
J --> Q[QuantoQuantizedCache]
J --> R[HQQQuantizedCache]
G --> K[OffloadedCache]
B --> L[MambaCache]
```
## Goals
1. Replace all existing cache implementations with layered caches
2. Enable model-specific cache configurations through layer composition
3. Improve test coverage and maintainability
4. Reduce code duplication through shared layer implementations
## New structure
- Cache
- Keeps a `layers` list of `CacheLayer` instances.
- Dynamically delegates method and attibute calls to the layers (e.g., crop(), reset(), is_compilable, etc)
- CacheLayer
- Base type for all layers
- Examples:
- `StaticLayer`
- `DynamicLayer`
- ...
- StaticCache or DynamicCache are now empty shells for BC that just define their layer type:
```py
class DynamicCache(Cache):
pattern_block = (DynamicLayer,)
```
- Offloading, quantization, etc. are pluggable CacheProcessors that can wrap any cache, static or dynamic.
## Layer patterns & method propagation
Every cache class is now defined just by a pattern_block, a tuple of CacheLayer subclasses that should repeat across depth. The base Cache instantiates layer_types = [pattern_block[i % len(pattern_block)] for i in range(config.num_layers)], so e.g. pattern_block = (StaticLayer, SlidingWindowLayer) yields an alternating Static/Sliding schedule.
Anything not found on the cache itself is forwarded automatically: if it’s an attribute, the cache returns the unique value across the first full pattern (or errors if they differ); if it’s a method, the cache builds a dispatcher that calls the method on every layer, threading a state object and respecting each layer’s return_early flag. This removes almost all boilerplate for ops like reset(), crop(), get_mask_sizes(), batch_split(), etc., and lets new layer types slot in without touching the main classes.
## Progress
### Part 1: Porting to layered classes. No new functionality, just refactor into new classes.
- [x] New base CacheLayer and (layered) Cache classes
- [x] StaticCache port
- [x] DynamicCache port and tests
- [x] Stripe functionality from DynamicCache and StaticCache into the layers, add error handling.
- [x] kill SinkCache
- [x] Define hook system for offloading, quantization without specialized classes.
- [x] OffloadedCache
- [x] SlidingWindowCache port
- [x] make cache exportable initializing correct types of layers
- [x] QuantizedCache port
- [x] EncoderDecoderCache port
- [x] HybridCache port
- [x] Replace Cache with LayeredCache
- [x] test mllama, layerskip-llama.
- [x] bring back SinkCache as custom decode on the Hub.
- [ ] test #38156 on SlidingCache
- [ ] run and fix all models tests.
- [ ] run some benchmarks to confirm no perf degrade.
### Part 2: Improvements, new incremental features
- [ ] Mark cache_position as mandatory and start deprecation cycle.
- [ ] Check if [torch.cond optimization](https://github.com/huggingface/transformers/pull/37972#discussion_r2086967000) for small sentences speeds up torch.compiled generation ([partial commit](https://github.com/huggingface/transformers/commit/901c2a47155b6fdc7c87f9d5dd6ec8b937fc744e)).
- [ ] Check if casts are needed only for GPT-J, move that to model code. See [this](https://github.com/huggingface/transformers/pull/37972#discussion_r2085597469).
- [ ] Refactor and document Llama4's ChunkedAttention and hybrid approach into layers.
### Part 3: Config based cache composition
- [ ] Design layer composition system based on configurations instead of Cache classes.
- [ ] Port Hybrid caches to use file definitions.
- [ ] Update documentation with new configuration options
## Tests review
- [x] StaticCache tests
- [x] DynamicCache tests
- [x] OffloadedCache tests
- [ ] SlidingWindowCache tests
- [ ] QuantizedCache tests
- [ ] MambaCache tests
- [ ] EncoderDecoderCache tests
- [x] HybridCache tests
### Note
The code builds on #37972
| {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38077/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38077/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38076 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38076/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38076/comments | https://api.github.com/repos/huggingface/transformers/issues/38076/events | https://github.com/huggingface/transformers/pull/38076 | 3,055,859,678 | PR_kwDOCUB6oc6Vx2sc | 38,076 | Fix temporal padding in Qwen2VLImageProcessor when the number of frames is not divisible by temporal_patch_size | {
"login": "ritwickchaudhry",
"id": 11583361,
"node_id": "MDQ6VXNlcjExNTgzMzYx",
"avatar_url": "https://avatars.githubusercontent.com/u/11583361?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ritwickchaudhry",
"html_url": "https://github.com/ritwickchaudhry",
"followers_url": "https://api.github.com/users/ritwickchaudhry/followers",
"following_url": "https://api.github.com/users/ritwickchaudhry/following{/other_user}",
"gists_url": "https://api.github.com/users/ritwickchaudhry/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ritwickchaudhry/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ritwickchaudhry/subscriptions",
"organizations_url": "https://api.github.com/users/ritwickchaudhry/orgs",
"repos_url": "https://api.github.com/users/ritwickchaudhry/repos",
"events_url": "https://api.github.com/users/ritwickchaudhry/events{/privacy}",
"received_events_url": "https://api.github.com/users/ritwickchaudhry/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T07:01:10 | 2025-05-14T10:28:22 | 2025-05-14T10:28:22 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38076",
"html_url": "https://github.com/huggingface/transformers/pull/38076",
"diff_url": "https://github.com/huggingface/transformers/pull/38076.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38076.patch",
"merged_at": "2025-05-14T10:28:22"
} | This PR fixes an issue in Qwen2VLImageProcessor where the current implementation does not correctly handle cases when the number of video frames is not divisible by `temporal_patch_size`.
### Problem:
The existing logic repeats the last frame `temporal_patch_size` - 1 times. This works correctly when `temporal_patch_size` equals 2 but fails when the size is greater.
### Solution:
The fix replaces:
```repeats = np.repeat(patches[-1][np.newaxis], temporal_patch_size - 1, axis=0)```
with:
```repeats = np.repeat(patches[-1][np.newaxis], temporal_patch_size - (patches.shape[0] % temporal_patch_size), axis=0)```
This ensures that the correct number of padding frames are added when the frame count is not divisible by the `temporal_patch_size`.
#### Additional Changes:
Added a unit test to verify the padding logic for edge cases where the number of frames is not divisible by the patch size.
#### Issue Reference:
Fixes #38003 | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38076/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38076/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38075 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38075/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38075/comments | https://api.github.com/repos/huggingface/transformers/issues/38075/events | https://github.com/huggingface/transformers/pull/38075 | 3,055,596,874 | PR_kwDOCUB6oc6Vw9lp | 38,075 | [typo] qwen2_5_vl in modeling_utils.py | {
"login": "JJJYmmm",
"id": 92386084,
"node_id": "U_kgDOBYGzJA",
"avatar_url": "https://avatars.githubusercontent.com/u/92386084?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JJJYmmm",
"html_url": "https://github.com/JJJYmmm",
"followers_url": "https://api.github.com/users/JJJYmmm/followers",
"following_url": "https://api.github.com/users/JJJYmmm/following{/other_user}",
"gists_url": "https://api.github.com/users/JJJYmmm/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JJJYmmm/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JJJYmmm/subscriptions",
"organizations_url": "https://api.github.com/users/JJJYmmm/orgs",
"repos_url": "https://api.github.com/users/JJJYmmm/repos",
"events_url": "https://api.github.com/users/JJJYmmm/events{/privacy}",
"received_events_url": "https://api.github.com/users/JJJYmmm/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T04:31:21 | 2025-05-12T09:45:52 | 2025-05-12T09:45:52 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38075",
"html_url": "https://github.com/huggingface/transformers/pull/38075",
"diff_url": "https://github.com/huggingface/transformers/pull/38075.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38075.patch",
"merged_at": null
} | # What does this PR do?
Fixes https://github.com/QwenLM/Qwen2.5-VL/issues/1195 just a small typo🤗
Related PR #37033 @zucchini-nlp | {
"login": "JJJYmmm",
"id": 92386084,
"node_id": "U_kgDOBYGzJA",
"avatar_url": "https://avatars.githubusercontent.com/u/92386084?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JJJYmmm",
"html_url": "https://github.com/JJJYmmm",
"followers_url": "https://api.github.com/users/JJJYmmm/followers",
"following_url": "https://api.github.com/users/JJJYmmm/following{/other_user}",
"gists_url": "https://api.github.com/users/JJJYmmm/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JJJYmmm/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JJJYmmm/subscriptions",
"organizations_url": "https://api.github.com/users/JJJYmmm/orgs",
"repos_url": "https://api.github.com/users/JJJYmmm/repos",
"events_url": "https://api.github.com/users/JJJYmmm/events{/privacy}",
"received_events_url": "https://api.github.com/users/JJJYmmm/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38075/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38075/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38074 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38074/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38074/comments | https://api.github.com/repos/huggingface/transformers/issues/38074/events | https://github.com/huggingface/transformers/pull/38074 | 3,055,515,019 | PR_kwDOCUB6oc6Vwr6N | 38,074 | Fix description and formatting errors in code docs | {
"login": "bilibili12433014",
"id": 73748897,
"node_id": "MDQ6VXNlcjczNzQ4ODk3",
"avatar_url": "https://avatars.githubusercontent.com/u/73748897?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bilibili12433014",
"html_url": "https://github.com/bilibili12433014",
"followers_url": "https://api.github.com/users/bilibili12433014/followers",
"following_url": "https://api.github.com/users/bilibili12433014/following{/other_user}",
"gists_url": "https://api.github.com/users/bilibili12433014/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bilibili12433014/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bilibili12433014/subscriptions",
"organizations_url": "https://api.github.com/users/bilibili12433014/orgs",
"repos_url": "https://api.github.com/users/bilibili12433014/repos",
"events_url": "https://api.github.com/users/bilibili12433014/events{/privacy}",
"received_events_url": "https://api.github.com/users/bilibili12433014/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T03:19:18 | 2025-05-13T17:17:16 | 2025-05-13T17:17:16 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38074",
"html_url": "https://github.com/huggingface/transformers/pull/38074",
"diff_url": "https://github.com/huggingface/transformers/pull/38074.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38074.patch",
"merged_at": "2025-05-13T17:17:16"
} | # What does this PR do?
Fix description and formatting errors:
1. Fix: Two `True` exists in one conditional statement.
2. Removed line breaks to comply with the doc guideline: 'No need to indent further for the elements building the return.'
Origin:
```
Return:
`torch.BoolTensor`. (`torch.BoolTensor` of shape `(batch_size, 1)`), where `True` indicates we stop generation
for a particular row, `True` indicates we should continue.
```
Fix:
```
Return:
`torch.BoolTensor`. (`torch.BoolTensor` of shape `(batch_size, 1)`), where `True` indicates we stop generation for a particular row, `False` indicates we should continue.
```
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@stevhliu | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38074/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38074/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38073 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38073/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38073/comments | https://api.github.com/repos/huggingface/transformers/issues/38073/events | https://github.com/huggingface/transformers/pull/38073 | 3,055,432,647 | PR_kwDOCUB6oc6VwaPt | 38,073 | add timeout for downloading the `librispeech_asr` dataset | {
"login": "faaany",
"id": 24477841,
"node_id": "MDQ6VXNlcjI0NDc3ODQx",
"avatar_url": "https://avatars.githubusercontent.com/u/24477841?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/faaany",
"html_url": "https://github.com/faaany",
"followers_url": "https://api.github.com/users/faaany/followers",
"following_url": "https://api.github.com/users/faaany/following{/other_user}",
"gists_url": "https://api.github.com/users/faaany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/faaany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/faaany/subscriptions",
"organizations_url": "https://api.github.com/users/faaany/orgs",
"repos_url": "https://api.github.com/users/faaany/repos",
"events_url": "https://api.github.com/users/faaany/events{/privacy}",
"received_events_url": "https://api.github.com/users/faaany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-12T02:06:06 | 2025-05-13T10:50:13 | 2025-05-13T10:50:13 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38073",
"html_url": "https://github.com/huggingface/transformers/pull/38073",
"diff_url": "https://github.com/huggingface/transformers/pull/38073.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38073.patch",
"merged_at": "2025-05-13T10:50:13"
} | ## What does this PR do?
When running the `run_wav2vec2_pretraining_no_trainer.py` example, I got `fsspec.exceptions.FSTimeoutError`.
The reason is that the default timeout (5 * 60s) for the `aiohttp` library used internally by fsspec is too short for large files, e.g. 5.9G.
I also found an issue discussed in the dataset library, which resolved the issue by the same approach: https://github.com/huggingface/datasets/issues/7164.
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38073/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38073/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38072 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38072/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38072/comments | https://api.github.com/repos/huggingface/transformers/issues/38072/events | https://github.com/huggingface/transformers/pull/38072 | 3,055,248,292 | PR_kwDOCUB6oc6Vv0dz | 38,072 | Updated the Model docs - for the ALIGN model | {
"login": "1himan",
"id": 140396762,
"node_id": "U_kgDOCF5I2g",
"avatar_url": "https://avatars.githubusercontent.com/u/140396762?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/1himan",
"html_url": "https://github.com/1himan",
"followers_url": "https://api.github.com/users/1himan/followers",
"following_url": "https://api.github.com/users/1himan/following{/other_user}",
"gists_url": "https://api.github.com/users/1himan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/1himan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/1himan/subscriptions",
"organizations_url": "https://api.github.com/users/1himan/orgs",
"repos_url": "https://api.github.com/users/1himan/repos",
"events_url": "https://api.github.com/users/1himan/events{/privacy}",
"received_events_url": "https://api.github.com/users/1himan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-11T22:04:43 | 2025-05-29T09:13:30 | 2025-05-28T16:19:09 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38072",
"html_url": "https://github.com/huggingface/transformers/pull/38072",
"diff_url": "https://github.com/huggingface/transformers/pull/38072.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38072.patch",
"merged_at": "2025-05-28T16:19:09"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38072/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38072/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38071 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38071/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38071/comments | https://api.github.com/repos/huggingface/transformers/issues/38071/events | https://github.com/huggingface/transformers/issues/38071 | 3,055,176,870 | I_kwDOCUB6oc62Gkym | 38,071 | transformers showing decoder model architecture detected so padding should be left | {
"login": "sleepingcat4",
"id": 81933585,
"node_id": "MDQ6VXNlcjgxOTMzNTg1",
"avatar_url": "https://avatars.githubusercontent.com/u/81933585?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sleepingcat4",
"html_url": "https://github.com/sleepingcat4",
"followers_url": "https://api.github.com/users/sleepingcat4/followers",
"following_url": "https://api.github.com/users/sleepingcat4/following{/other_user}",
"gists_url": "https://api.github.com/users/sleepingcat4/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sleepingcat4/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sleepingcat4/subscriptions",
"organizations_url": "https://api.github.com/users/sleepingcat4/orgs",
"repos_url": "https://api.github.com/users/sleepingcat4/repos",
"events_url": "https://api.github.com/users/sleepingcat4/events{/privacy}",
"received_events_url": "https://api.github.com/users/sleepingcat4/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-11T19:45:23 | 2025-07-09T08:02:51 | 2025-07-09T08:02:51 | NONE | null | null | null | null | ### System Info
I had implemented some batching both using pipeline and model.generate method and for Qwen3 and Llama4 models it shows decoder only model detected so padding should mentioned in left. Which is weird as I inspected the model output and it was totally fine.
However when I mentioned padding to be left, Qwen3 class returned an error. it is very weird. Can this waring/bug be fixed?
### Who can help?
@ArthurZucker
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
To reproduce please use my below code:
```
def load_pipeline(model_name: str):
pipe = pipeline(
"text-generation",
model=model_name,
model_kwargs={"torch_dtype": torch.bfloat16, "attn_implementation": "flash_attention_2"},
device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
return pipe, tokenizer
system_prompt = (
"Think step by step to answer the following question.\n"
"Return the answer for the MC (Multiple Choice) question at the end of the response after a separator #####.\n"
)
def zero_eval(df, pipe, tokenizer, agieval=False, batch_size=4):
full_prompts = []
for _, row in tqdm(df.iterrows(), total=len(df)):
question = row["question"]
if agieval:
choices_str = row["answer"]
else:
choices = eval(row["options"])
choices_str = "\n".join([f"({chr(65+i)}) {c}" for i, c in enumerate(choices)])
user_prompt = f"---\nQ: {question}\nChoices:\n{choices_str}\nA:"
full_prompts.append(user_prompt)
batch_messages = [[
{"role": "system", "content": system_prompt},
{"role": "user", "content": prompt}
] for prompt in full_prompts]
prompts = [
tokenizer.apply_chat_template(msg, tokenize=False, enable_thinking=False, add_generation_prompt=True)
for msg in batch_messages
]
outputs = []
for i in tqdm(range(0, len(prompts), batch_size)):
batch = prompts[i:i+batch_size]
result = pipe(batch, batch_size=batch_size, max_new_tokens=2035)
outputs.extend([r[0]["generated_text"] for r in result])
df["full_prompt"] = full_prompts
df["model_output"] = outputs
model_name = pipe.model.config.name_or_path.split("/")[-1].replace(":", "_")
timestamp = datetime.now().strftime("%H-%M_%Y-%m-%d")
folder_name = model_name
os.makedirs(folder_name, exist_ok=True)
file_name = f"{model_name}_{timestamp}.csv"
file_path = os.path.join(folder_name, file_name)
df.to_csv(file_path, index=False)
```
### Expected behavior
When you will run the code, the warning will be shown for Qwen3 models and Llama4 models. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38071/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38071/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38070 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38070/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38070/comments | https://api.github.com/repos/huggingface/transformers/issues/38070/events | https://github.com/huggingface/transformers/issues/38070 | 3,055,052,194 | I_kwDOCUB6oc62GGWi | 38,070 | Typo in modeling_utils.py causing checkpoint loading error with Qwen2.5-VL | {
"login": "tanghme0w",
"id": 67302932,
"node_id": "MDQ6VXNlcjY3MzAyOTMy",
"avatar_url": "https://avatars.githubusercontent.com/u/67302932?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tanghme0w",
"html_url": "https://github.com/tanghme0w",
"followers_url": "https://api.github.com/users/tanghme0w/followers",
"following_url": "https://api.github.com/users/tanghme0w/following{/other_user}",
"gists_url": "https://api.github.com/users/tanghme0w/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tanghme0w/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tanghme0w/subscriptions",
"organizations_url": "https://api.github.com/users/tanghme0w/orgs",
"repos_url": "https://api.github.com/users/tanghme0w/repos",
"events_url": "https://api.github.com/users/tanghme0w/events{/privacy}",
"received_events_url": "https://api.github.com/users/tanghme0w/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-11T15:24:42 | 2025-05-12T10:14:05 | 2025-05-12T10:14:05 | NONE | null | null | null | null | ### System Info
Version: 4.52.0.dev0
In modeling_utils.py, line 236, qwen2_5_vl is misspelled as qwem2_5_vl
<img width="487" alt="Image" src="https://github.com/user-attachments/assets/9a0a6c2b-fc6f-4c17-9a5c-926c67cea597" />
This will cause the load of _checkpoint_conversion_mapping in line 4113 being skipped and thereby causing param names mismatch during checkpoint load.
<img width="864" alt="Image" src="https://github.com/user-attachments/assets/704a26b2-a035-48ac-9ac7-58a034cd5bc9" />
### Reproduction
simply run the from_pretrained method of Qwen2_5_VLForConditionalGeneration to reproduce.
···
from transformers import Qwen2_5_VLForConditionalGeneration
model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
"Qwen/Qwen2.5-VL-32B-Instruct", torch_dtype="auto", device_map="auto"
)
···
### Expected behavior
Will report missing parameters and the model will be initialized with random weights. | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38070/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38070/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38069 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38069/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38069/comments | https://api.github.com/repos/huggingface/transformers/issues/38069/events | https://github.com/huggingface/transformers/pull/38069 | 3,054,917,772 | PR_kwDOCUB6oc6Vuzm5 | 38,069 | fix the inconsist docstring in apply_chat_template | {
"login": "lenijwp",
"id": 32331828,
"node_id": "MDQ6VXNlcjMyMzMxODI4",
"avatar_url": "https://avatars.githubusercontent.com/u/32331828?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lenijwp",
"html_url": "https://github.com/lenijwp",
"followers_url": "https://api.github.com/users/lenijwp/followers",
"following_url": "https://api.github.com/users/lenijwp/following{/other_user}",
"gists_url": "https://api.github.com/users/lenijwp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lenijwp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lenijwp/subscriptions",
"organizations_url": "https://api.github.com/users/lenijwp/orgs",
"repos_url": "https://api.github.com/users/lenijwp/repos",
"events_url": "https://api.github.com/users/lenijwp/events{/privacy}",
"received_events_url": "https://api.github.com/users/lenijwp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-11T11:05:51 | 2025-05-12T15:32:02 | 2025-05-12T15:32:02 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38069",
"html_url": "https://github.com/huggingface/transformers/pull/38069",
"diff_url": "https://github.com/huggingface/transformers/pull/38069.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38069.patch",
"merged_at": "2025-05-12T15:32:01"
} |
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
The commit (https://github.com/huggingface/transformers/commit/5cf11e5ab9591652ee025069658f9af5a98e455e) fixed the type hints for the parameter `tools` in apply_chat_template, but the docstring was not changed.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38069/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38069/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38068 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38068/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38068/comments | https://api.github.com/repos/huggingface/transformers/issues/38068/events | https://github.com/huggingface/transformers/pull/38068 | 3,054,764,568 | PR_kwDOCUB6oc6VuUxw | 38,068 | Confidenet | {
"login": "Onkarsus13",
"id": 46415298,
"node_id": "MDQ6VXNlcjQ2NDE1Mjk4",
"avatar_url": "https://avatars.githubusercontent.com/u/46415298?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Onkarsus13",
"html_url": "https://github.com/Onkarsus13",
"followers_url": "https://api.github.com/users/Onkarsus13/followers",
"following_url": "https://api.github.com/users/Onkarsus13/following{/other_user}",
"gists_url": "https://api.github.com/users/Onkarsus13/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Onkarsus13/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Onkarsus13/subscriptions",
"organizations_url": "https://api.github.com/users/Onkarsus13/orgs",
"repos_url": "https://api.github.com/users/Onkarsus13/repos",
"events_url": "https://api.github.com/users/Onkarsus13/events{/privacy}",
"received_events_url": "https://api.github.com/users/Onkarsus13/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-11T06:21:37 | 2025-05-11T06:23:44 | 2025-05-11T06:23:44 | NONE | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38068",
"html_url": "https://github.com/huggingface/transformers/pull/38068",
"diff_url": "https://github.com/huggingface/transformers/pull/38068.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38068.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Onkarsus13",
"id": 46415298,
"node_id": "MDQ6VXNlcjQ2NDE1Mjk4",
"avatar_url": "https://avatars.githubusercontent.com/u/46415298?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Onkarsus13",
"html_url": "https://github.com/Onkarsus13",
"followers_url": "https://api.github.com/users/Onkarsus13/followers",
"following_url": "https://api.github.com/users/Onkarsus13/following{/other_user}",
"gists_url": "https://api.github.com/users/Onkarsus13/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Onkarsus13/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Onkarsus13/subscriptions",
"organizations_url": "https://api.github.com/users/Onkarsus13/orgs",
"repos_url": "https://api.github.com/users/Onkarsus13/repos",
"events_url": "https://api.github.com/users/Onkarsus13/events{/privacy}",
"received_events_url": "https://api.github.com/users/Onkarsus13/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38068/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38068/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38067 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38067/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38067/comments | https://api.github.com/repos/huggingface/transformers/issues/38067/events | https://github.com/huggingface/transformers/pull/38067 | 3,054,657,577 | PR_kwDOCUB6oc6Vt-rM | 38,067 | Fix bug in prefill_chunk_size that ignores disable_compile flag | {
"login": "xmarva",
"id": 137751976,
"node_id": "U_kgDOCDXtqA",
"avatar_url": "https://avatars.githubusercontent.com/u/137751976?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xmarva",
"html_url": "https://github.com/xmarva",
"followers_url": "https://api.github.com/users/xmarva/followers",
"following_url": "https://api.github.com/users/xmarva/following{/other_user}",
"gists_url": "https://api.github.com/users/xmarva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xmarva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xmarva/subscriptions",
"organizations_url": "https://api.github.com/users/xmarva/orgs",
"repos_url": "https://api.github.com/users/xmarva/repos",
"events_url": "https://api.github.com/users/xmarva/events{/privacy}",
"received_events_url": "https://api.github.com/users/xmarva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-11T02:36:38 | 2025-05-13T13:23:54 | 2025-05-13T13:23:23 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38067",
"html_url": "https://github.com/huggingface/transformers/pull/38067",
"diff_url": "https://github.com/huggingface/transformers/pull/38067.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38067.patch",
"merged_at": "2025-05-13T13:23:23"
} | This PR fixes a bug in the `prefill_chunking` function where the compilation check is not performed before calling `get_compiled_call()`.
When using `prefill_chunk_size > 0` in a `GenerationConfig`, the model's forward function is always compiled, even if `disable_compile=True` is specified. This happens because the `prefill_chunking` function directly calls `get_compiled_call()` without checking if compilation should occur:
```python
model_forward = self.get_compiled_call(generation_config.compile_config)
```
Modified to:
```python
compile_forward = self._valid_auto_compile_criteria(model_kwargs, generation_config)
if compile_forward:
model_forward = self.get_compiled_call(generation_config.compile_config)
```
This ensures that models aren't compiled when `disable_compile=True is set`.
Also fixed a typo in the error message ("chunkink" -> "chunking"), lol
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the contributor guideline, Pull Request section?
_Yes, but this is still my first PR here, hope it's ok_
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)?
- [x] Did you make sure to update the documentation with your changes?
_As far as I understand it, there's no need to change the documentation_
- [ ] Did you write any new necessary tests?
_No, tested with a simple generation script using prefill_chunk_size=8 and disable_compile=True. Before the fix, torch.compile was being called despite disable_compile=True. After the fix, no compilation occurs as expected._
## Who can review?
@gante
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38067/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38067/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38066 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38066/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38066/comments | https://api.github.com/repos/huggingface/transformers/issues/38066/events | https://github.com/huggingface/transformers/issues/38066 | 3,054,509,355 | I_kwDOCUB6oc62EB0r | 38,066 | `AutoModel.from_pretrained(...)` (with explicit `device_map` unset) fails under `with torch.device("meta")` with PyTorch 2.6.0 and 2.7.0 | {
"login": "vadimkantorov",
"id": 1041752,
"node_id": "MDQ6VXNlcjEwNDE3NTI=",
"avatar_url": "https://avatars.githubusercontent.com/u/1041752?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vadimkantorov",
"html_url": "https://github.com/vadimkantorov",
"followers_url": "https://api.github.com/users/vadimkantorov/followers",
"following_url": "https://api.github.com/users/vadimkantorov/following{/other_user}",
"gists_url": "https://api.github.com/users/vadimkantorov/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vadimkantorov/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vadimkantorov/subscriptions",
"organizations_url": "https://api.github.com/users/vadimkantorov/orgs",
"repos_url": "https://api.github.com/users/vadimkantorov/repos",
"events_url": "https://api.github.com/users/vadimkantorov/events{/privacy}",
"received_events_url": "https://api.github.com/users/vadimkantorov/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-10T20:35:19 | 2025-08-18T14:15:32 | 2025-07-12T08:03:15 | NONE | null | null | null | null | ```python
# from torch.nn.attention.flex_attention import BlockMask, flex_attention
from transformers import AutoModel
import torch
with torch.device('meta'):
AutoModel.from_pretrained('Qwen/Qwen2.5-0.5B', trust_remote_code=True)
````
I found this code in the wild in https://github.com/Open-Reasoner-Zero/Open-Reasoner-Zero/blob/f6d1ec77ce2ce18f3d925a1014c9e4d6b4ad3072/orz/ppo/actors.py#L745-L746 (linked issue https://github.com/Open-Reasoner-Zero/Open-Reasoner-Zero/issues/71)
fails with:
```
Sliding Window Attention is enabled but not implemented for `sdpa`; unexpected results may be encountered.
---------------------------------------------------------------------------
NotImplementedError Traceback (most recent call last)
[<ipython-input-1-00ba4c43be18>](https://localhost:8080/#) in <cell line: 0>()
4
5 with torch.device('meta'):
----> 6 AutoModel.from_pretrained('Qwen/Qwen2.5-0.5B', trust_remote_code=True)
6 frames
[/usr/local/lib/python3.11/dist-packages/transformers/models/auto/auto_factory.py](https://localhost:8080/#) in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
569 if model_class.config_class == config.sub_configs.get("text_config", None):
570 config = config.get_text_config()
--> 571 return model_class.from_pretrained(
572 pretrained_model_name_or_path, *model_args, config=config, **hub_kwargs, **kwargs
573 )
[/usr/local/lib/python3.11/dist-packages/transformers/modeling_utils.py](https://localhost:8080/#) in _wrapper(*args, **kwargs)
277 old_dtype = torch.get_default_dtype()
278 try:
--> 279 return func(*args, **kwargs)
280 finally:
281 torch.set_default_dtype(old_dtype)
[/usr/local/lib/python3.11/dist-packages/transformers/modeling_utils.py](https://localhost:8080/#) in from_pretrained(cls, pretrained_model_name_or_path, config, cache_dir, ignore_mismatched_sizes, force_download, local_files_only, token, revision, use_safetensors, weights_only, *model_args, **kwargs)
4397 offload_index,
4398 error_msgs,
-> 4399 ) = cls._load_pretrained_model(
4400 model,
4401 state_dict,
[/usr/local/lib/python3.11/dist-packages/transformers/modeling_utils.py](https://localhost:8080/#) in _load_pretrained_model(cls, model, state_dict, checkpoint_files, pretrained_model_name_or_path, ignore_mismatched_sizes, sharded_metadata, device_map, disk_offload_folder, offload_state_dict, dtype, hf_quantizer, keep_in_fp32_regex, device_mesh, key_mapping, weights_only)
4831 # Skip it with fsdp on ranks other than 0
4832 elif not (is_fsdp_enabled() and not is_local_dist_rank_0() and not is_quantized):
-> 4833 disk_offload_index, cpu_offload_index = _load_state_dict_into_meta_model(
4834 model_to_load,
4835 state_dict,
[/usr/local/lib/python3.11/dist-packages/torch/utils/_contextlib.py](https://localhost:8080/#) in decorate_context(*args, **kwargs)
114 def decorate_context(*args, **kwargs):
115 with ctx_factory():
--> 116 return func(*args, **kwargs)
117
118 return decorate_context
[/usr/local/lib/python3.11/dist-packages/transformers/modeling_utils.py](https://localhost:8080/#) in _load_state_dict_into_meta_model(model, state_dict, shard_file, expected_keys, reverse_renaming_mapping, device_map, disk_offload_folder, disk_offload_index, cpu_offload_folder, cpu_offload_index, hf_quantizer, is_safetensors, keep_in_fp32_regex, unexpected_keys, device_mesh)
822 param_device = "cpu" if is_local_dist_rank_0() else "meta"
823
--> 824 _load_parameter_into_model(model, param_name, param.to(param_device))
825
826 else:
[/usr/local/lib/python3.11/dist-packages/torch/utils/_device.py](https://localhost:8080/#) in __torch_function__(self, func, types, args, kwargs)
102 if func in _device_constructors() and kwargs.get('device') is None:
103 kwargs['device'] = self.device
--> 104 return func(*args, **kwargs)
105
106 # NB: This is directly called from C++ in torch/csrc/Device.cpp
NotImplementedError: Cannot copy out of meta tensor; no data!
```
Also, unless uncommenting the first line, it also fails on 2.6.0 with `RuntimeError: Tensor.item() cannot be called on meta tensors`:
- https://github.com/pytorch/pytorch/issues/153330 | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38066/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38066/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38065 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38065/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38065/comments | https://api.github.com/repos/huggingface/transformers/issues/38065/events | https://github.com/huggingface/transformers/issues/38065 | 3,054,490,539 | I_kwDOCUB6oc62D9Or | 38,065 | [minor] Protect against broken/missing torchvision installations and do not hard-fail at timm/torchvision import (many text models don't need any timm/torchvision as hard dependencies) | {
"login": "vadimkantorov",
"id": 1041752,
"node_id": "MDQ6VXNlcjEwNDE3NTI=",
"avatar_url": "https://avatars.githubusercontent.com/u/1041752?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vadimkantorov",
"html_url": "https://github.com/vadimkantorov",
"followers_url": "https://api.github.com/users/vadimkantorov/followers",
"following_url": "https://api.github.com/users/vadimkantorov/following{/other_user}",
"gists_url": "https://api.github.com/users/vadimkantorov/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vadimkantorov/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vadimkantorov/subscriptions",
"organizations_url": "https://api.github.com/users/vadimkantorov/orgs",
"repos_url": "https://api.github.com/users/vadimkantorov/repos",
"events_url": "https://api.github.com/users/vadimkantorov/events{/privacy}",
"received_events_url": "https://api.github.com/users/vadimkantorov/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-10T19:48:25 | 2025-06-16T11:29:57 | 2025-06-10T08:12:29 | NONE | null | null | null | null | Also, transformers import fails when no torchvision is installed at all. I think it should be no-error, especially if I'm working with text models only. timm/friends should not be imported at all...
### System Info
Google Colab, uses pytorch 2.6.0 for now
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Go to colab which currently still has pytorch 2.6.0, then uninstall pytorch and install new pytorch (without torchvision):
```
!pip uninstall torch -y
!pip install torch --index-url https://download.pytorch.org/whl/cpu
`from transformers import AutoModel; AutoModel.from_pretrained('Qwen/Qwen2.5-0.5B', trust_remote_code=True)`
```
Currently (nothing to do with vision) fails when the setup has broken installation of torchvision (can happen if the torchvision isn't actually used and has a version not matching torch)
```
import transformers.models.timm_wrapper.configuration_timm_wrapper because of the following error (look up to see its traceback):
operator torchvision::nms does not exist
```
```
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
[/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py](https://localhost:8080/#) in _get_module(self, module_name)
1966 try:
-> 1967 return importlib.import_module("." + module_name, self.__name__)
1968 except Exception as e:
24 frames
[/usr/lib/python3.11/importlib/__init__.py](https://localhost:8080/#) in import_module(name, package)
125 level += 1
--> 126 return _bootstrap._gcd_import(name[level:], package, level)
127
/usr/lib/python3.11/importlib/_bootstrap.py in _gcd_import(name, package, level)
/usr/lib/python3.11/importlib/_bootstrap.py in _find_and_load(name, import_)
/usr/lib/python3.11/importlib/_bootstrap.py in _find_and_load_unlocked(name, import_)
/usr/lib/python3.11/importlib/_bootstrap.py in _load_unlocked(spec)
/usr/lib/python3.11/importlib/_bootstrap_external.py in exec_module(self, module)
/usr/lib/python3.11/importlib/_bootstrap.py in _call_with_frames_removed(f, *args, **kwds)
[/usr/local/lib/python3.11/dist-packages/transformers/models/timm_wrapper/configuration_timm_wrapper.py](https://localhost:8080/#) in <module>
24 if is_timm_available():
---> 25 from timm.data import ImageNetInfo, infer_imagenet_subset
26
[/usr/local/lib/python3.11/dist-packages/timm/__init__.py](https://localhost:8080/#) in <module>
1 from .version import __version__ as __version__
----> 2 from .layers import (
3 is_scriptable as is_scriptable,
[/usr/local/lib/python3.11/dist-packages/timm/layers/__init__.py](https://localhost:8080/#) in <module>
7 from .blur_pool import BlurPool2d, create_aa
----> 8 from .classifier import create_classifier, ClassifierHead, NormMlpClassifierHead, ClNormMlpClassifierHead
9 from .cond_conv2d import CondConv2d, get_condconv_initializer
[/usr/local/lib/python3.11/dist-packages/timm/layers/classifier.py](https://localhost:8080/#) in <module>
14 from .create_act import get_act_layer
---> 15 from .create_norm import get_norm_layer
16
[/usr/local/lib/python3.11/dist-packages/timm/layers/create_norm.py](https://localhost:8080/#) in <module>
13 from .norm import GroupNorm, GroupNorm1, LayerNorm, LayerNorm2d, RmsNorm, RmsNorm2d, SimpleNorm, SimpleNorm2d
---> 14 from torchvision.ops.misc import FrozenBatchNorm2d
15
[/usr/local/lib/python3.11/dist-packages/torchvision/__init__.py](https://localhost:8080/#) in <module>
9 from .extension import _HAS_OPS # usort:skip
---> 10 from torchvision import _meta_registrations, datasets, io, models, ops, transforms, utils # usort:skip
11
[/usr/local/lib/python3.11/dist-packages/torchvision/_meta_registrations.py](https://localhost:8080/#) in <module>
162
--> 163 @torch.library.register_fake("torchvision::nms")
164 def meta_nms(dets, scores, iou_threshold):
[/usr/local/lib/python3.11/dist-packages/torch/library.py](https://localhost:8080/#) in register(func)
1022 use_lib = lib
-> 1023 use_lib._register_fake(op_name, func, _stacklevel=stacklevel + 1)
1024 return func
[/usr/local/lib/python3.11/dist-packages/torch/library.py](https://localhost:8080/#) in _register_fake(self, op_name, fn, _stacklevel)
213
--> 214 handle = entry.fake_impl.register(func_to_register, source)
215 self._registration_handles.append(handle)
[/usr/local/lib/python3.11/dist-packages/torch/_library/fake_impl.py](https://localhost:8080/#) in register(self, func, source)
30 )
---> 31 if torch._C._dispatch_has_kernel_for_dispatch_key(self.qualname, "Meta"):
32 raise RuntimeError(
RuntimeError: operator torchvision::nms does not exist
The above exception was the direct cause of the following exception:
RuntimeError Traceback (most recent call last)
[<ipython-input-6-8843fc8bd8f0>](https://localhost:8080/#) in <cell line: 0>()
4 with torch.device('meta'):
5 #from torch.nn.attention.flex_attention import BlockMask, flex_attention
----> 6 AutoModel.from_pretrained('Qwen/Qwen2.5-0.5B', trust_remote_code=True)
[/usr/local/lib/python3.11/dist-packages/transformers/models/auto/auto_factory.py](https://localhost:8080/#) in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
546
547 has_remote_code = hasattr(config, "auto_map") and cls.__name__ in config.auto_map
--> 548 has_local_code = type(config) in cls._model_mapping.keys()
549 trust_remote_code = resolve_trust_remote_code(
550 trust_remote_code, pretrained_model_name_or_path, has_local_code, has_remote_code
[/usr/local/lib/python3.11/dist-packages/transformers/models/auto/auto_factory.py](https://localhost:8080/#) in keys(self)
785
786 def keys(self):
--> 787 mapping_keys = [
788 self._load_attr_from_module(key, name)
789 for key, name in self._config_mapping.items()
[/usr/local/lib/python3.11/dist-packages/transformers/models/auto/auto_factory.py](https://localhost:8080/#) in <listcomp>(.0)
786 def keys(self):
787 mapping_keys = [
--> 788 self._load_attr_from_module(key, name)
789 for key, name in self._config_mapping.items()
790 if key in self._model_mapping.keys()
[/usr/local/lib/python3.11/dist-packages/transformers/models/auto/auto_factory.py](https://localhost:8080/#) in _load_attr_from_module(self, model_type, attr)
782 if module_name not in self._modules:
783 self._modules[module_name] = importlib.import_module(f".{module_name}", "transformers.models")
--> 784 return getattribute_from_module(self._modules[module_name], attr)
785
786 def keys(self):
[/usr/local/lib/python3.11/dist-packages/transformers/models/auto/auto_factory.py](https://localhost:8080/#) in getattribute_from_module(module, attr)
698 if isinstance(attr, tuple):
699 return tuple(getattribute_from_module(module, a) for a in attr)
--> 700 if hasattr(module, attr):
701 return getattr(module, attr)
702 # Some of the mappings have entries model_type -> object of another model type. In that case we try to grab the
[/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py](https://localhost:8080/#) in __getattr__(self, name)
1953 value = Placeholder
1954 elif name in self._class_to_module.keys():
-> 1955 module = self._get_module(self._class_to_module[name])
1956 value = getattr(module, name)
1957 elif name in self._modules:
[/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py](https://localhost:8080/#) in _get_module(self, module_name)
1967 return importlib.import_module("." + module_name, self.__name__)
1968 except Exception as e:
-> 1969 raise RuntimeError(
1970 f"Failed to import {self.__name__}.{module_name} because of the following error (look up to see its"
1971 f" traceback):\n{e}"
RuntimeError: Failed to import transformers.models.timm_wrapper.configuration_timm_wrapper because of the following error (look up to see its traceback):
operator torchvision::nms does not exist
```
### Expected behavior
no hard-fail at import time, maybe a hard-fail when these ops/modules from torchvision are actually needed at runtime | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38065/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38065/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38064 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38064/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38064/comments | https://api.github.com/repos/huggingface/transformers/issues/38064/events | https://github.com/huggingface/transformers/pull/38064 | 3,054,476,222 | PR_kwDOCUB6oc6Vta9d | 38,064 | Added scores in the streamer classes based on generation flag | {
"login": "LuisCarlos-104171",
"id": 101110269,
"node_id": "U_kgDOBgbR_Q",
"avatar_url": "https://avatars.githubusercontent.com/u/101110269?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LuisCarlos-104171",
"html_url": "https://github.com/LuisCarlos-104171",
"followers_url": "https://api.github.com/users/LuisCarlos-104171/followers",
"following_url": "https://api.github.com/users/LuisCarlos-104171/following{/other_user}",
"gists_url": "https://api.github.com/users/LuisCarlos-104171/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LuisCarlos-104171/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LuisCarlos-104171/subscriptions",
"organizations_url": "https://api.github.com/users/LuisCarlos-104171/orgs",
"repos_url": "https://api.github.com/users/LuisCarlos-104171/repos",
"events_url": "https://api.github.com/users/LuisCarlos-104171/events{/privacy}",
"received_events_url": "https://api.github.com/users/LuisCarlos-104171/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-05-10T19:12:41 | 2025-05-13T10:47:51 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38064",
"html_url": "https://github.com/huggingface/transformers/pull/38064",
"diff_url": "https://github.com/huggingface/transformers/pull/38064.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38064.patch",
"merged_at": null
} | This PR works on a personal issue I had when creating a JSON formatter.
I needed to know the tokens' scores as soon as they were ready; as such, I created 2 new classes that inherit BaseStreamer to support Immediate token output and forwarded the scores from each generate function into the streamer put method.
## Before submitting
- [ x ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ x ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ x ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ x ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ x ] Did you write any new necessary tests? | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38064/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38064/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38063 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38063/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38063/comments | https://api.github.com/repos/huggingface/transformers/issues/38063/events | https://github.com/huggingface/transformers/issues/38063 | 3,054,470,178 | I_kwDOCUB6oc62D4Qi | 38,063 | Adding native support to load GGUF models using transformers | {
"login": "sleepingcat4",
"id": 81933585,
"node_id": "MDQ6VXNlcjgxOTMzNTg1",
"avatar_url": "https://avatars.githubusercontent.com/u/81933585?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sleepingcat4",
"html_url": "https://github.com/sleepingcat4",
"followers_url": "https://api.github.com/users/sleepingcat4/followers",
"following_url": "https://api.github.com/users/sleepingcat4/following{/other_user}",
"gists_url": "https://api.github.com/users/sleepingcat4/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sleepingcat4/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sleepingcat4/subscriptions",
"organizations_url": "https://api.github.com/users/sleepingcat4/orgs",
"repos_url": "https://api.github.com/users/sleepingcat4/repos",
"events_url": "https://api.github.com/users/sleepingcat4/events{/privacy}",
"received_events_url": "https://api.github.com/users/sleepingcat4/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | null | [] | 2025-05-10T19:02:53 | 2025-07-31T08:21:08 | null | NONE | null | null | null | null | ### Feature request
GGUF recently became a popular model format that can be used to load ridiculously large models in 2x3090 (48GB) cards. While loading the original models are still better and HF supports 4bit and 8bit but those quant aren't much practical for very large models unlike GGUF.
Llama CPP exists and its quite robust but it's not as flexible as native HF and it does not support batch inference. https://huggingface.co/docs/hub/en/gguf
### Motivation
Actually same as what I described in feature request. I think GGUF is tremendous value and it should included in HF natively. HF supporting the format but doesn't allow it to be loaded using HF is kinda bad.
### Your contribution
I can help with some of the support adding part in my free time. | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38063/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38063/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/38062 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38062/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38062/comments | https://api.github.com/repos/huggingface/transformers/issues/38062/events | https://github.com/huggingface/transformers/pull/38062 | 3,054,457,959 | PR_kwDOCUB6oc6VtW-d | 38,062 | Fix broken example generation script for Llama3 | {
"login": "sarckk",
"id": 48474650,
"node_id": "MDQ6VXNlcjQ4NDc0NjUw",
"avatar_url": "https://avatars.githubusercontent.com/u/48474650?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sarckk",
"html_url": "https://github.com/sarckk",
"followers_url": "https://api.github.com/users/sarckk/followers",
"following_url": "https://api.github.com/users/sarckk/following{/other_user}",
"gists_url": "https://api.github.com/users/sarckk/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sarckk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sarckk/subscriptions",
"organizations_url": "https://api.github.com/users/sarckk/orgs",
"repos_url": "https://api.github.com/users/sarckk/repos",
"events_url": "https://api.github.com/users/sarckk/events{/privacy}",
"received_events_url": "https://api.github.com/users/sarckk/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-10T18:49:41 | 2025-05-20T16:39:37 | 2025-05-20T08:53:43 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38062",
"html_url": "https://github.com/huggingface/transformers/pull/38062",
"diff_url": "https://github.com/huggingface/transformers/pull/38062.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38062.patch",
"merged_at": "2025-05-20T08:53:43"
} | # What does this PR do?
Running this script with Llama3 fails with error:
```
...
--> 316 return _sentencepiece.SentencePieceProcessor_LoadFromFile(self, arg)
TypeError: not a string
```
As mentioned in #30607 this is because Llama3 tokenizer is not based on sentencepiece. The fix is to use `AutoTokenizer` which will use `PreTrainedTokenizerFast` for Llama3. | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38062/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38062/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38061 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38061/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38061/comments | https://api.github.com/repos/huggingface/transformers/issues/38061/events | https://github.com/huggingface/transformers/issues/38061 | 3,054,425,289 | I_kwDOCUB6oc62DtTJ | 38,061 | Weights not initialized correctly when instantiating model with a pretrained backbone | {
"login": "matteot11",
"id": 15927868,
"node_id": "MDQ6VXNlcjE1OTI3ODY4",
"avatar_url": "https://avatars.githubusercontent.com/u/15927868?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/matteot11",
"html_url": "https://github.com/matteot11",
"followers_url": "https://api.github.com/users/matteot11/followers",
"following_url": "https://api.github.com/users/matteot11/following{/other_user}",
"gists_url": "https://api.github.com/users/matteot11/gists{/gist_id}",
"starred_url": "https://api.github.com/users/matteot11/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/matteot11/subscriptions",
"organizations_url": "https://api.github.com/users/matteot11/orgs",
"repos_url": "https://api.github.com/users/matteot11/repos",
"events_url": "https://api.github.com/users/matteot11/events{/privacy}",
"received_events_url": "https://api.github.com/users/matteot11/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-10T18:04:19 | 2025-07-03T08:02:48 | 2025-07-03T08:02:48 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.51.3
- Platform: macOS-14.4.1-arm64-arm-64bit
- Python version: 3.9.19
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.4.3
- Accelerate version: 1.2.1
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.7.0 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
### Who can help?
@amyeroberts
@qubvel
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
I am trying to load a Mask2Former model with a pretrained backbone following [this PR](https://github.com/huggingface/transformers/pull/28214).
However, the backbone weights do not appear to be properly initialized when using ```use_pretrained_backbone=True``` in the config. Here's a minimal example:
```
from transformers import (
SwinForImageClassification,
Mask2FormerForUniversalSegmentation,
Mask2FormerConfig,
)
swin_model_name = "microsoft/swin-tiny-patch4-window7-224"
def params_match(params1, params2):
return all([(p1 == p2).all() for p1, p2 in zip(params1, params2)])
# load pretrained swin model
swin_model = SwinForImageClassification.from_pretrained(swin_model_name)
# load Mask2Former with a pretrained swin backbone
config = Mask2FormerConfig(
backbone=swin_model_name,
use_pretrained_backbone=True,
)
m2f = Mask2FormerForUniversalSegmentation(config)
# AssertionError: parameters don't match
assert params_match(
swin_model.base_model.encoder.parameters(),
m2f.model.pixel_level_module.encoder.encoder.parameters(),
)
```
The Swin parameters in Mask2Former do not match those from the separately loaded Swin model, suggesting the backbone was not properly initialized.
However, if I explicitly load the backbone via ```load_backbone``` function, the parameters do match:
```
from transformers.utils.backbone_utils import load_backbone
m2f.model.pixel_level_module.encoder = load_backbone(config)
# Now passes
assert params_match(
swin_model.base_model.encoder.parameters(),
m2f.model.pixel_level_module.encoder.encoder.parameters(),
)
```
Could this be caused by the ```post_init()``` method being called during the instantiation of Mask2Former, even if a pretrained backbone is being loaded?
### Expected behavior
As mentioned before, the backbone should be correctly initialized when specifying ```use_pretrained_backbone=True``` in the config. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38061/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38061/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38060 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38060/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38060/comments | https://api.github.com/repos/huggingface/transformers/issues/38060/events | https://github.com/huggingface/transformers/pull/38060 | 3,054,005,813 | PR_kwDOCUB6oc6Vr01N | 38,060 | Improved cache docs | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-10T10:33:49 | 2025-05-26T13:53:41 | 2025-05-26T13:53:41 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38060",
"html_url": "https://github.com/huggingface/transformers/pull/38060",
"diff_url": "https://github.com/huggingface/transformers/pull/38060.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38060.patch",
"merged_at": "2025-05-26T13:53:41"
} | Improved documentation for cache. Draft to be improved with the per-layer cache refactor.
EDIT: submitting this for review since the per-layer refactor is a thing on its own. Happy to merge this small improvement. | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38060/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 2,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38060/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38059 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38059/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38059/comments | https://api.github.com/repos/huggingface/transformers/issues/38059/events | https://github.com/huggingface/transformers/pull/38059 | 3,053,964,484 | PR_kwDOCUB6oc6VrsAF | 38,059 | Add cuda graphs | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-10T09:38:03 | 2025-05-12T14:58:04 | 2025-05-12T14:58:01 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38059",
"html_url": "https://github.com/huggingface/transformers/pull/38059",
"diff_url": "https://github.com/huggingface/transformers/pull/38059.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38059.patch",
"merged_at": "2025-05-12T14:58:01"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38059/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38059/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38058 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38058/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38058/comments | https://api.github.com/repos/huggingface/transformers/issues/38058/events | https://github.com/huggingface/transformers/pull/38058 | 3,053,958,720 | PR_kwDOCUB6oc6Vrqt8 | 38,058 | [SAM-HQ] Update names in the docs | {
"login": "NielsRogge",
"id": 48327001,
"node_id": "MDQ6VXNlcjQ4MzI3MDAx",
"avatar_url": "https://avatars.githubusercontent.com/u/48327001?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NielsRogge",
"html_url": "https://github.com/NielsRogge",
"followers_url": "https://api.github.com/users/NielsRogge/followers",
"following_url": "https://api.github.com/users/NielsRogge/following{/other_user}",
"gists_url": "https://api.github.com/users/NielsRogge/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NielsRogge/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NielsRogge/subscriptions",
"organizations_url": "https://api.github.com/users/NielsRogge/orgs",
"repos_url": "https://api.github.com/users/NielsRogge/repos",
"events_url": "https://api.github.com/users/NielsRogge/events{/privacy}",
"received_events_url": "https://api.github.com/users/NielsRogge/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-10T09:33:11 | 2025-05-19T16:21:14 | 2025-05-19T16:21:14 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38058",
"html_url": "https://github.com/huggingface/transformers/pull/38058",
"diff_url": "https://github.com/huggingface/transformers/pull/38058.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38058.patch",
"merged_at": "2025-05-19T16:21:14"
} | # What does this PR do?
This PR updates the `repo_id` used in the docs of SAM-HQ (as it was updated to https://huggingface.co/syscv-community/sam-hq-vit-base)
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38058/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38058/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38057 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38057/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38057/comments | https://api.github.com/repos/huggingface/transformers/issues/38057/events | https://github.com/huggingface/transformers/pull/38057 | 3,053,870,116 | PR_kwDOCUB6oc6VrXhc | 38,057 | docs: fix md style | {
"login": "imba-tjd",
"id": 24759802,
"node_id": "MDQ6VXNlcjI0NzU5ODAy",
"avatar_url": "https://avatars.githubusercontent.com/u/24759802?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/imba-tjd",
"html_url": "https://github.com/imba-tjd",
"followers_url": "https://api.github.com/users/imba-tjd/followers",
"following_url": "https://api.github.com/users/imba-tjd/following{/other_user}",
"gists_url": "https://api.github.com/users/imba-tjd/gists{/gist_id}",
"starred_url": "https://api.github.com/users/imba-tjd/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/imba-tjd/subscriptions",
"organizations_url": "https://api.github.com/users/imba-tjd/orgs",
"repos_url": "https://api.github.com/users/imba-tjd/repos",
"events_url": "https://api.github.com/users/imba-tjd/events{/privacy}",
"received_events_url": "https://api.github.com/users/imba-tjd/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-10T07:49:11 | 2025-05-12T14:56:32 | 2025-05-12T14:56:32 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38057",
"html_url": "https://github.com/huggingface/transformers/pull/38057",
"diff_url": "https://github.com/huggingface/transformers/pull/38057.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38057.patch",
"merged_at": "2025-05-12T14:56:31"
} | trivial

| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38057/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38057/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38056 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38056/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38056/comments | https://api.github.com/repos/huggingface/transformers/issues/38056/events | https://github.com/huggingface/transformers/issues/38056 | 3,053,840,078 | I_kwDOCUB6oc62BebO | 38,056 | Qwen/Qwen2.5-VL-7B-Instruct not work [2025-05-10] | {
"login": "kekxv",
"id": 15551108,
"node_id": "MDQ6VXNlcjE1NTUxMTA4",
"avatar_url": "https://avatars.githubusercontent.com/u/15551108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kekxv",
"html_url": "https://github.com/kekxv",
"followers_url": "https://api.github.com/users/kekxv/followers",
"following_url": "https://api.github.com/users/kekxv/following{/other_user}",
"gists_url": "https://api.github.com/users/kekxv/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kekxv/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kekxv/subscriptions",
"organizations_url": "https://api.github.com/users/kekxv/orgs",
"repos_url": "https://api.github.com/users/kekxv/repos",
"events_url": "https://api.github.com/users/kekxv/events{/privacy}",
"received_events_url": "https://api.github.com/users/kekxv/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-10T07:10:21 | 2025-05-12T10:14:05 | 2025-05-12T10:14:05 | NONE | null | null | null | null | ### System Info
work for `pip install git+https://github.com/huggingface/transformers@7a3e208892c06a5e278144eaf38c8599a42f53e7`
not work `main`
https://github.com/QwenLM/Qwen2.5-VL/issues/1192
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
https://github.com/QwenLM/Qwen2.5-VL/issues/1192
### Expected behavior
work qwen2.5-vl | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38056/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38056/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38055 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38055/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38055/comments | https://api.github.com/repos/huggingface/transformers/issues/38055/events | https://github.com/huggingface/transformers/pull/38055 | 3,053,289,147 | PR_kwDOCUB6oc6VpW-u | 38,055 | SQuat cache implementation | {
"login": "phymhan",
"id": 6815830,
"node_id": "MDQ6VXNlcjY4MTU4MzA=",
"avatar_url": "https://avatars.githubusercontent.com/u/6815830?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/phymhan",
"html_url": "https://github.com/phymhan",
"followers_url": "https://api.github.com/users/phymhan/followers",
"following_url": "https://api.github.com/users/phymhan/following{/other_user}",
"gists_url": "https://api.github.com/users/phymhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/phymhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/phymhan/subscriptions",
"organizations_url": "https://api.github.com/users/phymhan/orgs",
"repos_url": "https://api.github.com/users/phymhan/repos",
"events_url": "https://api.github.com/users/phymhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/phymhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-05-09T22:13:40 | 2025-07-11T15:21:10 | null | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38055",
"html_url": "https://github.com/huggingface/transformers/pull/38055",
"diff_url": "https://github.com/huggingface/transformers/pull/38055.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38055.patch",
"merged_at": null
} | # What does this PR do?
This PR implements our recent work **SQuat** ([arXiv:2503.24358](https://arxiv.org/abs/2503.24358)) for **KV cache quantization** in Transformers. SQuat introduces a method for orthogonally projecting keys to a query subspace, improving the accuracy of quantized attention computation.
Key changes:
- Added a new `cache_implementation="squat"` option for KV cache, in parallel to existing `"quantized"`, with support for both `quanto` and `HQQ` backends.
- Introduced offline or prefilling-time computation of a query subspace, which requires query states. To enable this, I modified the LLaMA model class (as an example) to pass `query_states` and `attention_mask` through `cache_kwargs`.
- Fixed a bug in the prefilling stage when using per-channel quantization and the sequence length is not a multiple of or shorter than `residual_length`.
## Evaluation Results
We evaluated the testing perplexity under different KV cache implementations using a script adapted from the [Hugging Face blog on KV cache quantization](https://huggingface.co/blog/kv-cache-quantization).

## Before submitting
- [x] This PR implements a new feature based on published research.
- [x] I have read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request).
- [x] I have updated the documentation as needed.
- [x] I have written tests for the added functionality.
## Who can review?
- KV cache / quantization: @SunMarc @MekkCyber
- LLaMA model integration: @ArthurZucker
- General review or documentation: @stevhliu | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38055/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38055/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38054 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38054/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38054/comments | https://api.github.com/repos/huggingface/transformers/issues/38054/events | https://github.com/huggingface/transformers/pull/38054 | 3,053,200,004 | PR_kwDOCUB6oc6VpDXW | 38,054 | fix-mask2former-overlapping-annotations | {
"login": "Ahmed-G-ElTaher",
"id": 124341899,
"node_id": "U_kgDOB2lOiw",
"avatar_url": "https://avatars.githubusercontent.com/u/124341899?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Ahmed-G-ElTaher",
"html_url": "https://github.com/Ahmed-G-ElTaher",
"followers_url": "https://api.github.com/users/Ahmed-G-ElTaher/followers",
"following_url": "https://api.github.com/users/Ahmed-G-ElTaher/following{/other_user}",
"gists_url": "https://api.github.com/users/Ahmed-G-ElTaher/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Ahmed-G-ElTaher/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Ahmed-G-ElTaher/subscriptions",
"organizations_url": "https://api.github.com/users/Ahmed-G-ElTaher/orgs",
"repos_url": "https://api.github.com/users/Ahmed-G-ElTaher/repos",
"events_url": "https://api.github.com/users/Ahmed-G-ElTaher/events{/privacy}",
"received_events_url": "https://api.github.com/users/Ahmed-G-ElTaher/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-05-09T21:12:05 | 2025-08-27T02:05:21 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38054",
"html_url": "https://github.com/huggingface/transformers/pull/38054",
"diff_url": "https://github.com/huggingface/transformers/pull/38054.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38054.patch",
"merged_at": null
} | ## Pull Request: Add Support for Handling Overlapping Annotations in Mask2Former
### Problem
The current image processing pipeline for Mask2Former doesn't handle overlapping annotations correctly. When annotations overlap, the processing order is arbitrary, causing larger objects to sometimes be overwritten by smaller ones. This can lead to information loss and incorrect segmentation masks.
### Solution
This PR introduces a new function, `convert_segmentation_map_to_binary_masks_sorted`, that addresses this issue by processing object instances in descending order of their area. This ensures that smaller objects consistently appear on top of larger objects, preserving all annotation information in overlapping regions.
The implementation draws inspiration from the approach used in the `coco2masks` function, which sorts annotations by area before processing them. This enhancement is particularly beneficial for datasets with significant object overlaps.
### Changes
A new function `convert_segmentation_map_to_binary_masks_sorted` has been added with the following features:
- Accepts an optional `sort_by_area` parameter (default: `True`).
- Calculates the area of each instance when sorting is enabled.
- Processes instances in descending order of their area.
This new function can serve as a direct replacement for the existing conversion function, providing improved handling of overlapping regions.
### Testing
- Tested with sample COCO annotations containing overlapping objects.
- Verified that all objects are correctly represented in the output masks.
- Confirmed compatibility with existing Mask2Former models.
### Performance Impact
While the area calculation introduces a minor computational overhead, its impact is negligible within the overall processing pipeline. This step occurs during preprocessing rather than during model inference. | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38054/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38054/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38053 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38053/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38053/comments | https://api.github.com/repos/huggingface/transformers/issues/38053/events | https://github.com/huggingface/transformers/issues/38053 | 3,052,974,401 | I_kwDOCUB6oc61-LFB | 38,053 | Attention mask for multi-image input in gemma3 | {
"login": "deval281shah",
"id": 20773043,
"node_id": "MDQ6VXNlcjIwNzczMDQz",
"avatar_url": "https://avatars.githubusercontent.com/u/20773043?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/deval281shah",
"html_url": "https://github.com/deval281shah",
"followers_url": "https://api.github.com/users/deval281shah/followers",
"following_url": "https://api.github.com/users/deval281shah/following{/other_user}",
"gists_url": "https://api.github.com/users/deval281shah/gists{/gist_id}",
"starred_url": "https://api.github.com/users/deval281shah/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/deval281shah/subscriptions",
"organizations_url": "https://api.github.com/users/deval281shah/orgs",
"repos_url": "https://api.github.com/users/deval281shah/repos",
"events_url": "https://api.github.com/users/deval281shah/events{/privacy}",
"received_events_url": "https://api.github.com/users/deval281shah/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-09T19:05:33 | 2025-05-20T15:35:06 | 2025-05-20T15:35:06 | NONE | null | null | null | null | ### System Info
As per the attention mask example in the Gemma3 blog (https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/blog/gemma3/attention-ascii.png), it looks like there is non-causal attention within the image and causal attention across images (i.e., an image does not attend to a future image). However, when running gemma3 generate using transformers (v4.51.3), looks like there is non-causal attention across images.
```
import torch
from transformers import AutoProcessor, Gemma3ForConditionalGeneration
import os
import pickle
import torch._dynamo
torch._dynamo.config.suppress_errors = True
ckpt = "google/gemma-3-4b-it"
model = Gemma3ForConditionalGeneration.from_pretrained(
ckpt, device_map="auto", torch_dtype=torch.float32,
)
processor = AutoProcessor.from_pretrained(ckpt)
messages = [
{
"role": "user",
"content": [
{"type": "text", "text": "First image: "},
{"type": "image", "path": "img1.jpg"},
{"type": "text", "text": "Second image:"},
{"type": "image", "path": "img2.jpg"},
{"type": "text", "text": "Describe all images in single sentence."}
]
}
]
inputs = processor.apply_chat_template(
messages, add_generation_prompt=True, tokenize=True,
return_dict=True, return_tensors="pt"
).to(model.device)
generation = model.generate(**inputs, max_new_tokens=1, return_dict_in_generate=True,output_attentions=True , do_sample=False)
```

### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Code attached
### Expected behavior
Should the attention across image be non-causal or causal?
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38053/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38053/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38052 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38052/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38052/comments | https://api.github.com/repos/huggingface/transformers/issues/38052/events | https://github.com/huggingface/transformers/issues/38052 | 3,052,965,570 | I_kwDOCUB6oc61-I7C | 38,052 | `.to` on a `PreTrainedModel` throws a Pyright type check error. What is the correct way to put a model to the device that does not throw type check errors? | {
"login": "nickeisenberg",
"id": 64921400,
"node_id": "MDQ6VXNlcjY0OTIxNDAw",
"avatar_url": "https://avatars.githubusercontent.com/u/64921400?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nickeisenberg",
"html_url": "https://github.com/nickeisenberg",
"followers_url": "https://api.github.com/users/nickeisenberg/followers",
"following_url": "https://api.github.com/users/nickeisenberg/following{/other_user}",
"gists_url": "https://api.github.com/users/nickeisenberg/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nickeisenberg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nickeisenberg/subscriptions",
"organizations_url": "https://api.github.com/users/nickeisenberg/orgs",
"repos_url": "https://api.github.com/users/nickeisenberg/repos",
"events_url": "https://api.github.com/users/nickeisenberg/events{/privacy}",
"received_events_url": "https://api.github.com/users/nickeisenberg/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-09T19:01:15 | 2025-06-29T08:03:07 | 2025-06-29T08:03:07 | NONE | null | null | null | null | ### System Info
(venv) nicholas@B367309:tmp(master)$ transformers-cli env
Copy-and-paste the text below in your GitHub issue and FILL OUT the two last points.
- `transformers` version: 4.51.1
- Platform: Linux-5.10.16.3-microsoft-standard-WSL2-x86_64-with-glibc2.39
- Python version: 3.12.3
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu126 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA RTX 2000 Ada Generation Laptop GPU
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Here is a small snippet
```python
from transformers.models.auto.modeling_auto import AutoModelForCausalLM
from transformers.models.llama.modeling_llama import LlamaForCausalLM
model = AutoModelForCausalLM.from_pretrained(
"deepseek-ai/deepseek-coder-1.3b-instruct", torch_dtype=torch.float16
)
assert isinstance(model, LlamaForCausalLM)
model.to("cuda:0")
```
This code runs fine and correctly puts the model to the device, however, `Pyright` throws a pre-runtime type check error on the `model.to("cuda:0") call. This is the error,
```plaintext
Pyright: Argument of type "Literal['cuda:0']" cannot be assigned to parameter "self" of
type "LlamaForCausalLM" in function "__call__".
"Literal['cuda:0']" is not assignable to "LlamaForCausalLM" [reportArgumentType]
```
What is the correct way to put a model to the device that will satisfy the type checker?
### Expected behavior
There should be know static type check error when doing `model.to(<device>)` | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38052/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38052/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38051 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38051/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38051/comments | https://api.github.com/repos/huggingface/transformers/issues/38051/events | https://github.com/huggingface/transformers/pull/38051 | 3,052,878,782 | PR_kwDOCUB6oc6Vn92H | 38,051 | [VLM] fix loading issues | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-09T18:17:50 | 2025-05-12T10:14:04 | 2025-05-12T10:14:04 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38051",
"html_url": "https://github.com/huggingface/transformers/pull/38051",
"diff_url": "https://github.com/huggingface/transformers/pull/38051.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38051.patch",
"merged_at": "2025-05-12T10:14:04"
} | # What does this PR do?
As per title, the recent refactor broke loading for some models. The tests were ran before the last commit when a restriction on model names was added, thus the issue went unnoticed while merging 🥲
fixes https://github.com/huggingface/transformers/issues/38056, and fixes https://github.com/huggingface/transformers/issues/38070 | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38051/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38051/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38050 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38050/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38050/comments | https://api.github.com/repos/huggingface/transformers/issues/38050/events | https://github.com/huggingface/transformers/issues/38050 | 3,052,716,004 | I_kwDOCUB6oc619L_k | 38,050 | Removing GenerateMixin inheritance from PreTrainedModel class results in Phi4 load fail | {
"login": "yatindrav",
"id": 28916236,
"node_id": "MDQ6VXNlcjI4OTE2MjM2",
"avatar_url": "https://avatars.githubusercontent.com/u/28916236?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yatindrav",
"html_url": "https://github.com/yatindrav",
"followers_url": "https://api.github.com/users/yatindrav/followers",
"following_url": "https://api.github.com/users/yatindrav/following{/other_user}",
"gists_url": "https://api.github.com/users/yatindrav/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yatindrav/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yatindrav/subscriptions",
"organizations_url": "https://api.github.com/users/yatindrav/orgs",
"repos_url": "https://api.github.com/users/yatindrav/repos",
"events_url": "https://api.github.com/users/yatindrav/events{/privacy}",
"received_events_url": "https://api.github.com/users/yatindrav/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-09T16:58:40 | 2025-05-26T07:34:38 | 2025-05-10T05:14:14 | NONE | null | null | null | null | ### System Info
transformers==4.52.0.dev0 (Top of the Trunk)
Platform: Rocky Linux
Traceback (most recent call last):
File ".local/lib/python3.9/site-packages/peft/tuners/lora/model.py", line 359, in __getattr__
return super().__getattr__(name) # defer to nn.Module's logic
File ".local/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1928, in __getattr__
raise AttributeError(
AttributeError: 'LoraModel' object has no attribute 'prepare_inputs_for_generation'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "nn/phiexp/phi4v1.py", line 9, in <module>
model = AutoModelForCausalLM.from_pretrained(
File ".local/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
return model_class.from_pretrained(
File ".local/lib/python3.9/site-packages/transformers/modeling_utils.py", line 303, in _wrapper
return func(*args, **kwargs)
File ".local/lib/python3.9/site-packages/transformers/modeling_utils.py", line 4504, in from_pretrained
model = cls(config, *model_args, **model_kwargs)
File ".cache/huggingface/modules/transformers_modules/microsoft/Phi-4-multimodal-instruct/33e62acdd07cd7d6635badd529aa0a3467bb9c6a/modeling_phi4mm.py", line 1962, in __init__
peft_model = get_peft_model(self.model, vision_lora_config, adapter_name="vision")
File ".local/lib/python3.9/site-packages/peft/mapping_func.py", line 123, in get_peft_model
return MODEL_TYPE_TO_PEFT_MODEL_MAPPING[peft_config.task_type](
File ".local/lib/python3.9/site-packages/peft/peft_model.py", line 1723, in __init__
self.base_model_prepare_inputs_for_generation = self.base_model.prepare_inputs_for_generation
File ".local/lib/python3.9/site-packages/peft/tuners/lora/model.py", line 363, in __getattr__
return getattr(self.model, name)
File ".local/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1928, in __getattr__
raise AttributeError(
AttributeError: 'Phi4MMModel' object has no attribute 'prepare_inputs_for_generation'
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Steps to reproduce:
1. Install transformers top of the trunk and related components
2. Now run any official script
3. Issue will be reproduced (model load will fail with above mentioned trace).
### Expected behavior
It should be loaded without any issue. | {
"login": "yatindrav",
"id": 28916236,
"node_id": "MDQ6VXNlcjI4OTE2MjM2",
"avatar_url": "https://avatars.githubusercontent.com/u/28916236?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yatindrav",
"html_url": "https://github.com/yatindrav",
"followers_url": "https://api.github.com/users/yatindrav/followers",
"following_url": "https://api.github.com/users/yatindrav/following{/other_user}",
"gists_url": "https://api.github.com/users/yatindrav/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yatindrav/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yatindrav/subscriptions",
"organizations_url": "https://api.github.com/users/yatindrav/orgs",
"repos_url": "https://api.github.com/users/yatindrav/repos",
"events_url": "https://api.github.com/users/yatindrav/events{/privacy}",
"received_events_url": "https://api.github.com/users/yatindrav/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38050/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38050/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38049 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38049/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38049/comments | https://api.github.com/repos/huggingface/transformers/issues/38049/events | https://github.com/huggingface/transformers/pull/38049 | 3,052,650,009 | PR_kwDOCUB6oc6VnMRF | 38,049 | Better pipeline type hints ✨ | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8882772041,
"node_id": "LA_kwDOCUB6oc8AAAACEXRYSQ",
"url": "https://api.github.com/repos/huggingface/transformers/labels/typing",
"name": "typing",
"color": "DBA272",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-05-09T16:30:29 | 2025-07-17T11:28:55 | 2025-06-13T12:44:07 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38049",
"html_url": "https://github.com/huggingface/transformers/pull/38049",
"diff_url": "https://github.com/huggingface/transformers/pull/38049.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38049.patch",
"merged_at": "2025-06-13T12:44:07"
} | # What does this PR do?
With the power of `@overload` this PR adds several features to make our pipelines more IDE-friendly (and as a bonus more Curosor, Copilot and autocompletion friendly).
### 1. Correctly resolve pipeline `type` in case `task` is specified for pipeline function

As a result, we have **pipeline-specific** docs and method signatures fetched by IDE
<img width="600" alt="Screenshot 2025-05-09 at 17 09 12" src="https://github.com/user-attachments/assets/0e774621-7505-4a52-829f-30fffaac346c" />
### 2. Resolve ambiguities regarding the input and output types for pipelines
For example,
- if we provide one image, we will get a dict output
- if we provide multiple images we will get list of dict outputs

The part of the code is autogenerated and added to the `pipelines/__init__.py` + added CI check / fix | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38049/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38049/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38048 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38048/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38048/comments | https://api.github.com/repos/huggingface/transformers/issues/38048/events | https://github.com/huggingface/transformers/pull/38048 | 3,052,527,739 | PR_kwDOCUB6oc6VmyMi | 38,048 | Fix decoder_bbox_embed bug and add test for GroundingDINO issue37333 | {
"login": "islemyakoubi",
"id": 79903595,
"node_id": "MDQ6VXNlcjc5OTAzNTk1",
"avatar_url": "https://avatars.githubusercontent.com/u/79903595?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/islemyakoubi",
"html_url": "https://github.com/islemyakoubi",
"followers_url": "https://api.github.com/users/islemyakoubi/followers",
"following_url": "https://api.github.com/users/islemyakoubi/following{/other_user}",
"gists_url": "https://api.github.com/users/islemyakoubi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/islemyakoubi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/islemyakoubi/subscriptions",
"organizations_url": "https://api.github.com/users/islemyakoubi/orgs",
"repos_url": "https://api.github.com/users/islemyakoubi/repos",
"events_url": "https://api.github.com/users/islemyakoubi/events{/privacy}",
"received_events_url": "https://api.github.com/users/islemyakoubi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-09T15:30:56 | 2025-05-12T16:49:42 | 2025-05-12T16:49:42 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38048",
"html_url": "https://github.com/huggingface/transformers/pull/38048",
"diff_url": "https://github.com/huggingface/transformers/pull/38048.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38048.patch",
"merged_at": null
} | ## What does this PR do?
- Fixes a shallow‐copy bug in `GroundingDinoMLPPredictionHead` when `config.decoder_bbox_embed_share=False`:
- Previously all decoder layers shared the same head instance.
- Now each layer builds its own `GroundingDinoMLPPredictionHead` and wraps them in a `nn.ModuleList`.
- Adds a new smoke test script at `tests/models/grounding_dino/test_grounding_dino.py` that:
- Loads `IDEA-Research/grounding-dino-base` via the HF processor and model.
- Runs a simple inference on the Lena image with prompt `"a face"`.
- Prints out the shapes and a sample of `logits` and `pred_boxes` on GPU to verify end-to-end functionality.
## Why is this needed?
Without this change, all decoder layers share the same bbox‐head instance, which breaks fine-tuning and multi-layer refinements.
## Who to tag?
Vision models: @ArthurZucker @Rocketknight1,
| {
"login": "islemyakoubi",
"id": 79903595,
"node_id": "MDQ6VXNlcjc5OTAzNTk1",
"avatar_url": "https://avatars.githubusercontent.com/u/79903595?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/islemyakoubi",
"html_url": "https://github.com/islemyakoubi",
"followers_url": "https://api.github.com/users/islemyakoubi/followers",
"following_url": "https://api.github.com/users/islemyakoubi/following{/other_user}",
"gists_url": "https://api.github.com/users/islemyakoubi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/islemyakoubi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/islemyakoubi/subscriptions",
"organizations_url": "https://api.github.com/users/islemyakoubi/orgs",
"repos_url": "https://api.github.com/users/islemyakoubi/repos",
"events_url": "https://api.github.com/users/islemyakoubi/events{/privacy}",
"received_events_url": "https://api.github.com/users/islemyakoubi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38048/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38048/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38047 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38047/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38047/comments | https://api.github.com/repos/huggingface/transformers/issues/38047/events | https://github.com/huggingface/transformers/pull/38047 | 3,052,445,112 | PR_kwDOCUB6oc6Vmf8v | 38,047 | [`chat`] generate parameterization powered by `GenerationConfig` and UX-related changes | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-09T14:59:58 | 2025-05-12T13:04:45 | 2025-05-12T13:04:42 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38047",
"html_url": "https://github.com/huggingface/transformers/pull/38047",
"diff_url": "https://github.com/huggingface/transformers/pull/38047.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38047.patch",
"merged_at": "2025-05-12T13:04:42"
} | # What does this PR do?
The main goal of this PR is to enable user-friendly `generate` parameterization. This also facilitates performance-related customization, which will be the focus of a follow-up PR.
After the deprecation cycle, new users typing `transformers chat -h` will be redirected to a (new) [docs intro section to generation arguments](https://moon-ci-docs.huggingface.co/docs/transformers/pr_38047/en/llm_tutorial#common-options), instead of seeing a wall of CLI arguments. For `transformers` power users, `chat` is now usable and can be parameterized without prior knowledge about the CLI. These changes were inspired by the idea that [CLIs should be a conversation](https://clig.dev/#conversation-as-the-norm) and that [they shouldn't drown users in information](https://clig.dev/#saying-just-enough)
More specifically, with this PR:
1. We can accept almost any `generate` flag as a positional argument, present and future, as opposed to being limited to a set of hardcoded flags;
2. We can pass a `generation_config.json`, for power users to pass complex `generate` arguments that may be difficult to specify in a CLI;
3. User chat commands are clearly distinguished from potential chat entries -- they now start with `!`
4. `!status`, a new command, can be used to print state-related information, such as the current `generate` flags
5. `!set` can now be used to set arbitrary `generate` flags
6. `!reset` was removed -- it was providing minimal benefits (relaunching the CLI with the previous command is the same) but it was requiring us to maintain and pass the input state around
7. help is now printed if there is a typo in a user command (e.g. `!stats` -> not a valid command -> prints error and help)
8. (non-chat specific) There is a new [intro section to `generate` args in the docs](https://moon-ci-docs.huggingface.co/docs/transformers/pr_38047/en/llm_tutorial#common-options), allowing a soft-landing into the parameterization of the text generation universe
Example usage:
```
transformers chat Qwen/Qwen2.5-0.5B-Instruct do_sample=False max_new_tokens=10
``` | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38047/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38047/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38046 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38046/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38046/comments | https://api.github.com/repos/huggingface/transformers/issues/38046/events | https://github.com/huggingface/transformers/pull/38046 | 3,052,405,485 | PR_kwDOCUB6oc6VmXQB | 38,046 | Fix cache update! | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-09T14:46:08 | 2025-05-20T14:37:47 | 2025-05-09T15:54:48 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38046",
"html_url": "https://github.com/huggingface/transformers/pull/38046",
"diff_url": "https://github.com/huggingface/transformers/pull/38046.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38046.patch",
"merged_at": "2025-05-09T15:54:48"
} | # What does this PR do?
As per the title. https://github.com/huggingface/transformers/pull/37873 broke the cache update when going beyond the sliding window, see my comment [here](https://github.com/huggingface/transformers/pull/37873#discussion_r2081837193).
This PR fixes it.
This also incorporates the issue mentioned in https://github.com/huggingface/transformers/issues/37574! TLDR, the order of operations here is important as we check strict inequality!!
Correcteness can be verified with
```python
model_id = "google/gemma-2-9b-it"
device = 0
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map=device)
chat1 = [ # This size + the new tokens is > than sliding window
{"role": "user", "content": "This is a nice place. " * 675 + "\n\nForget about the previous text, and tell me who you are?"},
]
prompt1 = tokenizer.apply_chat_template(chat1, tokenize=False, add_generation_prompt=True)
chat2 = [
{"role": "user", "content": "create a list of at least 10 colors please"},
]
prompt2 = tokenizer.apply_chat_template(chat2, tokenize=False, add_generation_prompt=True)
inputs = tokenizer([prompt1, prompt2], padding=True, return_tensors="pt").to(0 if device == "auto" else device)
print(f"Sliding window: {getattr(model.config, 'sliding_window', None)}")
print(f"Input size: {inputs.input_ids.shape}")
# print(inputs.keys())
cache = "hybrid"
compile_config = CompileConfig(fullgraph=False)
out = model.generate(**inputs, do_sample=False, max_new_tokens=100, cache_implementation=cache, compile_config=compile_config)
text = tokenizer.batch_decode(out[:, inputs.input_ids.shape[-1] :], skip_special_tokens=False)
print("\n\n")
for seq in text:
print("NEW SEQ:")
print(seq)
```
It use to generate correctly for both the sequence > sliding window and the padded sequence, and now generates very badly. This fixes it once and for all.
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38046/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38046/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38045 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38045/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38045/comments | https://api.github.com/repos/huggingface/transformers/issues/38045/events | https://github.com/huggingface/transformers/pull/38045 | 3,052,277,960 | PR_kwDOCUB6oc6Vl8l5 | 38,045 | [fix] sliding window attention mask | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-09T14:02:30 | 2025-05-20T09:32:19 | 2025-05-20T09:32:19 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38045",
"html_url": "https://github.com/huggingface/transformers/pull/38045",
"diff_url": "https://github.com/huggingface/transformers/pull/38045.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38045.patch",
"merged_at": "2025-05-20T09:32:19"
} | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/38002. Sliding mask should not be applied when `config.use_sliding_window = False`
Added a small test as well
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38045/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 2,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38045/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38044 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38044/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38044/comments | https://api.github.com/repos/huggingface/transformers/issues/38044/events | https://github.com/huggingface/transformers/pull/38044 | 3,052,195,651 | PR_kwDOCUB6oc6VlqsW | 38,044 | check github actions 3 | {
"login": "ydshieh2",
"id": 183479141,
"node_id": "U_kgDOCu-rZQ",
"avatar_url": "https://avatars.githubusercontent.com/u/183479141?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh2",
"html_url": "https://github.com/ydshieh2",
"followers_url": "https://api.github.com/users/ydshieh2/followers",
"following_url": "https://api.github.com/users/ydshieh2/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh2/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh2/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh2/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh2/orgs",
"repos_url": "https://api.github.com/users/ydshieh2/repos",
"events_url": "https://api.github.com/users/ydshieh2/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh2/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-09T13:31:35 | 2025-05-28T21:12:19 | 2025-05-28T21:12:18 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38044",
"html_url": "https://github.com/huggingface/transformers/pull/38044",
"diff_url": "https://github.com/huggingface/transformers/pull/38044.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38044.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ydshieh2",
"id": 183479141,
"node_id": "U_kgDOCu-rZQ",
"avatar_url": "https://avatars.githubusercontent.com/u/183479141?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh2",
"html_url": "https://github.com/ydshieh2",
"followers_url": "https://api.github.com/users/ydshieh2/followers",
"following_url": "https://api.github.com/users/ydshieh2/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh2/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh2/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh2/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh2/orgs",
"repos_url": "https://api.github.com/users/ydshieh2/repos",
"events_url": "https://api.github.com/users/ydshieh2/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh2/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38044/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38044/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38043 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38043/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38043/comments | https://api.github.com/repos/huggingface/transformers/issues/38043/events | https://github.com/huggingface/transformers/pull/38043 | 3,052,134,219 | PR_kwDOCUB6oc6Vlc_s | 38,043 | trigger CI | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-09T13:12:28 | 2025-05-09T13:30:30 | 2025-05-09T13:25:37 | COLLABORATOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38043",
"html_url": "https://github.com/huggingface/transformers/pull/38043",
"diff_url": "https://github.com/huggingface/transformers/pull/38043.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38043.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38043/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38043/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38042 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38042/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38042/comments | https://api.github.com/repos/huggingface/transformers/issues/38042/events | https://github.com/huggingface/transformers/pull/38042 | 3,052,130,421 | PR_kwDOCUB6oc6VlcIk | 38,042 | Fix reduce-labels in BEIT Fast Image Processor | {
"login": "simonreise",
"id": 43753582,
"node_id": "MDQ6VXNlcjQzNzUzNTgy",
"avatar_url": "https://avatars.githubusercontent.com/u/43753582?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/simonreise",
"html_url": "https://github.com/simonreise",
"followers_url": "https://api.github.com/users/simonreise/followers",
"following_url": "https://api.github.com/users/simonreise/following{/other_user}",
"gists_url": "https://api.github.com/users/simonreise/gists{/gist_id}",
"starred_url": "https://api.github.com/users/simonreise/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/simonreise/subscriptions",
"organizations_url": "https://api.github.com/users/simonreise/orgs",
"repos_url": "https://api.github.com/users/simonreise/repos",
"events_url": "https://api.github.com/users/simonreise/events{/privacy}",
"received_events_url": "https://api.github.com/users/simonreise/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-09T13:11:23 | 2025-05-09T15:53:38 | 2025-05-09T15:51:47 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38042",
"html_url": "https://github.com/huggingface/transformers/pull/38042",
"diff_url": "https://github.com/huggingface/transformers/pull/38042.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38042.patch",
"merged_at": "2025-05-09T15:51:46"
} | # What does this PR do?
When running `BeitImageProcessorFast` with `do_reduce_labels=True`, it not only reduces labels in segmentation map, but also applies `reduce_label` to the image itself
`reduce_label` should be applied only to segmentation maps
### Test
```py
import transformers
import torch
processor = transformers.BeitImageProcessorFast(
do_resize=False,
do_center_crop=False,
do_rescale=False,
do_normalize=False,
do_convert_rgb=False,
)
image = torch.zeros([1, 3, 256, 256])
segmap = torch.zeros([1, 256, 256])
batch = processor.preprocess(
images = image,
segmentation_maps = segmap,
return_tensors="pt",
do_reduce_labels=True,
)
```
pixel_values will be 255, should be 0.
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@yonigozlan
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38042/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38042/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38041 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38041/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38041/comments | https://api.github.com/repos/huggingface/transformers/issues/38041/events | https://github.com/huggingface/transformers/pull/38041 | 3,051,907,440 | PR_kwDOCUB6oc6VkrVj | 38,041 | Re-Enable `Trigger CircleCI via GitHub Actions when "ready for review" (#37885)` | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-09T11:48:03 | 2025-05-09T14:57:56 | 2025-05-09T14:57:54 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38041",
"html_url": "https://github.com/huggingface/transformers/pull/38041",
"diff_url": "https://github.com/huggingface/transformers/pull/38041.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38041.patch",
"merged_at": "2025-05-09T14:57:54"
} | # What does this PR do?
With `CircleCI-Public/trigger-circleci-pipeline-action@v1.2.0`, we could specify `target-branch` 🎉 | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38041/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38041/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38040 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38040/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38040/comments | https://api.github.com/repos/huggingface/transformers/issues/38040/events | https://github.com/huggingface/transformers/issues/38040 | 3,051,851,905 | I_kwDOCUB6oc6155CB | 38,040 | Modernbert 3D attention mask | {
"login": "meetdoshi-iitb",
"id": 109983579,
"node_id": "U_kgDOBo43Ww",
"avatar_url": "https://avatars.githubusercontent.com/u/109983579?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/meetdoshi-iitb",
"html_url": "https://github.com/meetdoshi-iitb",
"followers_url": "https://api.github.com/users/meetdoshi-iitb/followers",
"following_url": "https://api.github.com/users/meetdoshi-iitb/following{/other_user}",
"gists_url": "https://api.github.com/users/meetdoshi-iitb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/meetdoshi-iitb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/meetdoshi-iitb/subscriptions",
"organizations_url": "https://api.github.com/users/meetdoshi-iitb/orgs",
"repos_url": "https://api.github.com/users/meetdoshi-iitb/repos",
"events_url": "https://api.github.com/users/meetdoshi-iitb/events{/privacy}",
"received_events_url": "https://api.github.com/users/meetdoshi-iitb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | null | [] | 2025-05-09T11:21:56 | 2025-07-16T09:24:53 | null | NONE | null | null | null | null | ### Feature request
Support request for passing a custom 3D attention mask to a modernbert model.
Currently it only supports 2D attention mask of bs, seq_len [modeling_modernbert.py#L859](https://github.com/huggingface/transformers/blob/774dc274ac966f4bccbcd90d55bba23f6cca37ae/src/transformers/models/modernbert/modeling_modernbert.py#L859)
Unlike bert which supports 3D attention mask as well [modeling_bert.py#L988](https://github.com/huggingface/transformers/blob/774dc274ac966f4bccbcd90d55bba23f6cca37ae/src/transformers/models/bert/modeling_bert.py#L988)
### Motivation
Linked issues: [transformers/issues/27640](https://github.com/huggingface/transformers/issues/27640)
### Your contribution
NA | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38040/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38040/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/38039 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38039/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38039/comments | https://api.github.com/repos/huggingface/transformers/issues/38039/events | https://github.com/huggingface/transformers/issues/38039 | 3,051,824,880 | I_kwDOCUB6oc615ybw | 38,039 | Trainer API doesnt stop after the training has been completed | {
"login": "Awaisn25",
"id": 64905568,
"node_id": "MDQ6VXNlcjY0OTA1NTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/64905568?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Awaisn25",
"html_url": "https://github.com/Awaisn25",
"followers_url": "https://api.github.com/users/Awaisn25/followers",
"following_url": "https://api.github.com/users/Awaisn25/following{/other_user}",
"gists_url": "https://api.github.com/users/Awaisn25/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Awaisn25/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Awaisn25/subscriptions",
"organizations_url": "https://api.github.com/users/Awaisn25/orgs",
"repos_url": "https://api.github.com/users/Awaisn25/repos",
"events_url": "https://api.github.com/users/Awaisn25/events{/privacy}",
"received_events_url": "https://api.github.com/users/Awaisn25/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-09T11:09:40 | 2025-06-17T08:02:46 | 2025-06-17T08:02:46 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.51.3
- Platform: Linux-6.11.0-25-generic-x86_64-with-glibc2.39
- Python version: 3.12.3
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu126 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: Trainer API's parallelism
- Using GPU in script?: Yes
- GPU type: Quadro RTX 5000 with Max-Q Design
### Who can help?
@SunMarc @zach-huggingface
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
1. Use `Seq2SeqTrainingArguments`, with `save_total_limit=4`, `batch_szie=12`, `epochs=8`, `push_to_hub=True`. The task was summarization, the model used was google/mt5-small and the dataset was xlsum-en-es
2. Pass the training args to `Seq2SeqTrainer` with custom `compute_metrics`.
### Expected behavior
I followed the summarization topic in Chapter 7 of LLM Course with the dataset of my own choice. The training continued overnight for about 14 hours. After the evaluation was completed for 8th epoch the trainer continued running for another hours. After I force stopped the execution, I got the following error trace:
---------------------------------------------------------------------------
KeyboardInterrupt Traceback (most recent call last)
Cell In[27], line 1
----> 1 trainer.train()
File ~/.pyenv/versions/llm/lib/python3.12/site-packages/transformers/trainer.py:2236, in Trainer.train(self, resume_from_checkpoint, trial, ignore_keys_for_eval, **kwargs)
2233 try:
2234 # Disable progress bars when uploading models during checkpoints to avoid polluting stdout
2235 hf_hub_utils.disable_progress_bars()
-> 2236 return inner_training_loop(
2237 args=args,
2238 resume_from_checkpoint=resume_from_checkpoint,
2239 trial=trial,
2240 ignore_keys_for_eval=ignore_keys_for_eval,
2241 )
2242 finally:
2243 hf_hub_utils.enable_progress_bars()
File ~/.pyenv/versions/llm/lib/python3.12/site-packages/transformers/trainer.py:2728, in Trainer._inner_training_loop(self, batch_size, args, resume_from_checkpoint, trial, ignore_keys_for_eval)
2725 self.control = self.callback_handler.on_train_end(args, self.state, self.control)
2727 # Wait for the checkpoint to be uploaded.
-> 2728 self._finish_current_push()
2730 # After training we make sure to retrieve back the original forward pass method
2731 # for the embedding layer by removing the forward post hook.
2732 if self.neftune_noise_alpha is not None:
File ~/.pyenv/versions/llm/lib/python3.12/site-packages/transformers/trainer.py:4773, in Trainer._finish_current_push(self)
4771 if self.push_in_progress is not None and not self.push_in_progress.is_done():
4772 logger.info("Waiting for the current checkpoint push to be finished, this might take a couple of minutes.")
-> 4773 self.push_in_progress.wait_until_done()
File ~/.pyenv/versions/llm/lib/python3.12/site-packages/transformers/utils/hub.py:1185, in PushInProgress.wait_until_done(self)
1184 def wait_until_done(self):
-> 1185 futures.wait(self.jobs)
File /usr/lib/python3.12/concurrent/futures/_base.py:305, in wait(fs, timeout, return_when)
301 return DoneAndNotDoneFutures(done, not_done)
303 waiter = _create_and_install_waiters(fs, return_when)
--> 305 waiter.event.wait(timeout)
306 for f in fs:
307 with f._condition:
File /usr/lib/python3.12/threading.py:655, in Event.wait(self, timeout)
653 signaled = self._flag
654 if not signaled:
--> 655 signaled = self._cond.wait(timeout)
656 return signaled
File /usr/lib/python3.12/threading.py:355, in Condition.wait(self, timeout)
353 try: # restore state no matter what (e.g., KeyboardInterrupt)
354 if timeout is None:
--> 355 waiter.acquire()
356 gotit = True
357 else:
KeyboardInterrupt:
----------------------------------------------------
The last trace suggests that the thread kept waiting and the lock was not acquired. Smaller training runs were completed without issue. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38039/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38039/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38038 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38038/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38038/comments | https://api.github.com/repos/huggingface/transformers/issues/38038/events | https://github.com/huggingface/transformers/pull/38038 | 3,051,641,934 | PR_kwDOCUB6oc6Vjx4x | 38,038 | Disable `Trigger CircleCI via GitHub Actions when `ready for review` | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-09T09:50:28 | 2025-05-09T10:27:55 | 2025-05-09T10:27:54 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38038",
"html_url": "https://github.com/huggingface/transformers/pull/38038",
"diff_url": "https://github.com/huggingface/transformers/pull/38038.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38038.patch",
"merged_at": "2025-05-09T10:27:54"
} | # What does this PR do?
In #37885, we use a Github Action workflow file to trigger CircleCI via API. We use `pull_requesttarget` event to allow the Github Actions workflow being triggered without an approval, but this leads to the triggered CircleCI checkout to the `main` branch, which is the wrong target to test against.
This PR disable that workflow for now so we don't get wrong and confusing CI results. I will think if we have other options.
| {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38038/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38038/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38037 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38037/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38037/comments | https://api.github.com/repos/huggingface/transformers/issues/38037/events | https://github.com/huggingface/transformers/pull/38037 | 3,051,341,451 | PR_kwDOCUB6oc6ViwvX | 38,037 | Fix an error in STOPPING_CRITERIA_INPUTS_DOCSTRING | {
"login": "bilibili12433014",
"id": 73748897,
"node_id": "MDQ6VXNlcjczNzQ4ODk3",
"avatar_url": "https://avatars.githubusercontent.com/u/73748897?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bilibili12433014",
"html_url": "https://github.com/bilibili12433014",
"followers_url": "https://api.github.com/users/bilibili12433014/followers",
"following_url": "https://api.github.com/users/bilibili12433014/following{/other_user}",
"gists_url": "https://api.github.com/users/bilibili12433014/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bilibili12433014/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bilibili12433014/subscriptions",
"organizations_url": "https://api.github.com/users/bilibili12433014/orgs",
"repos_url": "https://api.github.com/users/bilibili12433014/repos",
"events_url": "https://api.github.com/users/bilibili12433014/events{/privacy}",
"received_events_url": "https://api.github.com/users/bilibili12433014/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-09T07:57:25 | 2025-05-12T02:27:58 | 2025-05-12T02:27:40 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38037",
"html_url": "https://github.com/huggingface/transformers/pull/38037",
"diff_url": "https://github.com/huggingface/transformers/pull/38037.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38037.patch",
"merged_at": null
} | # What does this PR do?
There is an error in the comment `STOPPING_CRITERIA_INPUTS_DOCSTRING` for `StoppingCriteria`:
```
`True` indicates we should continue.
```
Should be
```
`False` indicates we should continue.
```
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@gante @stevhliu
| {
"login": "bilibili12433014",
"id": 73748897,
"node_id": "MDQ6VXNlcjczNzQ4ODk3",
"avatar_url": "https://avatars.githubusercontent.com/u/73748897?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bilibili12433014",
"html_url": "https://github.com/bilibili12433014",
"followers_url": "https://api.github.com/users/bilibili12433014/followers",
"following_url": "https://api.github.com/users/bilibili12433014/following{/other_user}",
"gists_url": "https://api.github.com/users/bilibili12433014/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bilibili12433014/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bilibili12433014/subscriptions",
"organizations_url": "https://api.github.com/users/bilibili12433014/orgs",
"repos_url": "https://api.github.com/users/bilibili12433014/repos",
"events_url": "https://api.github.com/users/bilibili12433014/events{/privacy}",
"received_events_url": "https://api.github.com/users/bilibili12433014/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38037/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38037/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38036 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38036/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38036/comments | https://api.github.com/repos/huggingface/transformers/issues/38036/events | https://github.com/huggingface/transformers/pull/38036 | 3,051,278,321 | PR_kwDOCUB6oc6VikJf | 38,036 | enable finegrained_fp8 and granite_speech cases on XPU | {
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-09T07:30:12 | 2025-05-14T23:02:42 | 2025-05-14T08:58:41 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38036",
"html_url": "https://github.com/huggingface/transformers/pull/38036",
"diff_url": "https://github.com/huggingface/transformers/pull/38036.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38036.patch",
"merged_at": "2025-05-14T08:58:40"
} | @ydshieh @IlyasMoutawwakil , pls help review, thx | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38036/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38036/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38035 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38035/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38035/comments | https://api.github.com/repos/huggingface/transformers/issues/38035/events | https://github.com/huggingface/transformers/issues/38035 | 3,051,150,804 | I_kwDOCUB6oc613N3U | 38,035 | Setting average_tokens_across_devices to True caused an error because it attempted to gather CPU tensors using NCCL. | {
"login": "Jintao-Huang",
"id": 45290347,
"node_id": "MDQ6VXNlcjQ1MjkwMzQ3",
"avatar_url": "https://avatars.githubusercontent.com/u/45290347?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Jintao-Huang",
"html_url": "https://github.com/Jintao-Huang",
"followers_url": "https://api.github.com/users/Jintao-Huang/followers",
"following_url": "https://api.github.com/users/Jintao-Huang/following{/other_user}",
"gists_url": "https://api.github.com/users/Jintao-Huang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Jintao-Huang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Jintao-Huang/subscriptions",
"organizations_url": "https://api.github.com/users/Jintao-Huang/orgs",
"repos_url": "https://api.github.com/users/Jintao-Huang/repos",
"events_url": "https://api.github.com/users/Jintao-Huang/events{/privacy}",
"received_events_url": "https://api.github.com/users/Jintao-Huang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-09T06:48:58 | 2025-05-18T08:40:17 | 2025-05-18T08:40:05 | CONTRIBUTOR | null | null | null | null | ERROR: type should be string, got "\n\nhttps://github.com/huggingface/transformers/blob/1dfad4beb2273b82e91c45a9cb2511028d4e0e24/src/transformers/trainer.py#L5288-L5297\n\n" | {
"login": "Jintao-Huang",
"id": 45290347,
"node_id": "MDQ6VXNlcjQ1MjkwMzQ3",
"avatar_url": "https://avatars.githubusercontent.com/u/45290347?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Jintao-Huang",
"html_url": "https://github.com/Jintao-Huang",
"followers_url": "https://api.github.com/users/Jintao-Huang/followers",
"following_url": "https://api.github.com/users/Jintao-Huang/following{/other_user}",
"gists_url": "https://api.github.com/users/Jintao-Huang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Jintao-Huang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Jintao-Huang/subscriptions",
"organizations_url": "https://api.github.com/users/Jintao-Huang/orgs",
"repos_url": "https://api.github.com/users/Jintao-Huang/repos",
"events_url": "https://api.github.com/users/Jintao-Huang/events{/privacy}",
"received_events_url": "https://api.github.com/users/Jintao-Huang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38035/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/38035/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38034 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38034/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38034/comments | https://api.github.com/repos/huggingface/transformers/issues/38034/events | https://github.com/huggingface/transformers/issues/38034 | 3,050,820,550 | I_kwDOCUB6oc6119PG | 38,034 | transformers require torch >= 2.1.0 to run fp8 model, but im using 2.7.0 | {
"login": "O5-7",
"id": 40710644,
"node_id": "MDQ6VXNlcjQwNzEwNjQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/40710644?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/O5-7",
"html_url": "https://github.com/O5-7",
"followers_url": "https://api.github.com/users/O5-7/followers",
"following_url": "https://api.github.com/users/O5-7/following{/other_user}",
"gists_url": "https://api.github.com/users/O5-7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/O5-7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/O5-7/subscriptions",
"organizations_url": "https://api.github.com/users/O5-7/orgs",
"repos_url": "https://api.github.com/users/O5-7/repos",
"events_url": "https://api.github.com/users/O5-7/events{/privacy}",
"received_events_url": "https://api.github.com/users/O5-7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-09T04:57:05 | 2025-05-12T15:39:41 | 2025-05-12T15:39:41 | NONE | null | null | null | null | ### System Info
python = 3.9
torch = 2.7.0+cu128
transformers = 4.51.3
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
download Qwen3-1.7B-FP8
run [quick start](https://huggingface.co/Qwen/Qwen3-1.7B-FP8#quickstart) with local model
the result is :
```
File "D:\anaconda3\lib\site-packages\transformers\models\auto\auto_factory.py", line 571, in from_pretrained
return model_class.from_pretrained(
File "D:\anaconda3\lib\site-packages\transformers\modeling_utils.py", line 279, in _wrapper
return func(*args, **kwargs)
File "D:\anaconda3\lib\site-packages\transformers\modeling_utils.py", line 4228, in from_pretrained
hf_quantizer.validate_environment(
File "D:\anaconda3\lib\site-packages\transformers\quantizers\quantizer_finegrained_fp8.py", line 36, in validate_environment
raise ImportError(
ImportError: Using fp8 quantization requires torch >= 2.1.0Please install the latest version of torch ( pip install --upgrade torch )
```
Im using torch=2.7.0 , I will try lower version lately
### Expected behavior
no error | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38034/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38034/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38033 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38033/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38033/comments | https://api.github.com/repos/huggingface/transformers/issues/38033/events | https://github.com/huggingface/transformers/issues/38033 | 3,050,714,135 | I_kwDOCUB6oc611jQX | 38,033 | RuntimeError when loading InternVL3-14B model: Embedding size mismatch | {
"login": "wkzcml-1",
"id": 92250356,
"node_id": "U_kgDOBX-g9A",
"avatar_url": "https://avatars.githubusercontent.com/u/92250356?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wkzcml-1",
"html_url": "https://github.com/wkzcml-1",
"followers_url": "https://api.github.com/users/wkzcml-1/followers",
"following_url": "https://api.github.com/users/wkzcml-1/following{/other_user}",
"gists_url": "https://api.github.com/users/wkzcml-1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wkzcml-1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wkzcml-1/subscriptions",
"organizations_url": "https://api.github.com/users/wkzcml-1/orgs",
"repos_url": "https://api.github.com/users/wkzcml-1/repos",
"events_url": "https://api.github.com/users/wkzcml-1/events{/privacy}",
"received_events_url": "https://api.github.com/users/wkzcml-1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-09T04:15:50 | 2025-05-12T03:37:19 | 2025-05-12T03:37:19 | NONE | null | null | null | null | ## Problem Description
When trying to load the [InternVL3-14B](https://huggingface.co/OpenGVLab/InternVL3-14B) model using the `transformers` library, I encountered the following error:
```
RuntimeError: Error(s) in loading state_dict for Embedding:
size mismatch for weight: copying a param with shape torch.Size([151674, 5120]) from checkpoint, the shape in current model is torch.Size([151936, 4096]).
```
## Additional Environment Information
### Transformers Package Details:
```
Name: transformers
Version: 4.52.0.dev0
Summary: State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
Home-page: https://github.com/huggingface/transformers
Author: The Hugging Face team (past and future) with the help of all our contributors
Author-email: transformers@huggingface.co
License: Apache 2.0 License
Location: /home/tiger/.local/lib/python3.11/site-packages
Requires: filelock, huggingface-hub, numpy, packaging, pyyaml, regex, requests, safetensors, tokenizers, tqdm
Required-by: peft
```
### Python version
```
Python 3.11.2
```
### Code Snippet Used
```python3
model = InternVLForConditionalGeneration.from_pretrained(
internvl3_14B_dir,
trust_remote_code=True
)
``` | {
"login": "wkzcml-1",
"id": 92250356,
"node_id": "U_kgDOBX-g9A",
"avatar_url": "https://avatars.githubusercontent.com/u/92250356?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wkzcml-1",
"html_url": "https://github.com/wkzcml-1",
"followers_url": "https://api.github.com/users/wkzcml-1/followers",
"following_url": "https://api.github.com/users/wkzcml-1/following{/other_user}",
"gists_url": "https://api.github.com/users/wkzcml-1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wkzcml-1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wkzcml-1/subscriptions",
"organizations_url": "https://api.github.com/users/wkzcml-1/orgs",
"repos_url": "https://api.github.com/users/wkzcml-1/repos",
"events_url": "https://api.github.com/users/wkzcml-1/events{/privacy}",
"received_events_url": "https://api.github.com/users/wkzcml-1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38033/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38033/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38032 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38032/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38032/comments | https://api.github.com/repos/huggingface/transformers/issues/38032/events | https://github.com/huggingface/transformers/issues/38032 | 3,050,443,945 | I_kwDOCUB6oc610hSp | 38,032 | Removing the modification of loss value due to rounding off to 4 digits | {
"login": "harish6696",
"id": 45898676,
"node_id": "MDQ6VXNlcjQ1ODk4Njc2",
"avatar_url": "https://avatars.githubusercontent.com/u/45898676?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/harish6696",
"html_url": "https://github.com/harish6696",
"followers_url": "https://api.github.com/users/harish6696/followers",
"following_url": "https://api.github.com/users/harish6696/following{/other_user}",
"gists_url": "https://api.github.com/users/harish6696/gists{/gist_id}",
"starred_url": "https://api.github.com/users/harish6696/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/harish6696/subscriptions",
"organizations_url": "https://api.github.com/users/harish6696/orgs",
"repos_url": "https://api.github.com/users/harish6696/repos",
"events_url": "https://api.github.com/users/harish6696/events{/privacy}",
"received_events_url": "https://api.github.com/users/harish6696/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-09T00:42:01 | 2025-07-12T08:03:18 | 2025-07-12T08:03:18 | NONE | null | null | null | null | ### System Info
transformers version: 4.50.2
python version: 3.13.1
### Who can help?
@zach-huggingface @SunMarc
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Inside the Trainer class, why is the loss rounded to 4 digits? I have applications where I am interested to see the loss go below 4 significant digits, but they all get rounded to 0. Please let the user set this rounding number, or let the loss be displayed in scientific notation like 1.xxxe^y
this is set inside def _maybe_log_save_evaluate()
logs["loss"] = round(tr_loss_scalar / (self.state.global_step - self._globalstep_last_logged), 4)
### Expected behavior
It would be great if this hardcoding of rounding off the "loss" is removed. Best output would be to remove the round function and log the loss value in scientific notation which is much clearer. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38032/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38032/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38031 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38031/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38031/comments | https://api.github.com/repos/huggingface/transformers/issues/38031/events | https://github.com/huggingface/transformers/pull/38031 | 3,050,399,932 | PR_kwDOCUB6oc6VgiuJ | 38,031 | Fix: Use deep copy for bbox_embed layers when decoder_bbox_embed_share is False | {
"login": "byteakp",
"id": 116947899,
"node_id": "U_kgDOBvh7uw",
"avatar_url": "https://avatars.githubusercontent.com/u/116947899?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/byteakp",
"html_url": "https://github.com/byteakp",
"followers_url": "https://api.github.com/users/byteakp/followers",
"following_url": "https://api.github.com/users/byteakp/following{/other_user}",
"gists_url": "https://api.github.com/users/byteakp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/byteakp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/byteakp/subscriptions",
"organizations_url": "https://api.github.com/users/byteakp/orgs",
"repos_url": "https://api.github.com/users/byteakp/repos",
"events_url": "https://api.github.com/users/byteakp/events{/privacy}",
"received_events_url": "https://api.github.com/users/byteakp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-08T23:57:35 | 2025-05-09T00:00:31 | 2025-05-09T00:00:31 | NONE | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38031",
"html_url": "https://github.com/huggingface/transformers/pull/38031",
"diff_url": "https://github.com/huggingface/transformers/pull/38031.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38031.patch",
"merged_at": null
} | # What does this PR do?
This PR fixes a bug in the `GroundingDino` model implementation related to how the `bbox_embed` layers are instantiated when `decoder_bbox_embed_share=False`.
In the previous code, the same `GroundingDinoMLPPredictionHead` instance was reused across decoder layers, resulting in a shallow copy. This behavior is unintended when `decoder_bbox_embed_share` is set to `False`, as each decoder layer should have its own unique prediction head.
The fix ensures a deep copy by creating a new instance of the `GroundingDinoMLPPredictionHead` for each decoder layer.
Fixes #37333
## Motivation
This bug prevents proper support for models like `LLMDet`, which rely on separate bbox prediction heads per decoder layer. Fixing this improves extensibility and aligns with expected behavior.
## Before submitting
- [x] I have read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request).
- [x] This PR was discussed in Issue #37333.
- [x] I’ve made the necessary code changes in `modeling_grounding_dino.py`.
- [x] This fix improves model compatibility with `LLMDet` variants.
## Who can review?
Vision models: @qubvel, @NielsRogge
| {
"login": "byteakp",
"id": 116947899,
"node_id": "U_kgDOBvh7uw",
"avatar_url": "https://avatars.githubusercontent.com/u/116947899?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/byteakp",
"html_url": "https://github.com/byteakp",
"followers_url": "https://api.github.com/users/byteakp/followers",
"following_url": "https://api.github.com/users/byteakp/following{/other_user}",
"gists_url": "https://api.github.com/users/byteakp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/byteakp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/byteakp/subscriptions",
"organizations_url": "https://api.github.com/users/byteakp/orgs",
"repos_url": "https://api.github.com/users/byteakp/repos",
"events_url": "https://api.github.com/users/byteakp/events{/privacy}",
"received_events_url": "https://api.github.com/users/byteakp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38031/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38031/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38030 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38030/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38030/comments | https://api.github.com/repos/huggingface/transformers/issues/38030/events | https://github.com/huggingface/transformers/pull/38030 | 3,050,338,132 | PR_kwDOCUB6oc6VgWtj | 38,030 | Add `TemplateConstraint` and `OrdredConstraint` features (#27706) | {
"login": "jychen630",
"id": 36392136,
"node_id": "MDQ6VXNlcjM2MzkyMTM2",
"avatar_url": "https://avatars.githubusercontent.com/u/36392136?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jychen630",
"html_url": "https://github.com/jychen630",
"followers_url": "https://api.github.com/users/jychen630/followers",
"following_url": "https://api.github.com/users/jychen630/following{/other_user}",
"gists_url": "https://api.github.com/users/jychen630/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jychen630/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jychen630/subscriptions",
"organizations_url": "https://api.github.com/users/jychen630/orgs",
"repos_url": "https://api.github.com/users/jychen630/repos",
"events_url": "https://api.github.com/users/jychen630/events{/privacy}",
"received_events_url": "https://api.github.com/users/jychen630/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-05-08T23:20:24 | 2025-05-13T13:10:43 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38030",
"html_url": "https://github.com/huggingface/transformers/pull/38030",
"diff_url": "https://github.com/huggingface/transformers/pull/38030.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38030.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
According to issue #27706, `TemplateConstraint` is a useful constraint during beam search for constrained decoding. It's desired by the community. In response to this feature request, we implemented the `TemplateConstraint and OrderedConstraint` classes in `src/transformers/generation/beam_constraints.py`. We integrated the constraint workflow with `model.generate()` via corresponding logits processors in `src/transformers/generations/logits_process.py` and test cases in `tests/generation/test_beam_constraints.py`.
Contributors:
Junyao (@jychen630), Raavi (@raavi02) and Pranitha (@NPranitha)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
Hello! We noticed @ArthurZucker and @gante 's discussion around template constraints in the issue #27706 - may you please kindly review this a bit? Thanks!
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38030/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38030/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/38029 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38029/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38029/comments | https://api.github.com/repos/huggingface/transformers/issues/38029/events | https://github.com/huggingface/transformers/pull/38029 | 3,050,246,743 | PR_kwDOCUB6oc6VgHP5 | 38,029 | Update Loss Functions to Accept Tensor num_items_in_batch | {
"login": "NEREUScode",
"id": 174478950,
"node_id": "U_kgDOCmZWZg",
"avatar_url": "https://avatars.githubusercontent.com/u/174478950?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NEREUScode",
"html_url": "https://github.com/NEREUScode",
"followers_url": "https://api.github.com/users/NEREUScode/followers",
"following_url": "https://api.github.com/users/NEREUScode/following{/other_user}",
"gists_url": "https://api.github.com/users/NEREUScode/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NEREUScode/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NEREUScode/subscriptions",
"organizations_url": "https://api.github.com/users/NEREUScode/orgs",
"repos_url": "https://api.github.com/users/NEREUScode/repos",
"events_url": "https://api.github.com/users/NEREUScode/events{/privacy}",
"received_events_url": "https://api.github.com/users/NEREUScode/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-08T22:46:56 | 2025-06-02T19:04:39 | 2025-06-02T09:31:44 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38029",
"html_url": "https://github.com/huggingface/transformers/pull/38029",
"diff_url": "https://github.com/huggingface/transformers/pull/38029.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38029.patch",
"merged_at": "2025-06-02T09:31:44"
} | This PR updates the ForCausalLMLoss and ForMaskedLMLoss functions to handle num_items_in_batch as a torch.Tensor instead of an int, aligning with how the parameter is actually passed during training.
Changes:
Updated num_items_in_batch: Optional[int] → Optional[torch.Tensor] in both functions.
Applied .item() conversion when computing loss to retain compatibility with fixed_cross_entropy.
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38029/reactions",
"total_count": 3,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 2,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38029/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38028 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38028/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38028/comments | https://api.github.com/repos/huggingface/transformers/issues/38028/events | https://github.com/huggingface/transformers/issues/38028 | 3,050,087,597 | I_kwDOCUB6oc61zKSt | 38,028 | bug in new prefill_chunk_size implementation | {
"login": "SmerkyG",
"id": 8826350,
"node_id": "MDQ6VXNlcjg4MjYzNTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8826350?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SmerkyG",
"html_url": "https://github.com/SmerkyG",
"followers_url": "https://api.github.com/users/SmerkyG/followers",
"following_url": "https://api.github.com/users/SmerkyG/following{/other_user}",
"gists_url": "https://api.github.com/users/SmerkyG/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SmerkyG/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SmerkyG/subscriptions",
"organizations_url": "https://api.github.com/users/SmerkyG/orgs",
"repos_url": "https://api.github.com/users/SmerkyG/repos",
"events_url": "https://api.github.com/users/SmerkyG/events{/privacy}",
"received_events_url": "https://api.github.com/users/SmerkyG/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-08T21:04:35 | 2025-05-31T15:43:03 | 2025-05-31T15:43:01 | NONE | null | null | null | null | ### System Info
Hi, I noticed a bug in the new chunked prefill code: the implementation does not check whether or not the forward method is compilable, as is done elsewhere in the code.
Specifically, https://github.com/huggingface/transformers/blob/d231f5a7d4d110fffe91ba31b4995c8574036c14/src/transformers/generation/utils.py#L4910
should include compilability checking code like found at https://github.com/huggingface/transformers/blob/d231f5a7d4d110fffe91ba31b4995c8574036c14/src/transformers/generation/utils.py#L3437
This error was encountered on Version: 4.51.3 but the code links above reference the main branch. Thanks!
### Who can help?
@gante
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Any model that should not be compiled will be compiled when using GenerationConfig that contains prefill_chunk_size > 0, even if using disable_compile=True
### Expected behavior
disable_compile=True should cause the model NOT to get compiled, even when using prefill_chunk_size | {
"login": "SmerkyG",
"id": 8826350,
"node_id": "MDQ6VXNlcjg4MjYzNTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8826350?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SmerkyG",
"html_url": "https://github.com/SmerkyG",
"followers_url": "https://api.github.com/users/SmerkyG/followers",
"following_url": "https://api.github.com/users/SmerkyG/following{/other_user}",
"gists_url": "https://api.github.com/users/SmerkyG/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SmerkyG/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SmerkyG/subscriptions",
"organizations_url": "https://api.github.com/users/SmerkyG/orgs",
"repos_url": "https://api.github.com/users/SmerkyG/repos",
"events_url": "https://api.github.com/users/SmerkyG/events{/privacy}",
"received_events_url": "https://api.github.com/users/SmerkyG/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38028/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38028/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38027 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38027/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38027/comments | https://api.github.com/repos/huggingface/transformers/issues/38027/events | https://github.com/huggingface/transformers/issues/38027 | 3,049,598,465 | I_kwDOCUB6oc61xS4B | 38,027 | TimeSformer assumes a fixed number of frames in its layers even though it interpolates temporal embeddings based on the input | {
"login": "kamila-chay",
"id": 201148875,
"node_id": "U_kgDOC_1Jyw",
"avatar_url": "https://avatars.githubusercontent.com/u/201148875?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kamila-chay",
"html_url": "https://github.com/kamila-chay",
"followers_url": "https://api.github.com/users/kamila-chay/followers",
"following_url": "https://api.github.com/users/kamila-chay/following{/other_user}",
"gists_url": "https://api.github.com/users/kamila-chay/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kamila-chay/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kamila-chay/subscriptions",
"organizations_url": "https://api.github.com/users/kamila-chay/orgs",
"repos_url": "https://api.github.com/users/kamila-chay/repos",
"events_url": "https://api.github.com/users/kamila-chay/events{/privacy}",
"received_events_url": "https://api.github.com/users/kamila-chay/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-08T17:02:12 | 2025-06-16T08:02:29 | 2025-06-16T08:02:29 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.52.0.dev0
- Platform: Linux-6.11.0-24-generic-x86_64-with-glibc2.39
- Python version: 3.12.3
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.5.1+cu121 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
- Using GPU in script?: yes
- GPU type: NVIDIA GeForce RTX 4070 Laptop GPU
### Who can help?
@amyeroberts @qubvel
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
I created a short Jupyter notebook: [Collab](https://colab.research.google.com/drive/1vTp8p2RMGcv5YmcZoljgpEWCiqFl4vmX?usp=sharing)
### Expected behavior
**Timesformer should infer the number of frames dynamically instead of relying on the config, while continuing to infer the image size from config values.**
While it's reasonable for the spatial dimensions (like image_size) to be fixed and defined in the config — since they don't typically vary within a single video — temporal length often does, especially in setups using staggered windows to process long videos at high temporal density.
In these cases, it's more practical if the model dynamically infers the number of frames from the input shape, rather than requiring every chunk to match the fixed num_frames set in the config. This would simplify usage and reduce unnecessary edge-case handling downstream.
[EDIT] This issue also arises when using pretrained temporal embeddings that need to be interpolated to accommodate longer sequences. While the interpolation itself proceeds without error, the attention modules can either fail outright or—more subtly—rely on incorrect attention patterns. Notably, this isn't limited to staggered window configurations; it can occur more broadly whenever temporal dimensions are extended beyond the pretrained length.
`
def forward(self, hidden_states: torch.Tensor, output_attentions: bool = False):
num_frames = self.config.num_frames
num_patch_width = self.config.image_size // self.config.patch_size
batch_size = hidden_states.shape[0]
num_spatial_tokens = (hidden_states.size(1) - 1) // num_frames
num_patch_height = num_spatial_tokens // num_patch_width`
I believe this would make the model more flexible and robust in practice.
Let me know if this direction seems reasonable — I’m happy to open a PR with a proposed fix (also open to creating a custom flag in the config in order to not break BC). | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38027/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38027/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38026 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38026/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38026/comments | https://api.github.com/repos/huggingface/transformers/issues/38026/events | https://github.com/huggingface/transformers/pull/38026 | 3,049,367,030 | PR_kwDOCUB6oc6VdHch | 38,026 | Pass `eps` to `Mistral3RMSNorm` | {
"login": "sergiopaniego",
"id": 17179696,
"node_id": "MDQ6VXNlcjE3MTc5Njk2",
"avatar_url": "https://avatars.githubusercontent.com/u/17179696?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sergiopaniego",
"html_url": "https://github.com/sergiopaniego",
"followers_url": "https://api.github.com/users/sergiopaniego/followers",
"following_url": "https://api.github.com/users/sergiopaniego/following{/other_user}",
"gists_url": "https://api.github.com/users/sergiopaniego/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sergiopaniego/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sergiopaniego/subscriptions",
"organizations_url": "https://api.github.com/users/sergiopaniego/orgs",
"repos_url": "https://api.github.com/users/sergiopaniego/repos",
"events_url": "https://api.github.com/users/sergiopaniego/events{/privacy}",
"received_events_url": "https://api.github.com/users/sergiopaniego/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-08T15:28:54 | 2025-05-19T13:09:44 | 2025-05-19T13:09:26 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38026",
"html_url": "https://github.com/huggingface/transformers/pull/38026",
"diff_url": "https://github.com/huggingface/transformers/pull/38026.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38026.patch",
"merged_at": "2025-05-19T13:09:25"
} | # What does this PR do?
Fixes #38025
This solution is part of the investigation detailed [here](https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Instruct-2503/discussions/74)
@qubvel
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38026/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38026/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/38025 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38025/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38025/comments | https://api.github.com/repos/huggingface/transformers/issues/38025/events | https://github.com/huggingface/transformers/issues/38025 | 3,049,356,202 | I_kwDOCUB6oc61wXuq | 38,025 | `eps` is not passed in `Mistral3RMSNorm` | {
"login": "sergiopaniego",
"id": 17179696,
"node_id": "MDQ6VXNlcjE3MTc5Njk2",
"avatar_url": "https://avatars.githubusercontent.com/u/17179696?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sergiopaniego",
"html_url": "https://github.com/sergiopaniego",
"followers_url": "https://api.github.com/users/sergiopaniego/followers",
"following_url": "https://api.github.com/users/sergiopaniego/following{/other_user}",
"gists_url": "https://api.github.com/users/sergiopaniego/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sergiopaniego/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sergiopaniego/subscriptions",
"organizations_url": "https://api.github.com/users/sergiopaniego/orgs",
"repos_url": "https://api.github.com/users/sergiopaniego/repos",
"events_url": "https://api.github.com/users/sergiopaniego/events{/privacy}",
"received_events_url": "https://api.github.com/users/sergiopaniego/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-08T15:24:30 | 2025-05-19T13:09:27 | 2025-05-19T13:09:27 | MEMBER | null | null | null | null | The class `Mistral3RMSNorm` can receive `eps`:
https://github.com/huggingface/transformers/blob/f2909e024cca9b7846030a91d7d4cafbf0f6208d/src/transformers/models/mistral3/modeling_mistral3.py#L49
but when used in the creation of the Mistral3 model, it is not actually passed
https://github.com/huggingface/transformers/blob/f2909e024cca9b7846030a91d7d4cafbf0f6208d/src/transformers/models/mistral3/modeling_mistral3.py#L109
It could be solved as follows:
```
self.norm = Mistral3RMSNorm(config.vision_config.hidden_size, eps=config.text_config.rms_norm_eps)
``` | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38025/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38025/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38024 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38024/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38024/comments | https://api.github.com/repos/huggingface/transformers/issues/38024/events | https://github.com/huggingface/transformers/issues/38024 | 3,049,277,563 | I_kwDOCUB6oc61wEh7 | 38,024 | while using trainer to train mnist model, 'ValueError: Found input variables with inconsistent numbers of samples: [10000, 8750]' | {
"login": "HaoyaWHL",
"id": 20349520,
"node_id": "MDQ6VXNlcjIwMzQ5NTIw",
"avatar_url": "https://avatars.githubusercontent.com/u/20349520?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HaoyaWHL",
"html_url": "https://github.com/HaoyaWHL",
"followers_url": "https://api.github.com/users/HaoyaWHL/followers",
"following_url": "https://api.github.com/users/HaoyaWHL/following{/other_user}",
"gists_url": "https://api.github.com/users/HaoyaWHL/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HaoyaWHL/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HaoyaWHL/subscriptions",
"organizations_url": "https://api.github.com/users/HaoyaWHL/orgs",
"repos_url": "https://api.github.com/users/HaoyaWHL/repos",
"events_url": "https://api.github.com/users/HaoyaWHL/events{/privacy}",
"received_events_url": "https://api.github.com/users/HaoyaWHL/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-05-08T15:00:00 | 2025-05-14T02:41:57 | 2025-05-14T02:41:57 | NONE | null | null | null | null | It was the problem with the data processing during the training process that led to the inconsistent shapes. | {
"login": "HaoyaWHL",
"id": 20349520,
"node_id": "MDQ6VXNlcjIwMzQ5NTIw",
"avatar_url": "https://avatars.githubusercontent.com/u/20349520?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HaoyaWHL",
"html_url": "https://github.com/HaoyaWHL",
"followers_url": "https://api.github.com/users/HaoyaWHL/followers",
"following_url": "https://api.github.com/users/HaoyaWHL/following{/other_user}",
"gists_url": "https://api.github.com/users/HaoyaWHL/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HaoyaWHL/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HaoyaWHL/subscriptions",
"organizations_url": "https://api.github.com/users/HaoyaWHL/orgs",
"repos_url": "https://api.github.com/users/HaoyaWHL/repos",
"events_url": "https://api.github.com/users/HaoyaWHL/events{/privacy}",
"received_events_url": "https://api.github.com/users/HaoyaWHL/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38024/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38024/timeline | null | not_planned | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/38023 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/38023/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/38023/comments | https://api.github.com/repos/huggingface/transformers/issues/38023/events | https://github.com/huggingface/transformers/pull/38023 | 3,049,151,697 | PR_kwDOCUB6oc6VcYKU | 38,023 | [ESM] Add flash-attention-2 backend for ESM-2 | {
"login": "pstjohn",
"id": 2576846,
"node_id": "MDQ6VXNlcjI1NzY4NDY=",
"avatar_url": "https://avatars.githubusercontent.com/u/2576846?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pstjohn",
"html_url": "https://github.com/pstjohn",
"followers_url": "https://api.github.com/users/pstjohn/followers",
"following_url": "https://api.github.com/users/pstjohn/following{/other_user}",
"gists_url": "https://api.github.com/users/pstjohn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pstjohn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pstjohn/subscriptions",
"organizations_url": "https://api.github.com/users/pstjohn/orgs",
"repos_url": "https://api.github.com/users/pstjohn/repos",
"events_url": "https://api.github.com/users/pstjohn/events{/privacy}",
"received_events_url": "https://api.github.com/users/pstjohn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-05-08T14:16:36 | 2025-05-16T13:15:34 | 2025-05-16T13:11:56 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/38023",
"html_url": "https://github.com/huggingface/transformers/pull/38023",
"diff_url": "https://github.com/huggingface/transformers/pull/38023.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/38023.patch",
"merged_at": "2025-05-16T13:11:56"
} | # What does this PR do?
Implements the flash-attention-2 attention backend for ESM-2.
ESM-2 is a BERT-based protein language model from facebook research. While the model is undoubtedly a bit old at this point - there are now ESM-3 and ESM-C models available, they remain very popular on the huggingface hub with ESM-2 3B having nearly 6M downloads last month. Despite this, there are no efficient attention backends for ESM-2 available via huggingface. There have been various drop-in replacements for ESM-2 that have been created to fill this need ([FastESM](https://huggingface.co/Synthyra/FastESM2_650), [FAPLM](https://github.com/pengzhangzhi/faplm)), but a native transformers implementation should speed up a lot of workflows that currently rely on the native pytorch attention.
Addresses #26350
Related to this PR; it would be great if we could merge some of the remaining automated `safetensor` conversions for the ESM models on huggingface hub, I noticed @Rocketknight1 merged some for the other model scales previously.
* [ESM-2 3B](https://huggingface.co/facebook/esm2_t36_3B_UR50D/discussions/18)
* [ESM-2 15B](https://huggingface.co/facebook/esm2_t48_15B_UR50D/discussions/1)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/38023/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/38023/timeline | null | null | null | null | true | true |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.