url string | repository_url string | labels_url string | comments_url string | events_url string | html_url string | id int64 | node_id string | number int64 | title string | user dict | labels list | state string | locked bool | assignee dict | assignees list | milestone null | comments list | created_at timestamp[ms] | updated_at timestamp[ms] | closed_at timestamp[ms] | author_association string | type dict | active_lock_reason null | draft bool | pull_request dict | body string | closed_by dict | reactions dict | timeline_url string | performed_via_github_app null | state_reason string | sub_issues_summary dict | issue_dependencies_summary dict | is_pull_request bool | is_closed bool |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/huggingface/transformers/issues/40139 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40139/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40139/comments | https://api.github.com/repos/huggingface/transformers/issues/40139/events | https://github.com/huggingface/transformers/pull/40139 | 3,319,311,819 | PR_kwDOCUB6oc6jiAtY | 40,139 | gpt oss is important | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-13T17:44:14 | 2025-08-13T17:57:45 | 2025-08-13T17:49:54 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40139",
"html_url": "https://github.com/huggingface/transformers/pull/40139",
"diff_url": "https://github.com/huggingface/transformers/pull/40139.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40139.patch",
"merged_at": "2025-08-13T17:49:54"
} | # What does this PR do?
Make sure we run the slow tests for it | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40139/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40139/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40138 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40138/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40138/comments | https://api.github.com/repos/huggingface/transformers/issues/40138/events | https://github.com/huggingface/transformers/pull/40138 | 3,319,278,849 | PR_kwDOCUB6oc6jh55e | 40,138 | [docs] Fix ko toctree | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-13T17:31:07 | 2025-08-13T18:25:00 | 2025-08-13T18:24:59 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40138",
"html_url": "https://github.com/huggingface/transformers/pull/40138",
"diff_url": "https://github.com/huggingface/transformers/pull/40138.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40138.patch",
"merged_at": "2025-08-13T18:24:58"
} | Fixes `grounding_dino.md` to `grounding-dino.md` introduced in https://github.com/huggingface/transformers/pull/39861 | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40138/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40138/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40137 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40137/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40137/comments | https://api.github.com/repos/huggingface/transformers/issues/40137/events | https://github.com/huggingface/transformers/pull/40137 | 3,319,009,740 | PR_kwDOCUB6oc6jhBjD | 40,137 | 🚨🚨 Switch default compilation to fullgraph=False | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-13T16:01:10 | 2025-08-20T09:41:39 | 2025-08-19T09:26:22 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40137",
"html_url": "https://github.com/huggingface/transformers/pull/40137",
"diff_url": "https://github.com/huggingface/transformers/pull/40137.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40137.patch",
"merged_at": "2025-08-19T09:26:22"
} | # What does this PR do?
As we sometimes discussed offline: graph breaks are in general not a problem, and requiring fullgraph=True is too restrictive!
This would also allow us to use `torch.compiler.disable` where needed, to skip larger regions that are problematic and not worth compiling | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40137/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40137/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40136 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40136/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40136/comments | https://api.github.com/repos/huggingface/transformers/issues/40136/events | https://github.com/huggingface/transformers/issues/40136 | 3,318,950,137 | I_kwDOCUB6oc7F0yj5 | 40,136 | Qwen2.5-VL-7B-Instruct: Significant accuracy regression on MMMU benchmark with transformers >=4.54.0 | {
"login": "rahul-tuli",
"id": 25380596,
"node_id": "MDQ6VXNlcjI1MzgwNTk2",
"avatar_url": "https://avatars.githubusercontent.com/u/25380596?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rahul-tuli",
"html_url": "https://github.com/rahul-tuli",
"followers_url": "https://api.github.com/users/rahul-tuli/followers",
"following_url": "https://api.github.com/users/rahul-tuli/following{/other_user}",
"gists_url": "https://api.github.com/users/rahul-tuli/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rahul-tuli/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rahul-tuli/subscriptions",
"organizations_url": "https://api.github.com/users/rahul-tuli/orgs",
"repos_url": "https://api.github.com/users/rahul-tuli/repos",
"events_url": "https://api.github.com/users/rahul-tuli/events{/privacy}",
"received_events_url": "https://api.github.com/users/rahul-tuli/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-13T15:42:14 | 2025-09-26T13:42:21 | 2025-09-01T09:10:42 | CONTRIBUTOR | null | null | null | null | ### System Info
## Description
We've observed a significant accuracy regression when evaluating Qwen2.5-VL-7B-Instruct on the MMMU Literature benchmark after upgrading from transformers 4.53.3 to 4.54.0. The accuracy drops from **86.67%** to **73.33%**, representing a **13.34 percentage point decrease** (~15.4% relative regression).
## Environment
- **Model**: Qwen/Qwen2.5-VL-7B-Instruct
- **Evaluation framework**: lm-eval
- **Task**: mmmu_val_literature (30 samples)
- **Hardware**: CUDA-enabled GPU
- **Python**: 3.10
- **Other dependencies**: torch, torchvision, accelerate, Pillow
### Transformers 4.53.3 Output
```
hf-multimodal (pretrained=Qwen/Qwen2.5-VL-7B-Instruct,dtype=bfloat16,add_bos_token=True,convert_img_format=True), gen_kwargs: (None), limit: None, num_fewshot: 0, batch_size: 8
| Tasks |Version|Filter|n-shot|Metric| |Value | |Stderr|
|----------|------:|------|-----:|------|---|-----:|---|-----:|
|Literature| 0|none | 0|acc |↑ |0.8667|± |0.0631|
```
### Transformers 4.54.0 Output
```
hf-multimodal (pretrained=Qwen/Qwen2.5-VL-7B-Instruct,dtype=bfloat16,add_bos_token=True,convert_img_format=True), gen_kwargs: (None), limit: None, num_fewshot: 0, batch_size: 8
| Tasks |Version|Filter|n-shot|Metric| |Value | |Stderr|
|----------|------:|------|-----:|------|---|-----:|---|-----:|
|Literature| 0|none | 0|acc |↑ |0.7333|± |0.0821|
```
## Summary
| Version | Accuracy | Stderr | Difference from baseline |
|---------|----------|--------|-------------------------|
| 4.53.3 | 0.8667 | ±0.0631| - |
| 4.54.0 | 0.7333 | ±0.0821| -0.1334 (-15.4%) |
## Possible causes
The regression may be related to:
- Changes in the default image processor (slow → fast)
- Changes in the vision model loading mechanism
- Modifications to the multimodal pipeline
## Impact
This regression affects production systems using Qwen2.5-VL models for visual question answering and multimodal understanding tasks. Users upgrading to 4.54.0 may experience unexpected accuracy drops.
## Additional context
- The evaluation uses a fixed random seed for reproducibility
- Both tests were run on the same hardware with identical configurations
- The regression is statistically significant given the standard errors
## Steps to Reproduce
### 1. Install dependencies
```bash
pip install lm-eval torch torchvision accelerate Pillow
```
### 2. Test with transformers 4.53.3 (baseline)
```bash
pip install transformers==4.53.3
lm_eval \
--model hf-multimodal \
--model_args "pretrained=Qwen/Qwen2.5-VL-7B-Instruct,dtype=bfloat16,add_bos_token=True,convert_img_format=True" \
--tasks mmmu_val_literature \
--num_fewshot 0 \
--batch_size 8 \
--verbosity INFO
```
### 3. Test with transformers 4.54.0
```bash
pip install transformers==4.54.0
lm_eval \
--model hf-multimodal \
--model_args "pretrained=Qwen/Qwen2.5-VL-7B-Instruct,dtype=bfloat16,add_bos_token=True,convert_img_format=True" \
--tasks mmmu_val_literature \
--num_fewshot 0 \
--batch_size 8 \
--verbosity INFO
```
## Expected behavior
The model accuracy should remain consistent or improve. A 15% relative performance regression on a standard benchmark is unexpected and concerning. | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40136/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40136/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40135 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40135/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40135/comments | https://api.github.com/repos/huggingface/transformers/issues/40135/events | https://github.com/huggingface/transformers/pull/40135 | 3,318,906,227 | PR_kwDOCUB6oc6jgrYm | 40,135 | 🚨🚨 [generate] ignore `cache_implementation="hybrid"` hub defaults | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-13T15:27:52 | 2025-08-13T15:59:36 | 2025-08-13T15:57:41 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40135",
"html_url": "https://github.com/huggingface/transformers/pull/40135",
"diff_url": "https://github.com/huggingface/transformers/pull/40135.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40135.patch",
"merged_at": "2025-08-13T15:57:41"
} | # What does this PR do?
Follow-up to #40039 -- on models where the hub checkpoint specifies `cache_implementation="hybrid"` (static sliding window hybrid cache), UNSETS this value. This will make the model use the dynamic sliding window layers by default.
Any user-defined `cache_implementation="hybrid"` will apply the static sliding window hybrid cache.
✅ with this PR, we no longer suffer from super slow 1st `generate` calls on models with hybrid caches. See snippet below.
🚨🚨
- BC breaking: the cache returned by `generate` calls now will have `DynamicSlidingWindowLayer` layers by default (as opposed to `SlidingWindowLayer`, its static-shaped counterpart).
- Tests: we might have failing slow tests after merging this PR. Tests using checkpoints with `cache_implementation="hybrid"` may have small differences in their `generate` calls.
___________________________________
Time of 1st `generate` call test snippet:
```py
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
from time import time
model_id = "google/gemma-3-1b-it"
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype=torch.bfloat16)
tokenizer = AutoTokenizer.from_pretrained(model_id)
prompt = "What is the capital of France?"
model_inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
torch.cuda.synchronize()
start = time()
generate_outputs = model.generate(
**model_inputs, max_new_tokens=100, do_sample=False, return_dict_in_generate=True
)
torch.cuda.synchronize()
end = time()
print(f"Time taken: {end - start} seconds (new)")
# time on RTX 4090: 0.79s
torch.cuda.synchronize()
start = time()
generate_outputs = model.generate(
**model_inputs, max_new_tokens=100, do_sample=False, return_dict_in_generate=True, cache_implementation="hybrid"
)
torch.cuda.synchronize()
end = time()
print(f"Time taken: {end - start} seconds (old)")
# time on RTX 4090: 21.62s
``` | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40135/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40135/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40134 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40134/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40134/comments | https://api.github.com/repos/huggingface/transformers/issues/40134/events | https://github.com/huggingface/transformers/pull/40134 | 3,318,865,233 | PR_kwDOCUB6oc6jgipf | 40,134 | Create self-scheduled-amd-mi355-caller.yml | {
"login": "glegendre01",
"id": 115986922,
"node_id": "U_kgDOBunR6g",
"avatar_url": "https://avatars.githubusercontent.com/u/115986922?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/glegendre01",
"html_url": "https://github.com/glegendre01",
"followers_url": "https://api.github.com/users/glegendre01/followers",
"following_url": "https://api.github.com/users/glegendre01/following{/other_user}",
"gists_url": "https://api.github.com/users/glegendre01/gists{/gist_id}",
"starred_url": "https://api.github.com/users/glegendre01/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/glegendre01/subscriptions",
"organizations_url": "https://api.github.com/users/glegendre01/orgs",
"repos_url": "https://api.github.com/users/glegendre01/repos",
"events_url": "https://api.github.com/users/glegendre01/events{/privacy}",
"received_events_url": "https://api.github.com/users/glegendre01/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-13T15:15:48 | 2025-08-14T05:19:13 | 2025-08-13T23:33:45 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40134",
"html_url": "https://github.com/huggingface/transformers/pull/40134",
"diff_url": "https://github.com/huggingface/transformers/pull/40134.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40134.patch",
"merged_at": "2025-08-13T23:33:45"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40134/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40134/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40132 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40132/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40132/comments | https://api.github.com/repos/huggingface/transformers/issues/40132/events | https://github.com/huggingface/transformers/pull/40132 | 3,318,321,342 | PR_kwDOCUB6oc6jetzV | 40,132 | MoE + vllm = 😻 | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 7510456769,
"node_id": "LA_kwDOCUB6oc8AAAABv6h5wQ",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Mixture%20of%20Experts",
"name": "Mixture of Experts",
"color": "DDB5D0",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-08-13T12:42:51 | 2025-10-02T10:12:46 | 2025-10-02T10:12:45 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40132",
"html_url": "https://github.com/huggingface/transformers/pull/40132",
"diff_url": "https://github.com/huggingface/transformers/pull/40132.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40132.patch",
"merged_at": "2025-10-02T10:12:45"
} | # What does this PR do?
Starting with vllm, but also towards better training
TODO:
- DBRX, GPTSAN, ERNIE4.5, Jamba (that was done by AI I believe), Minmax same, NllbMoe, PhiMoe (AI based), switch and vitpose)
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40132/reactions",
"total_count": 10,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 10,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40132/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40131 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40131/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40131/comments | https://api.github.com/repos/huggingface/transformers/issues/40131/events | https://github.com/huggingface/transformers/pull/40131 | 3,318,139,874 | PR_kwDOCUB6oc6jeGHA | 40,131 | add missing Arabic translations | {
"login": "genmnz",
"id": 109829271,
"node_id": "U_kgDOBovclw",
"avatar_url": "https://avatars.githubusercontent.com/u/109829271?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/genmnz",
"html_url": "https://github.com/genmnz",
"followers_url": "https://api.github.com/users/genmnz/followers",
"following_url": "https://api.github.com/users/genmnz/following{/other_user}",
"gists_url": "https://api.github.com/users/genmnz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/genmnz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/genmnz/subscriptions",
"organizations_url": "https://api.github.com/users/genmnz/orgs",
"repos_url": "https://api.github.com/users/genmnz/repos",
"events_url": "https://api.github.com/users/genmnz/events{/privacy}",
"received_events_url": "https://api.github.com/users/genmnz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-13T11:49:09 | 2025-08-15T06:00:37 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40131",
"html_url": "https://github.com/huggingface/transformers/pull/40131",
"diff_url": "https://github.com/huggingface/transformers/pull/40131.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40131.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40131/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40131/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40130 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40130/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40130/comments | https://api.github.com/repos/huggingface/transformers/issues/40130/events | https://github.com/huggingface/transformers/pull/40130 | 3,318,106,120 | PR_kwDOCUB6oc6jd-nZ | 40,130 | fix error vocab_size at Qwen2_5_VLForConditionalGeneration loss_function | {
"login": "killight98",
"id": 32491510,
"node_id": "MDQ6VXNlcjMyNDkxNTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/32491510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/killight98",
"html_url": "https://github.com/killight98",
"followers_url": "https://api.github.com/users/killight98/followers",
"following_url": "https://api.github.com/users/killight98/following{/other_user}",
"gists_url": "https://api.github.com/users/killight98/gists{/gist_id}",
"starred_url": "https://api.github.com/users/killight98/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/killight98/subscriptions",
"organizations_url": "https://api.github.com/users/killight98/orgs",
"repos_url": "https://api.github.com/users/killight98/repos",
"events_url": "https://api.github.com/users/killight98/events{/privacy}",
"received_events_url": "https://api.github.com/users/killight98/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-13T11:39:20 | 2025-08-18T08:59:26 | 2025-08-18T08:59:25 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40130",
"html_url": "https://github.com/huggingface/transformers/pull/40130",
"diff_url": "https://github.com/huggingface/transformers/pull/40130.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40130.patch",
"merged_at": "2025-08-18T08:59:25"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Based on the changes in PR #37033, `vocab_size` should be obtained from `self.config.text_config`, not from `self.config`, when `Qwen2_5_VLForConditionalGeneration` calls its `loss_function`.
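A minimal sketch of the attribute lookup this PR changes (the dummy config classes below are illustrative stand-ins with a made-up vocabulary size, not the real Transformers configs):

```python
# Illustrative stand-ins for the composite multimodal config layout.
class TextConfig:
    vocab_size = 152064  # hypothetical value

class Qwen25VLStyleConfig:
    text_config = TextConfig()
    # The composite config no longer exposes a top-level `vocab_size`,
    # so `self.config.vocab_size` is the wrong lookup.

config = Qwen25VLStyleConfig()

# Wrong (pre-fix): config.vocab_size  -> AttributeError
# Right (this PR): read it from the nested text config.
vocab_size = config.text_config.vocab_size
print(vocab_size)  # -> 152064
```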
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40130/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40130/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40129 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40129/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40129/comments | https://api.github.com/repos/huggingface/transformers/issues/40129/events | https://github.com/huggingface/transformers/pull/40129 | 3,318,046,249 | PR_kwDOCUB6oc6jdxXT | 40,129 | docs: Update LayoutLM model card according to new standardized format | {
"login": "Jin-HoMLee",
"id": 123657753,
"node_id": "U_kgDOB17eGQ",
"avatar_url": "https://avatars.githubusercontent.com/u/123657753?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Jin-HoMLee",
"html_url": "https://github.com/Jin-HoMLee",
"followers_url": "https://api.github.com/users/Jin-HoMLee/followers",
"following_url": "https://api.github.com/users/Jin-HoMLee/following{/other_user}",
"gists_url": "https://api.github.com/users/Jin-HoMLee/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Jin-HoMLee/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Jin-HoMLee/subscriptions",
"organizations_url": "https://api.github.com/users/Jin-HoMLee/orgs",
"repos_url": "https://api.github.com/users/Jin-HoMLee/repos",
"events_url": "https://api.github.com/users/Jin-HoMLee/events{/privacy}",
"received_events_url": "https://api.github.com/users/Jin-HoMLee/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-13T11:22:53 | 2025-08-19T09:31:30 | 2025-08-15T16:33:47 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40129",
"html_url": "https://github.com/huggingface/transformers/pull/40129",
"diff_url": "https://github.com/huggingface/transformers/pull/40129.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40129.patch",
"merged_at": "2025-08-15T16:33:47"
} | # Update LayoutLM Model Card Documentation
This PR updates the LayoutLM model card documentation according to the standardized format as issued in https://github.com/huggingface/transformers/issues/36979. The changes improve the documentation's clarity and usability while maintaining consistency with other model cards in the repository.
Changes include:
- A badge indicating model support (PyTorch, TensorFlow) in the upper right side of the doc
- A brief description of the model (what makes it unique/different) written in a way that's accessible to everyone
- Ready to use example code
- featuring AutoModel
- featuring quantization using TorchAoConfig
- Short explanations that Pipeline and transformers-cli are not supported
- Notes
- that additional pre-processing steps are needed, with a link to further resources
- and a reference to the improved, more recent LayoutLMv3 model
- that attention mask visualizer is not supported
The changes make the documentation more accessible and provide ready-to-use examples for different use cases, following the standardized format used in other model cards like Gemma 3, PaliGemma, and ViT.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
Since this is a documentation update for a vision-language model, I would suggest tagging:
@amyeroberts (vision models)
@stevhliu (documentation)
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40129/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40129/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40128 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40128/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40128/comments | https://api.github.com/repos/huggingface/transformers/issues/40128/events | https://github.com/huggingface/transformers/pull/40128 | 3,317,870,893 | PR_kwDOCUB6oc6jdKaQ | 40,128 | Add `chat_template` (`jinja2`) as an extra dependency | {
"login": "tboerstad",
"id": 4872288,
"node_id": "MDQ6VXNlcjQ4NzIyODg=",
"avatar_url": "https://avatars.githubusercontent.com/u/4872288?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tboerstad",
"html_url": "https://github.com/tboerstad",
"followers_url": "https://api.github.com/users/tboerstad/followers",
"following_url": "https://api.github.com/users/tboerstad/following{/other_user}",
"gists_url": "https://api.github.com/users/tboerstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tboerstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tboerstad/subscriptions",
"organizations_url": "https://api.github.com/users/tboerstad/orgs",
"repos_url": "https://api.github.com/users/tboerstad/repos",
"events_url": "https://api.github.com/users/tboerstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/tboerstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 9105758243,
"node_id": "LA_kwDOCUB6oc8AAAACHr7YIw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for_v5?",
"name": "for_v5?",
"color": "35BC94",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-08-13T10:32:38 | 2025-08-19T06:49:29 | 2025-08-18T14:31:40 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40128",
"html_url": "https://github.com/huggingface/transformers/pull/40128",
"diff_url": "https://github.com/huggingface/transformers/pull/40128.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40128.patch",
"merged_at": "2025-08-18T14:31:40"
} | # What does this PR do?
This PR adds `chat_template` as an extra dependency.
`jinja2` is required by [chat_template_utils.py](https://github.com/huggingface/transformers/blob/060b86e21d0460fbd2b6e642d0fbc8f19f679039/src/transformers/utils/chat_template_utils.py#L393C1-L394C1), but it's not declared in setup.py.
There have been earlier issues on this topic (see #34397 and #35533).
The reason for not declaring `jinja2` as a dependency is that it's dragged along by `torch`, and [`torch` "always" gets installed anyway](https://github.com/huggingface/transformers/pull/35533#issuecomment-2596031785).
For inference-only use cases, however, it's not a given that `torch` will be installed.
For example, the [MAX framework](https://github.com/modular/modular) depends on `transformers`, but not on `torch`.
For transparency: I work at Modular, which develops MAX.
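As a sketch of why the missing declaration bites, here is an availability check in the style of transformers' optional-dependency helpers (the function name and message here are illustrative, not the actual library code):

```python
import importlib.util

def is_jinja_available() -> bool:
    # True only if jinja2 can be imported in the current environment.
    return importlib.util.find_spec("jinja2") is not None

if not is_jinja_available():
    # Without torch pulling jinja2 in transitively, chat-template rendering
    # would fail unless the new `chat_template` extra is installed.
    print("jinja2 missing; install with: pip install 'transformers[chat_template]'")
```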
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@Rocketknight1 | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40128/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40128/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40127 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40127/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40127/comments | https://api.github.com/repos/huggingface/transformers/issues/40127/events | https://github.com/huggingface/transformers/pull/40127 | 3,317,774,701 | PR_kwDOCUB6oc6jc1Ns | 40,127 | [trainer] handle case where EOS token is None in `generation_config` | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-13T10:07:00 | 2025-08-13T14:57:24 | 2025-08-13T14:57:17 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40127",
"html_url": "https://github.com/huggingface/transformers/pull/40127",
"diff_url": "https://github.com/huggingface/transformers/pull/40127.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40127.patch",
"merged_at": "2025-08-13T14:57:17"
} | # What does this PR do?
Follow-up to #38441: the added special token handling caused crashes when the EOS token is undefined (`None`) in an existing `generation_config`.
This PR also:
- improves the thrown warning
- updates the eli5 dataset in the CLM task (whose code I've used to debug this change -- the old dataset is not compatible with `datasets` 4.0) | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40127/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40127/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40126 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40126/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40126/comments | https://api.github.com/repos/huggingface/transformers/issues/40126/events | https://github.com/huggingface/transformers/issues/40126 | 3,317,637,696 | I_kwDOCUB6oc7FvyJA | 40,126 | slide attention bug for qwen3 model | {
"login": "Yufei-Z",
"id": 39251819,
"node_id": "MDQ6VXNlcjM5MjUxODE5",
"avatar_url": "https://avatars.githubusercontent.com/u/39251819?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Yufei-Z",
"html_url": "https://github.com/Yufei-Z",
"followers_url": "https://api.github.com/users/Yufei-Z/followers",
"following_url": "https://api.github.com/users/Yufei-Z/following{/other_user}",
"gists_url": "https://api.github.com/users/Yufei-Z/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Yufei-Z/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Yufei-Z/subscriptions",
"organizations_url": "https://api.github.com/users/Yufei-Z/orgs",
"repos_url": "https://api.github.com/users/Yufei-Z/repos",
"events_url": "https://api.github.com/users/Yufei-Z/events{/privacy}",
"received_events_url": "https://api.github.com/users/Yufei-Z/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-13T09:27:20 | 2025-09-21T08:02:12 | 2025-09-21T08:02:12 | NONE | null | null | null | null | ### System Info
https://github.com/huggingface/transformers/blob/89c46b648d82b670cc7286a25fa64d2d92770418/src/transformers/models/qwen3/configuration_qwen3.py#L210C9-L216C14
```
if self.layer_types is None:
self.layer_types = [
"sliding_attention"
if self.sliding_window is not None and i >= self.max_window_layers
else "full_attention"
for i in range(self.num_hidden_layers)
]
```
`max_window_layers`
In models like Mistral or Llama with sliding window attention, `max_window_layers` specifies the number of leading transformer layers where the sliding window mechanism is disabled.
For layers with index below this value, standard global attention is applied.
vLLM's constraint
When using vLLM, the requirement `max_window_layers == num_hidden_layers` effectively disables sliding window attention entirely, across all layers.
If vLLM's sliding attention is implemented in the same way, or if it simply uses transformers to initialize the model, the sliding window will never take effect.
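To make the interaction concrete, the layer-type comprehension from the config snippet quoted above can be run standalone (the function name is illustrative):

```python
def resolve_layer_types(num_hidden_layers, sliding_window, max_window_layers):
    # Same comprehension as in the Qwen3 config snippet above.
    return [
        "sliding_attention"
        if sliding_window is not None and i >= max_window_layers
        else "full_attention"
        for i in range(num_hidden_layers)
    ]

# With the vLLM-style constraint max_window_layers == num_hidden_layers,
# no layer index i ever satisfies i >= max_window_layers:
print(resolve_layer_types(num_hidden_layers=4, sliding_window=4096, max_window_layers=4))
# -> all four layers are "full_attention"; the sliding window never activates.

# With max_window_layers < num_hidden_layers, the upper layers do slide:
print(resolve_layer_types(num_hidden_layers=4, sliding_window=4096, max_window_layers=2))
# -> last two layers are "sliding_attention".
```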
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
I just caught this while reading the code; I haven't tried to reproduce it yet.
### Expected behavior
Please check whether this behavior is expected. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40126/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40126/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40125 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40125/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40125/comments | https://api.github.com/repos/huggingface/transformers/issues/40125/events | https://github.com/huggingface/transformers/issues/40125 | 3,317,320,524 | I_kwDOCUB6oc7FuktM | 40,125 | Truncation 'only_second' does not work for multimodal chat templates. | {
"login": "karol-szustakowski",
"id": 191375020,
"node_id": "U_kgDOC2gmrA",
"avatar_url": "https://avatars.githubusercontent.com/u/191375020?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/karol-szustakowski",
"html_url": "https://github.com/karol-szustakowski",
"followers_url": "https://api.github.com/users/karol-szustakowski/followers",
"following_url": "https://api.github.com/users/karol-szustakowski/following{/other_user}",
"gists_url": "https://api.github.com/users/karol-szustakowski/gists{/gist_id}",
"starred_url": "https://api.github.com/users/karol-szustakowski/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/karol-szustakowski/subscriptions",
"organizations_url": "https://api.github.com/users/karol-szustakowski/orgs",
"repos_url": "https://api.github.com/users/karol-szustakowski/repos",
"events_url": "https://api.github.com/users/karol-szustakowski/events{/privacy}",
"received_events_url": "https://api.github.com/users/karol-szustakowski/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-13T07:54:12 | 2025-09-21T08:02:14 | 2025-09-21T08:02:14 | NONE | null | null | null | null | ### System Info
Python 3.9.21
transformers 4.51.3
### Who can help?
@amyeroberts (vision) @ArthurZucker (tokenizers)
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Please refer to this minimal reproducible example:
```py
from transformers import AutoProcessor
import numpy as np

LENGTH_LIMIT = 1210
MODEL_ID = "llava-hf/llava-v1.6-vicuna-13b-hf"

processor = AutoProcessor.from_pretrained(MODEL_ID, use_fast=False)
processor.tokenizer.padding_side = "right"

image = np.random.rand(3, 150, 150)
conversation = [
    {"role": "user", "content": [{"type": "image"}]},
    {"role": "assistant", "content": [{"type": "text", "text": ""}]}
]

for i in range(10):
    conv_enc = processor.apply_chat_template(conversation)
    print("Encoded conversation length:", len(conv_enc))
    batch = processor(images=[image], text=conv_enc, return_tensors="pt", truncation='only_second', max_length=LENGTH_LIMIT)
    print("Batch size:", batch["input_ids"].shape[1], " out of ", LENGTH_LIMIT)
    conversation[1]["content"][0]["text"] = "abc" * 10 * i
    print("======")
```
### Expected behavior
I would expect the caption (the assistant's response) to be truncated once the total number of tokens reaches 1210.
Instead, the output we see is the following:
```
Encoded conversation length: 26
Batch size: 1187 out of 1210
======
Encoded conversation length: 26
Batch size: 1187 out of 1210
======
Encoded conversation length: 56
Batch size: 1197 out of 1210
======
Encoded conversation length: 86
Batch size: 1207 out of 1210
======
Encoded conversation length: 116
Batch size: 1217 out of 1210
======
Encoded conversation length: 146
Batch size: 1227 out of 1210
======
Encoded conversation length: 176
Batch size: 1237 out of 1210
======
Encoded conversation length: 206
Batch size: 1247 out of 1210
======
Encoded conversation length: 236
Batch size: 1257 out of 1210
======
Encoded conversation length: 266
Batch size: 1267 out of 1210
======
```
Evidently, the second sequence is not truncated.
The reason for this becomes obvious once I set `use_fast=True` - the Rust tokenizers actually give an informative error:
```
Exception: Truncation error: Second sequence not provided
```
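For clarity, here is a pure-Python sketch of the behaviour I expected from `'only_second'`: keep the first sequence (the user input) intact and trim only the second (the response) until the pair fits the limit. The function name and token lists are illustrative, not tokenizer internals:

```python
def truncate_only_second(first_ids, second_ids, max_length):
    # Keep the first sequence whole; trim the second so the pair fits max_length.
    budget = max_length - len(first_ids)
    if budget < 0:
        raise ValueError("first sequence alone exceeds max_length")
    return first_ids, second_ids[:budget]

prompt_ids = list(range(1100))   # e.g. image placeholder tokens + user turn
caption_ids = list(range(500))   # assistant caption, the part to truncate
p, c = truncate_only_second(prompt_ids, caption_ids, max_length=1210)
print(len(p) + len(c))  # 1210
```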
At this point, it is obvious to me that when using a multimodal template, 'only_second' refers to the second element of each "content" key. I'm not sure whether this behaviour is intended. Indeed, I would expect the processor to pass the entire user input through untouched and only truncate the response. This is especially true for image inputs, for which the number of image features MUST match the number of placeholder tokens. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40125/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40125/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40124 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40124/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40124/comments | https://api.github.com/repos/huggingface/transformers/issues/40124/events | https://github.com/huggingface/transformers/pull/40124 | 3,317,192,039 | PR_kwDOCUB6oc6ja2vd | 40,124 | Ensure testing MobileNetV2 image processor for both slow and fast | {
"login": "namgyu-youn",
"id": 152387005,
"node_id": "U_kgDOCRU9vQ",
"avatar_url": "https://avatars.githubusercontent.com/u/152387005?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/namgyu-youn",
"html_url": "https://github.com/namgyu-youn",
"followers_url": "https://api.github.com/users/namgyu-youn/followers",
"following_url": "https://api.github.com/users/namgyu-youn/following{/other_user}",
"gists_url": "https://api.github.com/users/namgyu-youn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/namgyu-youn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/namgyu-youn/subscriptions",
"organizations_url": "https://api.github.com/users/namgyu-youn/orgs",
"repos_url": "https://api.github.com/users/namgyu-youn/repos",
"events_url": "https://api.github.com/users/namgyu-youn/events{/privacy}",
"received_events_url": "https://api.github.com/users/namgyu-youn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-13T07:13:35 | 2025-09-30T06:12:25 | 2025-09-30T06:12:25 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40124",
"html_url": "https://github.com/huggingface/transformers/pull/40124",
"diff_url": "https://github.com/huggingface/transformers/pull/40124.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40124.patch",
"merged_at": null
} | ### What does this PR do?
- Remove the test skipping on the `is_torch_available & is_vision_available` imports, because the test results without them don't make sense.
- Inline (run directly) the unittest for `MobileNetV2ImageProcessor` and `MobileNetV2ImageProcessorFast`
- Related Issue/PR: https://github.com/huggingface/transformers/pull/40086
### Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
### Who can review?
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
cc @yonigozlan | {
"login": "namgyu-youn",
"id": 152387005,
"node_id": "U_kgDOCRU9vQ",
"avatar_url": "https://avatars.githubusercontent.com/u/152387005?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/namgyu-youn",
"html_url": "https://github.com/namgyu-youn",
"followers_url": "https://api.github.com/users/namgyu-youn/followers",
"following_url": "https://api.github.com/users/namgyu-youn/following{/other_user}",
"gists_url": "https://api.github.com/users/namgyu-youn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/namgyu-youn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/namgyu-youn/subscriptions",
"organizations_url": "https://api.github.com/users/namgyu-youn/orgs",
"repos_url": "https://api.github.com/users/namgyu-youn/repos",
"events_url": "https://api.github.com/users/namgyu-youn/events{/privacy}",
"received_events_url": "https://api.github.com/users/namgyu-youn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40124/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40124/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40123 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40123/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40123/comments | https://api.github.com/repos/huggingface/transformers/issues/40123/events | https://github.com/huggingface/transformers/pull/40123 | 3,316,821,929 | PR_kwDOCUB6oc6jZo-f | 40,123 | Lazily import torchao Int4WeightOnlyConfig to avoid side effects | {
"login": "n0kovo",
"id": 16690056,
"node_id": "MDQ6VXNlcjE2NjkwMDU2",
"avatar_url": "https://avatars.githubusercontent.com/u/16690056?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/n0kovo",
"html_url": "https://github.com/n0kovo",
"followers_url": "https://api.github.com/users/n0kovo/followers",
"following_url": "https://api.github.com/users/n0kovo/following{/other_user}",
"gists_url": "https://api.github.com/users/n0kovo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/n0kovo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/n0kovo/subscriptions",
"organizations_url": "https://api.github.com/users/n0kovo/orgs",
"repos_url": "https://api.github.com/users/n0kovo/repos",
"events_url": "https://api.github.com/users/n0kovo/events{/privacy}",
"received_events_url": "https://api.github.com/users/n0kovo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-13T04:04:40 | 2025-08-25T13:25:51 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40123",
"html_url": "https://github.com/huggingface/transformers/pull/40123",
"diff_url": "https://github.com/huggingface/transformers/pull/40123.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40123.patch",
"merged_at": null
} | Previously, `modeling_utils` imported `torchao.quantization.Int4WeightOnlyConfig` at module load time if `torchao` was available.
On CPU-only and Mac/Windows systems, this triggered a `torch.distributed.elastic` redirect [warning](https://github.com/pytorch/pytorch/blob/ba47821f524eee50a214ed39fa2e7765d54aabf4/torch/distributed/elastic/multiprocessing/redirects.py#L27) and a Triton `ImportError` print on certain versions (https://github.com/pytorch/ao/pull/1842, https://github.com/pytorch/ao/pull/1808):
```shell
>>> from transformers import AutoProcessor
import error: No module named 'triton'
W0813 05:05:53.890000 17158 site-packages/torch/distributed/elastic/multiprocessing/redirects.py:29] NOTE: Redirects are currently not supported in Windows or MacOs.
```
This change replaces the eager import with a lazy import inside `_load_shard_file()`, preserving existing behavior while avoiding unnecessary side effects.
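As a rough illustration of the pattern (not the actual `modeling_utils` code), the import can be resolved only at call time, so importing the library itself never touches `torchao`:

```python
import importlib

def lazy_get(attr_path):
    """Resolve 'module.path:Attr' only when called; return None if unavailable."""
    module_name, _, attr = attr_path.partition(":")
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        # torchao (or its optional deps) absent: no warnings, no side effects
        return None
    return getattr(module, attr, None)

# Nothing torchao-related runs until this call site is actually reached:
int4_config_cls = lazy_get("torchao.quantization:Int4WeightOnlyConfig")
```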
cc @MekkCyber
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40123/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40123/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40122 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40122/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40122/comments | https://api.github.com/repos/huggingface/transformers/issues/40122/events | https://github.com/huggingface/transformers/issues/40122 | 3,316,759,426 | I_kwDOCUB6oc7FsbuC | 40,122 | wrong onnx output of OPUS en-ar | {
"login": "logicvv",
"id": 130044942,
"node_id": "U_kgDOB8BUDg",
"avatar_url": "https://avatars.githubusercontent.com/u/130044942?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/logicvv",
"html_url": "https://github.com/logicvv",
"followers_url": "https://api.github.com/users/logicvv/followers",
"following_url": "https://api.github.com/users/logicvv/following{/other_user}",
"gists_url": "https://api.github.com/users/logicvv/gists{/gist_id}",
"starred_url": "https://api.github.com/users/logicvv/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/logicvv/subscriptions",
"organizations_url": "https://api.github.com/users/logicvv/orgs",
"repos_url": "https://api.github.com/users/logicvv/repos",
"events_url": "https://api.github.com/users/logicvv/events{/privacy}",
"received_events_url": "https://api.github.com/users/logicvv/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2392046359,
"node_id": "MDU6TGFiZWwyMzkyMDQ2MzU5",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20Second%20Issue",
"name": "Good Second Issue",
"color": "dd935a",
"default": false,
"description": "Issues that are more difficult to do than \"Good First\" issues - give it a try if you want!"
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | open | false | null | [] | null | [] | 2025-08-13T03:17:16 | 2025-10-16T13:09:05 | null | NONE | null | null | null | null | ### System Info
transformers == 4.28
python == 3.9
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Here is my decoder ONNX export code:
```py
class DecoderWithLMHead(torch.nn.Module):
    def __init__(self, model):
        super().__init__()
        self.decoder = model.model.decoder
        self.lm_head = model.lm_head
        self.final_logits_bias = model.final_logits_bias

    def forward(self, input_ids, encoder_hidden_states, encoder_attention_mask):
        dec_out = self.decoder(
            input_ids=input_ids,
            encoder_hidden_states=encoder_hidden_states,
            encoder_attention_mask=encoder_attention_mask,
        )
        hidden_states = dec_out.last_hidden_state
        logits = self.lm_head(hidden_states) + self.final_logits_bias
        return logits


def decoder_onnx_export():
    decoder = model.model.decoder
    decoder.eval()
    decoder_with_lm = DecoderWithLMHead(model)
    dummy_input_ids = torch.randint(0, tokenizer.vocab_size, (1, 64))    # (batch, tgt_seq_len)
    dummy_encoder_attention_mask = torch.ones((1, 64), dtype=torch.long)  # (batch, src_seq_len)
    dummy_encoder_outputs = torch.rand((1, 64, 1024))                     # encoder_hidden_states
    torch.onnx.export(
        decoder_with_lm,
        args=(dummy_input_ids,),
        kwargs={
            "encoder_hidden_states": dummy_encoder_outputs,
            "encoder_attention_mask": dummy_encoder_attention_mask,
        },
        f="decoder_with_lm.onnx",
        input_names=["input_ids", "encoder_hidden_states", "encoder_attention_mask"],
        output_names=["logits"],
        opset_version=17,
    )
```
The original model from Hugging Face is good, but the ONNX output was wrong.
Inference code:
```
input_sentence = 'Using handheld GPS devices and programs like Google Earth , members of the Trio Tribe , who live in the rainforests of southern Suriname , map out their ancestral lands to help strengthen their territorial claims .'
input_ids, attention_mask = preprocess_input(input_sentence, tokenizer, max_length)
encoder_outputs = encoder_session.run(['hidden_states'], {
    'input_ids': input_ids.astype(np.int64),
    'attention_mask': attention_mask.astype(np.int64)
})
decoder_input_ids = np.full((1, 64), 61246, dtype=np.int64)
decoder_attention_mask = np.full((1, 64), 1, dtype=np.int64)
decoder_input_ids[0, 0] = start_id
decoder_input_ids = np.array(decoder_input_ids)
# print(decoder_input_ids)
# print(len(decoder_input_ids))
# exit()
# store the logits of each time step
all_output_token_id = []
count += 1
decoder_count = 0
# 4. step-by-step decoding
for step in range(max_length - 1):
    outputs = decoder_session.run(
        ['logits'],
        {
            'input_ids': decoder_input_ids.astype(np.int64),  # decoder input
            # 'attention_mask': decoder_attention_mask,
            'encoder_hidden_states': encoder_hidden_states,   # encoder output
            'encoder_attention_mask': attention_mask.astype(np.int64)
        }
    )
    current_logits = outputs[0][0, step + 1, :]
    next_token_id = np.argmax(current_logits).item()
    if step + 1 < 64:
        decoder_input_ids[0, step + 1] = next_token_id
    if next_token_id == tokenizer.eos_token_id:
        print(next_token_id)
        print('end')
        break
    all_output_token_id.append(next_token_id)
# print(all_output_token_id)
output_sentence = tokenizer.decode(all_output_token_id, skip_special_tokens=True)
# output_sentence = sp_t.decode(all_output_token_id)
print(output_sentence)
```
Hugging Face output:
باستخدام أجهزة GPS المحمولة وبرامج مثل Google Earth ، يقوم أعضاء Trio Tribe ، الذين يعيشون في الغابات المطيرة في جنوب سورينام ، برسم خرائط لأراضي أجدادهم للمساعدة في تعزيز مطالبهم الإقليمية.
ONNX output:
باستخدام أجهزة GPS المحمولة وبرامج مثل جوجل أعضاء Tri، الذين يعيشون في الغابات جنوب سورينام رسم أراضي الفوركس للمساعدة تعزيز الإقليمية..
Could anyone help please?
### Expected behavior
Hugging Face output:
باستخدام أجهزة GPS المحمولة وبرامج مثل Google Earth ، يقوم أعضاء Trio Tribe ، الذين يعيشون في الغابات المطيرة في جنوب سورينام ، برسم خرائط لأراضي أجدادهم للمساعدة في تعزيز مطالبهم الإقليمية.
ONNX output:
باستخدام أجهزة GPS المحمولة وبرامج مثل جوجل أعضاء Tri، الذين يعيشون في الغابات جنوب سورينام رسم أراضي الفوركس للمساعدة تعزيز الإقليمية.. | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40122/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40122/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/40121 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40121/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40121/comments | https://api.github.com/repos/huggingface/transformers/issues/40121/events | https://github.com/huggingface/transformers/pull/40121 | 3,316,585,149 | PR_kwDOCUB6oc6jY4MJ | 40,121 | SmolVLM and InternVL: Ensure pixel values are converted to the correct dtype for fp16/bf16 | {
"login": "qgallouedec",
"id": 45557362,
"node_id": "MDQ6VXNlcjQ1NTU3MzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/45557362?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qgallouedec",
"html_url": "https://github.com/qgallouedec",
"followers_url": "https://api.github.com/users/qgallouedec/followers",
"following_url": "https://api.github.com/users/qgallouedec/following{/other_user}",
"gists_url": "https://api.github.com/users/qgallouedec/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qgallouedec/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qgallouedec/subscriptions",
"organizations_url": "https://api.github.com/users/qgallouedec/orgs",
"repos_url": "https://api.github.com/users/qgallouedec/repos",
"events_url": "https://api.github.com/users/qgallouedec/events{/privacy}",
"received_events_url": "https://api.github.com/users/qgallouedec/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-13T01:11:04 | 2025-08-19T17:39:10 | 2025-08-19T17:39:08 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40121",
"html_url": "https://github.com/huggingface/transformers/pull/40121",
"diff_url": "https://github.com/huggingface/transformers/pull/40121.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40121.patch",
"merged_at": "2025-08-19T17:39:08"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "qgallouedec",
"id": 45557362,
"node_id": "MDQ6VXNlcjQ1NTU3MzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/45557362?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qgallouedec",
"html_url": "https://github.com/qgallouedec",
"followers_url": "https://api.github.com/users/qgallouedec/followers",
"following_url": "https://api.github.com/users/qgallouedec/following{/other_user}",
"gists_url": "https://api.github.com/users/qgallouedec/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qgallouedec/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qgallouedec/subscriptions",
"organizations_url": "https://api.github.com/users/qgallouedec/orgs",
"repos_url": "https://api.github.com/users/qgallouedec/repos",
"events_url": "https://api.github.com/users/qgallouedec/events{/privacy}",
"received_events_url": "https://api.github.com/users/qgallouedec/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40121/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40121/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40120 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40120/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40120/comments | https://api.github.com/repos/huggingface/transformers/issues/40120/events | https://github.com/huggingface/transformers/pull/40120 | 3,316,449,643 | PR_kwDOCUB6oc6jYaL_ | 40,120 | fix bart issue w/ static cache | {
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-13T00:18:46 | 2025-08-19T18:45:27 | 2025-08-19T15:27:50 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40120",
"html_url": "https://github.com/huggingface/transformers/pull/40120",
"diff_url": "https://github.com/huggingface/transformers/pull/40120.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40120.patch",
"merged_at": null
} | # issue
when using `BART` with a static cache via the code below:
```
torch_dtype = torch.float16
device_map = "cpu"
model_kwargs = dict(torch_dtype=torch_dtype, device_map=device_map)
model_id = "sshleifer/distilbart-cnn-12-6"
tokenizer = AutoTokenizer.from_pretrained(model_id)
generator = pipeline(
"summarization",
model=model_id,
tokenizer=tokenizer,
model_kwargs=model_kwargs,
)
generation_config = generator.model.generation_config
generation_config.do_sample = args.do_sample
generation_config.use_cache = True
generation_config.temperature = 1.0
generation_config.num_beams = args.num_beams
generation_config.max_new_tokens = args.output_tokens
generation_config.min_new_tokens = args.output_tokens
generation_config.top_p = 1.0
generation_config.cache_implementation="static"
wrap_forward_for_benchmark(generator)
prompt = "I like math"
output = generator(
    prompt, batch_size=1, generation_config=generation_config
)
```
there is an error like the one below:
> AttributeError: 'StaticLayer' object has no attribute 'max_batch_size'
# cause
The [PR](https://github.com/huggingface/transformers/commit/dc11a3cbb2c6cd96986519a144d4a22610fd8487#diff-357aea2fcf1b51b2a8ad17c7d30044159b1fe1a3f6bd832cced70aca84c8be66) removed `max_batch_size` from the constructor of `Cache` and made `max_batch_size` first initialized in `lazy_initialization` (which is called on the first `update`), so before that we cannot get `max_batch_size` anymore.
# fix
Check whether `max_batch_size` is set while checking `need_new_cache`; this works for this case.
@SunMarc, pls help review, thx very much. | {
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40120/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40120/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40119 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40119/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40119/comments | https://api.github.com/repos/huggingface/transformers/issues/40119/events | https://github.com/huggingface/transformers/pull/40119 | 3,315,988,397 | PR_kwDOCUB6oc6jW1Mh | 40,119 | Replace `self.tokenizer` by `self.processing_class` | {
"login": "qgallouedec",
"id": 45557362,
"node_id": "MDQ6VXNlcjQ1NTU3MzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/45557362?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qgallouedec",
"html_url": "https://github.com/qgallouedec",
"followers_url": "https://api.github.com/users/qgallouedec/followers",
"following_url": "https://api.github.com/users/qgallouedec/following{/other_user}",
"gists_url": "https://api.github.com/users/qgallouedec/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qgallouedec/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qgallouedec/subscriptions",
"organizations_url": "https://api.github.com/users/qgallouedec/orgs",
"repos_url": "https://api.github.com/users/qgallouedec/repos",
"events_url": "https://api.github.com/users/qgallouedec/events{/privacy}",
"received_events_url": "https://api.github.com/users/qgallouedec/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T21:22:34 | 2025-08-14T11:25:02 | 2025-08-14T11:24:55 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40119",
"html_url": "https://github.com/huggingface/transformers/pull/40119",
"diff_url": "https://github.com/huggingface/transformers/pull/40119.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40119.patch",
"merged_at": "2025-08-14T11:24:55"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40119/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40119/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40118 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40118/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40118/comments | https://api.github.com/repos/huggingface/transformers/issues/40118/events | https://github.com/huggingface/transformers/issues/40118 | 3,315,905,308 | I_kwDOCUB6oc7FpLMc | 40,118 | MT5: UnboundLocalError | {
"login": "kimihailv",
"id": 21249608,
"node_id": "MDQ6VXNlcjIxMjQ5NjA4",
"avatar_url": "https://avatars.githubusercontent.com/u/21249608?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kimihailv",
"html_url": "https://github.com/kimihailv",
"followers_url": "https://api.github.com/users/kimihailv/followers",
"following_url": "https://api.github.com/users/kimihailv/following{/other_user}",
"gists_url": "https://api.github.com/users/kimihailv/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kimihailv/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kimihailv/subscriptions",
"organizations_url": "https://api.github.com/users/kimihailv/orgs",
"repos_url": "https://api.github.com/users/kimihailv/repos",
"events_url": "https://api.github.com/users/kimihailv/events{/privacy}",
"received_events_url": "https://api.github.com/users/kimihailv/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-12T20:54:11 | 2025-08-21T13:21:07 | 2025-08-18T14:18:35 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.55.0
- Platform: Linux-5.15.0-140-generic-x86_64-with-glibc2.35
- Python version: 3.11.13
- Huggingface_hub version: 0.34.4
- Safetensors version: 0.6.2
- Accelerate version: 1.10.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.8.0+cu128 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
- Using GPU in script?: yes
- GPU type: NVIDIA H100 80GB HBM3
### Who can help?
@ArthurZucker
When I call forward method of MT5 model I get this error:
```
lib/python3.11/site-packages/transformers/models/mt5/modeling_mt5.py", line 380, in forward
if is_cross_attention and past_key_value is not None and is_updated:
^^^^^^^^^^
UnboundLocalError: cannot access local variable 'is_updated' where it is not associated with a value
```
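For context, this is the general failure pattern (a minimal illustration, not the actual MT5 attention code): `is_updated` is only bound on one branch, so any code path that skips that branch and then reads the variable raises `UnboundLocalError`.

```python
def attention_forward(is_cross_attention, past_key_value):
    # `is_updated` is bound only when the cache is the expected type...
    if isinstance(past_key_value, dict):  # stand-in for a cache-type check
        is_updated = past_key_value.get("updated", False)
    # ...so with any other cache object this line raises UnboundLocalError.
    # Initializing `is_updated = False` before the first `if` avoids it.
    if is_cross_attention and past_key_value is not None and is_updated:
        return "reuse cached cross-attention states"
    return "compute attention"
```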
### Information
- [x] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
1. Instantiate MT5 model
2. Make a forward pass.
### Expected behavior
Working forward method | {
"login": "kimihailv",
"id": 21249608,
"node_id": "MDQ6VXNlcjIxMjQ5NjA4",
"avatar_url": "https://avatars.githubusercontent.com/u/21249608?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kimihailv",
"html_url": "https://github.com/kimihailv",
"followers_url": "https://api.github.com/users/kimihailv/followers",
"following_url": "https://api.github.com/users/kimihailv/following{/other_user}",
"gists_url": "https://api.github.com/users/kimihailv/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kimihailv/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kimihailv/subscriptions",
"organizations_url": "https://api.github.com/users/kimihailv/orgs",
"repos_url": "https://api.github.com/users/kimihailv/repos",
"events_url": "https://api.github.com/users/kimihailv/events{/privacy}",
"received_events_url": "https://api.github.com/users/kimihailv/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40118/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40118/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40117 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40117/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40117/comments | https://api.github.com/repos/huggingface/transformers/issues/40117/events | https://github.com/huggingface/transformers/pull/40117 | 3,315,580,867 | PR_kwDOCUB6oc6jVjs0 | 40,117 | Fix Idefics vision embedding mismatched devices | {
"login": "sayandipdutta",
"id": 11913249,
"node_id": "MDQ6VXNlcjExOTEzMjQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/11913249?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sayandipdutta",
"html_url": "https://github.com/sayandipdutta",
"followers_url": "https://api.github.com/users/sayandipdutta/followers",
"following_url": "https://api.github.com/users/sayandipdutta/following{/other_user}",
"gists_url": "https://api.github.com/users/sayandipdutta/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sayandipdutta/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sayandipdutta/subscriptions",
"organizations_url": "https://api.github.com/users/sayandipdutta/orgs",
"repos_url": "https://api.github.com/users/sayandipdutta/repos",
"events_url": "https://api.github.com/users/sayandipdutta/events{/privacy}",
"received_events_url": "https://api.github.com/users/sayandipdutta/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 7182115038,
"node_id": "LA_kwDOCUB6oc8AAAABrBZg3g",
"url": "https://api.github.com/repos/huggingface/transformers/labels/duplicate",
"name": "duplicate",
"color": "489FE4",
"default": true,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-08-12T19:25:33 | 2025-08-14T19:36:08 | 2025-08-13T13:36:50 | NONE | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40117",
"html_url": "https://github.com/huggingface/transformers/pull/40117",
"diff_url": "https://github.com/huggingface/transformers/pull/40117.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40117.patch",
"merged_at": null
} | # What does this PR do?
In the following section: https://github.com/huggingface/transformers/blob/a1a4fcd03e3455772415e6400fee91f3159e7ac5/src/transformers/models/idefics3/modeling_idefics3.py#L143-L160
`boundaries` and `position_ids` are created on `cpu`, and the input's device information is not passed through. However, `patch_attention_mask` (input to the `forward` method) carries the `device` chosen at the call-site. When using a device other than `cpu`, inside the `for` loop `p_attn_mask`, and consequently `nb_patches_{h/w}`, are on the input device. Setting the `device` to `position_ids.device` on lines `150-151` doesn't solve the issue, since those tensors are already on `cpu`.
_**This PR moves `boundaries` and `position_ids` to `patch_attention_mask.device`.**_
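A minimal sketch of the idea (illustrative, not the actual `modeling_idefics3` code): derive the device from the incoming mask so later ops such as `h_indices / nb_patches_h` never mix `cpu` and `cuda` tensors.

```python
import torch

def build_boundaries(patch_attention_mask, num_patches_per_side):
    # Create `boundaries` on the same device as the input mask instead of
    # defaulting to cpu; `position_ids` would be handled the same way.
    device = patch_attention_mask.device
    step = 1.0 / num_patches_per_side
    return torch.arange(step, 1.0, step, device=device)

mask = torch.ones(1, 4, 4, dtype=torch.bool)  # cpu here; cuda at a real call-site
b = build_boundaries(mask, 4)  # tensor([0.25, 0.50, 0.75]) on mask.device
```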
Fixes #40116
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| {
"login": "sayandipdutta",
"id": 11913249,
"node_id": "MDQ6VXNlcjExOTEzMjQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/11913249?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sayandipdutta",
"html_url": "https://github.com/sayandipdutta",
"followers_url": "https://api.github.com/users/sayandipdutta/followers",
"following_url": "https://api.github.com/users/sayandipdutta/following{/other_user}",
"gists_url": "https://api.github.com/users/sayandipdutta/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sayandipdutta/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sayandipdutta/subscriptions",
"organizations_url": "https://api.github.com/users/sayandipdutta/orgs",
"repos_url": "https://api.github.com/users/sayandipdutta/repos",
"events_url": "https://api.github.com/users/sayandipdutta/events{/privacy}",
"received_events_url": "https://api.github.com/users/sayandipdutta/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40117/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40117/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40116 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40116/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40116/comments | https://api.github.com/repos/huggingface/transformers/issues/40116/events | https://github.com/huggingface/transformers/issues/40116 | 3,315,522,112 | I_kwDOCUB6oc7FntpA | 40,116 | SmolVLM RuntimeError Expected all tensors to be on the same device, but found at least two devices | {
"login": "sayandipdutta",
"id": 11913249,
"node_id": "MDQ6VXNlcjExOTEzMjQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/11913249?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sayandipdutta",
"html_url": "https://github.com/sayandipdutta",
"followers_url": "https://api.github.com/users/sayandipdutta/followers",
"following_url": "https://api.github.com/users/sayandipdutta/following{/other_user}",
"gists_url": "https://api.github.com/users/sayandipdutta/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sayandipdutta/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sayandipdutta/subscriptions",
"organizations_url": "https://api.github.com/users/sayandipdutta/orgs",
"repos_url": "https://api.github.com/users/sayandipdutta/repos",
"events_url": "https://api.github.com/users/sayandipdutta/events{/privacy}",
"received_events_url": "https://api.github.com/users/sayandipdutta/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-12T19:11:22 | 2025-08-13T13:39:08 | 2025-08-13T13:39:08 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.56.0.dev0
- Platform: Linux-6.8.0-1033-gcp-x86_64-with-glibc2.35
- Python version: 3.12.9
- Huggingface_hub version: 0.34.4
- Safetensors version: 0.6.2
- Accelerate version: 1.10.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1+cu126 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: No
- Using GPU in script?: Yes
- GPU type: NVIDIA A100-SXM4-40GB
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Steps to reproduce:
1. Install transformers from `git`: `https://github.com/huggingface/transformers.git#9977cf17392fe5b821dc026ab76d3f5ed16e03f`
2. Install necessary dependencies to run the [example](https://huggingface.co/HuggingFaceTB/SmolVLM2-2.2B-Instruct#how-to-get-started) at `HuggingFaceTB/SmolVLM2-2.2B-Instruct` model card.
3. Try to run the [Simple Inference example](https://huggingface.co/HuggingFaceTB/SmolVLM2-2.2B-Instruct#simple-inference).
Produces the following output:
```
You have video processor config saved in `preprocessor.json` file which is deprecated. Video processor configs should be saved in their own `video_preprocessor.json` file. You can rename the file or load and save the processor back which renames it automatically. Loading from `preprocessor.json` will be removed in v5.0.
Loading checkpoint shards: 100%|████████████████████████████████████████| 2/2 [00:01<00:00, 1.96it/s]
Traceback (most recent call last):
File "/home/sayan/projects/argo/src/argo/extraction/app3.py", line 65, in <module>
generated_ids = model.generate(**inputs, do_sample=False, max_new_tokens=64)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/transformers/generation/utils.py", line 2542, in generate
result = self._sample(
^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/transformers/generation/utils.py", line 3523, in _sample
outputs = self(**model_inputs, return_dict=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/transformers/utils/generic.py", line 959, in wrapper
output = func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/transformers/models/smolvlm/modeling_smolvlm.py", line 950, in forward
outputs = self.model(
^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/transformers/utils/generic.py", line 959, in wrapper
output = func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/transformers/models/smolvlm/modeling_smolvlm.py", line 763, in forward
image_hidden_states = self.get_image_features(pixel_values, pixel_attention_mask).to(inputs_embeds.device)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/transformers/models/smolvlm/modeling_smolvlm.py", line 689, in get_image_features
image_hidden_states = self.vision_model(pixel_values=pixel_values, patch_attention_mask=patch_attention_mask)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/transformers/models/smolvlm/modeling_smolvlm.py", line 445, in forward
hidden_states = self.embeddings(pixel_values=pixel_values, patch_attention_mask=patch_attention_mask)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sayan/projects/argo/.venv/lib/python3.12/site-packages/transformers/models/smolvlm/modeling_smolvlm.py", line 148, in forward
fractional_coords_h = h_indices / nb_patches_h * (1 - 1e-6)
~~~~~~~~~~^~~~~~~~~~~~~~
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!
```
### Expected behavior
The model should run without any error, and produce the expected output (description of the given image).
#### Possible Cause:
I found PR #39981 which supposedly fixes this issue. But upon further inspection, I found that it is not carrying the device information from the inputs, and the device of `h_indices` is always `cpu`. This can be fixed by creating `boundaries` and `position_ids` with the input device. | {
"login": "sayandipdutta",
"id": 11913249,
"node_id": "MDQ6VXNlcjExOTEzMjQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/11913249?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sayandipdutta",
"html_url": "https://github.com/sayandipdutta",
"followers_url": "https://api.github.com/users/sayandipdutta/followers",
"following_url": "https://api.github.com/users/sayandipdutta/following{/other_user}",
"gists_url": "https://api.github.com/users/sayandipdutta/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sayandipdutta/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sayandipdutta/subscriptions",
"organizations_url": "https://api.github.com/users/sayandipdutta/orgs",
"repos_url": "https://api.github.com/users/sayandipdutta/repos",
"events_url": "https://api.github.com/users/sayandipdutta/events{/privacy}",
"received_events_url": "https://api.github.com/users/sayandipdutta/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40116/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40116/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40115 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40115/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40115/comments | https://api.github.com/repos/huggingface/transformers/issues/40115/events | https://github.com/huggingface/transformers/pull/40115 | 3,315,300,899 | PR_kwDOCUB6oc6jUqFL | 40,115 | [layer_types] update layer_types with conv | {
"login": "paulpak58",
"id": 52512091,
"node_id": "MDQ6VXNlcjUyNTEyMDkx",
"avatar_url": "https://avatars.githubusercontent.com/u/52512091?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/paulpak58",
"html_url": "https://github.com/paulpak58",
"followers_url": "https://api.github.com/users/paulpak58/followers",
"following_url": "https://api.github.com/users/paulpak58/following{/other_user}",
"gists_url": "https://api.github.com/users/paulpak58/gists{/gist_id}",
"starred_url": "https://api.github.com/users/paulpak58/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/paulpak58/subscriptions",
"organizations_url": "https://api.github.com/users/paulpak58/orgs",
"repos_url": "https://api.github.com/users/paulpak58/repos",
"events_url": "https://api.github.com/users/paulpak58/events{/privacy}",
"received_events_url": "https://api.github.com/users/paulpak58/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-12T18:04:15 | 2025-08-12T18:05:12 | null | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40115",
"html_url": "https://github.com/huggingface/transformers/pull/40115",
"diff_url": "https://github.com/huggingface/transformers/pull/40115.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40115.patch",
"merged_at": null
} | # What does this PR do?
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40115/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40115/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40114 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40114/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40114/comments | https://api.github.com/repos/huggingface/transformers/issues/40114/events | https://github.com/huggingface/transformers/pull/40114 | 3,315,134,963 | PR_kwDOCUB6oc6jUHAz | 40,114 | Fix torch.export compatibility for Mixtral MoE models | {
"login": "akacmazz",
"id": 32853513,
"node_id": "MDQ6VXNlcjMyODUzNTEz",
"avatar_url": "https://avatars.githubusercontent.com/u/32853513?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/akacmazz",
"html_url": "https://github.com/akacmazz",
"followers_url": "https://api.github.com/users/akacmazz/followers",
"following_url": "https://api.github.com/users/akacmazz/following{/other_user}",
"gists_url": "https://api.github.com/users/akacmazz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/akacmazz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/akacmazz/subscriptions",
"organizations_url": "https://api.github.com/users/akacmazz/orgs",
"repos_url": "https://api.github.com/users/akacmazz/repos",
"events_url": "https://api.github.com/users/akacmazz/events{/privacy}",
"received_events_url": "https://api.github.com/users/akacmazz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-12T17:09:50 | 2025-09-16T09:17:50 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40114",
"html_url": "https://github.com/huggingface/transformers/pull/40114",
"diff_url": "https://github.com/huggingface/transformers/pull/40114.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40114.patch",
"merged_at": null
} | - Replace data-dependent .nonzero() operation with static expert loop
- Resolves GuardOnDataDependentSymNode error during torch.export
- Maintains identical functionality while enabling export compatibility
- Fixes issue introduced in PR #32429
- Add tests for torch.export compatibility
# What does this PR do?
This PR fixes a torch.export compatibility issue (#38518) with Mixtral MoE models; the regression was introduced in PR #32429.
**Problem**
The optimization in PR #32429 introduced a `.nonzero()` operation that creates data-dependent tensor shapes, causing torch.export to fail with:
```
torch.fx.experimental.symbolic_shapes.GuardOnDataDependentSymNode: Could not extract specialized integer from data-dependent expression
```
**Solution**
Replace the dynamic expert selection loop:
```python
expert_hit = torch.greater(expert_mask.sum(dim=(-1, -2)), 0).nonzero()
for expert_idx in expert_hit:
```
with a static loop over all experts:
```python
for expert_idx in range(self.num_experts):
```
**Impact**
- ✅ Enables torch.export compatibility for Mixtral models
- ✅ Maintains identical functionality (empty experts contribute 0 naturally)
- ✅ Minimal performance impact (same computation, different loop structure)
- ✅ Consistent with other MoE implementations (Jamba, DBRX)
**Testing**
- Verified torch.export works without errors
- Confirmed functionality preservation with identical outputs
- Tested with various input configurations
Fixes torch.export compatibility issues reported for Mixtral-8x7B models.
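To illustrate why the static loop is export-friendly, here is a minimal, self-contained sketch of the idea — not the actual Mixtral implementation (the expert modules, shapes, and routing math are simplified for illustration): looping over every expert keeps all tensor shapes independent of the input data, and an expert that received no tokens simply contributes zero.

```python
import torch

def moe_forward_static(hidden, router_logits, experts, top_k=2):
    """Shape-static MoE dispatch: loop over *all* experts instead of
    selecting the "hit" experts with a data-dependent `.nonzero()`."""
    num_experts = len(experts)
    weights = torch.softmax(router_logits, dim=-1)       # (tokens, experts)
    topk_w, topk_i = torch.topk(weights, top_k, dim=-1)  # (tokens, top_k)
    topk_w = topk_w / topk_w.sum(dim=-1, keepdim=True)   # renormalize top-k
    out = torch.zeros_like(hidden)
    for expert_idx in range(num_experts):  # static loop: no data-dependent shapes
        # per-token weight for this expert; 0 where the expert wasn't selected
        w = (topk_w * (topk_i == expert_idx)).sum(dim=-1, keepdim=True)
        out = out + experts[expert_idx](hidden) * w  # empty experts add 0
    return out
```

With identity experts the normalized routing weights sum to 1 per token, so the output equals the input — a quick sanity check that the weighted dispatch is correct.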
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [X] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
@Cyrilvallez
@ArthurZucker
@gante
| {
"login": "akacmazz",
"id": 32853513,
"node_id": "MDQ6VXNlcjMyODUzNTEz",
"avatar_url": "https://avatars.githubusercontent.com/u/32853513?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/akacmazz",
"html_url": "https://github.com/akacmazz",
"followers_url": "https://api.github.com/users/akacmazz/followers",
"following_url": "https://api.github.com/users/akacmazz/following{/other_user}",
"gists_url": "https://api.github.com/users/akacmazz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/akacmazz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/akacmazz/subscriptions",
"organizations_url": "https://api.github.com/users/akacmazz/orgs",
"repos_url": "https://api.github.com/users/akacmazz/repos",
"events_url": "https://api.github.com/users/akacmazz/events{/privacy}",
"received_events_url": "https://api.github.com/users/akacmazz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40114/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40114/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40113 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40113/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40113/comments | https://api.github.com/repos/huggingface/transformers/issues/40113/events | https://github.com/huggingface/transformers/pull/40113 | 3,315,070,022 | PR_kwDOCUB6oc6jT5Ta | 40,113 | changed xLSTMRMSNorm to RMSNorm | {
"login": "nikitazuevblago",
"id": 115150359,
"node_id": "U_kgDOBt0OFw",
"avatar_url": "https://avatars.githubusercontent.com/u/115150359?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nikitazuevblago",
"html_url": "https://github.com/nikitazuevblago",
"followers_url": "https://api.github.com/users/nikitazuevblago/followers",
"following_url": "https://api.github.com/users/nikitazuevblago/following{/other_user}",
"gists_url": "https://api.github.com/users/nikitazuevblago/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nikitazuevblago/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nikitazuevblago/subscriptions",
"organizations_url": "https://api.github.com/users/nikitazuevblago/orgs",
"repos_url": "https://api.github.com/users/nikitazuevblago/repos",
"events_url": "https://api.github.com/users/nikitazuevblago/events{/privacy}",
"received_events_url": "https://api.github.com/users/nikitazuevblago/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T16:45:54 | 2025-08-13T09:10:43 | 2025-08-13T09:10:42 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40113",
"html_url": "https://github.com/huggingface/transformers/pull/40113",
"diff_url": "https://github.com/huggingface/transformers/pull/40113.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40113.patch",
"merged_at": "2025-08-13T09:10:42"
`xLSTMRMSNorm` doesn't exist and hence raises an `ImportError`; without this fix, users can't test NX-AI/xLSTM-7b.
cc @Cyrilvallez | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40113/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40113/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40112 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40112/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40112/comments | https://api.github.com/repos/huggingface/transformers/issues/40112/events | https://github.com/huggingface/transformers/pull/40112 | 3,315,042,733 | PR_kwDOCUB6oc6jTzpN | 40,112 | [serve] add cors warnings | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T16:35:31 | 2025-08-21T13:32:41 | 2025-08-21T13:32:36 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40112",
"html_url": "https://github.com/huggingface/transformers/pull/40112",
"diff_url": "https://github.com/huggingface/transformers/pull/40112.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40112.patch",
"merged_at": "2025-08-21T13:32:36"
} | # What does this PR do?
This PR adds:
- a warning about CORS being not super safe (when it is enabled)
- a warning that downstream apps might need CORS (when it is NOT enabled). Related to https://github.com/huggingface/transformers/issues/39932
<img width="731" height="111" alt="Screenshot 2025-08-12 at 17 34 20" src="https://github.com/user-attachments/assets/781ca31c-330c-4b75-9ea5-6fae33d7e831" />
| {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40112/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40112/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40111 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40111/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40111/comments | https://api.github.com/repos/huggingface/transformers/issues/40111/events | https://github.com/huggingface/transformers/issues/40111 | 3,315,042,464 | I_kwDOCUB6oc7Fl4ig | 40,111 | Wrong loss computation for VisionEncoderDecoderModel | {
"login": "tcnguyen",
"id": 10104641,
"node_id": "MDQ6VXNlcjEwMTA0NjQx",
"avatar_url": "https://avatars.githubusercontent.com/u/10104641?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tcnguyen",
"html_url": "https://github.com/tcnguyen",
"followers_url": "https://api.github.com/users/tcnguyen/followers",
"following_url": "https://api.github.com/users/tcnguyen/following{/other_user}",
"gists_url": "https://api.github.com/users/tcnguyen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tcnguyen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tcnguyen/subscriptions",
"organizations_url": "https://api.github.com/users/tcnguyen/orgs",
"repos_url": "https://api.github.com/users/tcnguyen/repos",
"events_url": "https://api.github.com/users/tcnguyen/events{/privacy}",
"received_events_url": "https://api.github.com/users/tcnguyen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | open | false | null | [] | null | [] | 2025-08-12T16:35:25 | 2025-10-09T04:58:39 | null | NONE | null | null | null | null | ### System Info
Hi team, in this:
https://github.com/huggingface/transformers/blob/main/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py#L550
The `decoder_input_ids` are already shifted right w.r.t. the labels, but then `ForCausalLMLoss` shifts the labels left as well, which causes the problem.
This has happened since version 4.50.0, with this commit: https://github.com/huggingface/transformers/pull/36753/files#diff-d6b5547bc8ece59fd6c96645e48cce4f32d0995a6621b57f896ad800168017bdL641
thank you
cc @qubvel
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
When training a TrOCR model following this https://huggingface.co/docs/transformers/v4.53.3/en/model_doc/vision-encoder-decoder#transformers.VisionEncoderDecoderModel.forward.example, the loss is not converging well for transformers==4.50, while it did with transformers==4.49
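To see the off-by-one concretely, here is a toy sketch with plain Python lists (the helper name is a simplified stand-in for the real function, and the token ids are made up): once the decoder inputs have been built by shifting the labels right, the logit at position t already lines up with labels[t], so shifting the labels left again inside the loss scores each logit against the wrong target.

```python
def shift_tokens_right(labels, decoder_start_token_id):
    # mimics what the model does when decoder_input_ids is None:
    # prepend the start token and drop the last label
    return [decoder_start_token_id] + labels[:-1]

labels = [10, 11, 12, 13]
decoder_input_ids = shift_tokens_right(labels, decoder_start_token_id=0)
# decoder_input_ids == [0, 10, 11, 12]; the logit at position t (fed inputs
# up to t) should therefore be scored against labels[t]:
correct_targets = labels             # [10, 11, 12, 13]
# A causal-LM loss that shifts the labels left a second time instead scores
# the logit at position t against labels[t + 1] — off by one:
double_shifted_targets = labels[1:]  # [11, 12, 13]
```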
### Expected behavior
We can set `shift_labels = labels` in the case where `decoder_input_ids` is None and pass it to the loss function | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40111/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40111/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/40110 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40110/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40110/comments | https://api.github.com/repos/huggingface/transformers/issues/40110/events | https://github.com/huggingface/transformers/issues/40110 | 3,314,820,756 | I_kwDOCUB6oc7FlCaU | 40,110 | Upgrading transformers from 4.51.3 to 4.55 makes GRPOTrainer fail training when used with accelerate | {
"login": "Randomdude11",
"id": 37081069,
"node_id": "MDQ6VXNlcjM3MDgxMDY5",
"avatar_url": "https://avatars.githubusercontent.com/u/37081069?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Randomdude11",
"html_url": "https://github.com/Randomdude11",
"followers_url": "https://api.github.com/users/Randomdude11/followers",
"following_url": "https://api.github.com/users/Randomdude11/following{/other_user}",
"gists_url": "https://api.github.com/users/Randomdude11/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Randomdude11/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Randomdude11/subscriptions",
"organizations_url": "https://api.github.com/users/Randomdude11/orgs",
"repos_url": "https://api.github.com/users/Randomdude11/repos",
"events_url": "https://api.github.com/users/Randomdude11/events{/privacy}",
"received_events_url": "https://api.github.com/users/Randomdude11/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T15:30:59 | 2025-09-21T08:02:16 | 2025-09-21T08:02:16 | NONE | null | null | null | null | I have been battling a strange bug the past few days - I have a GRPOTrainer script which works perfectly fine when ran without accelerate and trains a very well performing model, but when using accelerate, the model's performance quickly deteriorates and no training takes place.
I have tried many things, but I am not sure what causes this issue. The only fix I have is to downgrade from transformers 4.55 to transformers 4.51.3, and suddenly everything works fine with or without accelerate.
I would really like to use the latest version of TRL because there, GRPOTrainer supports training for VLMs, but with this bug, I am unable to get anything done quickly enough.
I would appreciate any form of input that would help me pin this down and use the latest trl/transformers versions.
Thanks! | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40110/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40110/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40109 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40109/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40109/comments | https://api.github.com/repos/huggingface/transformers/issues/40109/events | https://github.com/huggingface/transformers/pull/40109 | 3,314,669,207 | PR_kwDOCUB6oc6jSjRg | 40,109 | Fix QuantoQuantizedCache import issues | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T14:54:51 | 2025-08-13T10:23:00 | 2025-08-13T10:23:00 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40109",
"html_url": "https://github.com/huggingface/transformers/pull/40109",
"diff_url": "https://github.com/huggingface/transformers/pull/40109.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40109.patch",
"merged_at": "2025-08-13T10:22:59"
} | Revert some changes from #39797 to fix QuantoQuantizedCache.
cc @Cyrilvallez
Fixes #40099 | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40109/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40109/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40108 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40108/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40108/comments | https://api.github.com/repos/huggingface/transformers/issues/40108/events | https://github.com/huggingface/transformers/pull/40108 | 3,314,331,656 | PR_kwDOCUB6oc6jRaAK | 40,108 | [WIP] Add PLDR-LLM | {
"login": "burcgokden",
"id": 50996809,
"node_id": "MDQ6VXNlcjUwOTk2ODA5",
"avatar_url": "https://avatars.githubusercontent.com/u/50996809?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/burcgokden",
"html_url": "https://github.com/burcgokden",
"followers_url": "https://api.github.com/users/burcgokden/followers",
"following_url": "https://api.github.com/users/burcgokden/following{/other_user}",
"gists_url": "https://api.github.com/users/burcgokden/gists{/gist_id}",
"starred_url": "https://api.github.com/users/burcgokden/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/burcgokden/subscriptions",
"organizations_url": "https://api.github.com/users/burcgokden/orgs",
"repos_url": "https://api.github.com/users/burcgokden/repos",
"events_url": "https://api.github.com/users/burcgokden/events{/privacy}",
"received_events_url": "https://api.github.com/users/burcgokden/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T13:42:00 | 2025-09-01T13:32:31 | 2025-09-01T13:32:31 | NONE | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40108",
"html_url": "https://github.com/huggingface/transformers/pull/40108",
"diff_url": "https://github.com/huggingface/transformers/pull/40108.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40108.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
This PR is for adding a new model: PLDR-LLM (Large Language Model from Power Law Decoder Representation)
https://github.com/huggingface/transformers/issues/40101
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
https://github.com/huggingface/transformers/issues/40101
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [X] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "burcgokden",
"id": 50996809,
"node_id": "MDQ6VXNlcjUwOTk2ODA5",
"avatar_url": "https://avatars.githubusercontent.com/u/50996809?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/burcgokden",
"html_url": "https://github.com/burcgokden",
"followers_url": "https://api.github.com/users/burcgokden/followers",
"following_url": "https://api.github.com/users/burcgokden/following{/other_user}",
"gists_url": "https://api.github.com/users/burcgokden/gists{/gist_id}",
"starred_url": "https://api.github.com/users/burcgokden/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/burcgokden/subscriptions",
"organizations_url": "https://api.github.com/users/burcgokden/orgs",
"repos_url": "https://api.github.com/users/burcgokden/repos",
"events_url": "https://api.github.com/users/burcgokden/events{/privacy}",
"received_events_url": "https://api.github.com/users/burcgokden/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40108/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40108/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40107 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40107/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40107/comments | https://api.github.com/repos/huggingface/transformers/issues/40107/events | https://github.com/huggingface/transformers/pull/40107 | 3,314,213,587 | PR_kwDOCUB6oc6jRAFP | 40,107 | Fix various Pylint warnings | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T13:14:16 | 2025-08-15T13:03:08 | 2025-08-15T12:40:12 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40107",
"html_url": "https://github.com/huggingface/transformers/pull/40107",
"diff_url": "https://github.com/huggingface/transformers/pull/40107.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40107.patch",
"merged_at": "2025-08-15T12:40:12"
} | # What does this PR do?
Fix various Pylint warnings about unused comments, bad indentation, and imports. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40107/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40107/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40106 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40106/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40106/comments | https://api.github.com/repos/huggingface/transformers/issues/40106/events | https://github.com/huggingface/transformers/pull/40106 | 3,314,160,480 | PR_kwDOCUB6oc6jQ0hF | 40,106 | Re-apply make style | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T13:01:53 | 2025-08-12T13:15:21 | 2025-08-12T13:02:17 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40106",
"html_url": "https://github.com/huggingface/transformers/pull/40106",
"diff_url": "https://github.com/huggingface/transformers/pull/40106.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40106.patch",
"merged_at": "2025-08-12T13:02:17"
} | # What does this PR do?
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40106/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40106/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40105 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40105/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40105/comments | https://api.github.com/repos/huggingface/transformers/issues/40105/events | https://github.com/huggingface/transformers/issues/40105 | 3,314,130,303 | I_kwDOCUB6oc7FiZ1_ | 40,105 | Docs: Add missing `eli5` dataset example in Language Modeling guide | {
"login": "sdivyanshu90",
"id": 97019230,
"node_id": "U_kgDOBchlXg",
"avatar_url": "https://avatars.githubusercontent.com/u/97019230?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sdivyanshu90",
"html_url": "https://github.com/sdivyanshu90",
"followers_url": "https://api.github.com/users/sdivyanshu90/followers",
"following_url": "https://api.github.com/users/sdivyanshu90/following{/other_user}",
"gists_url": "https://api.github.com/users/sdivyanshu90/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sdivyanshu90/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sdivyanshu90/subscriptions",
"organizations_url": "https://api.github.com/users/sdivyanshu90/orgs",
"repos_url": "https://api.github.com/users/sdivyanshu90/repos",
"events_url": "https://api.github.com/users/sdivyanshu90/events{/privacy}",
"received_events_url": "https://api.github.com/users/sdivyanshu90/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T12:54:50 | 2025-09-21T08:02:18 | 2025-09-21T08:02:18 | NONE | null | null | null | null | **Location in docs**
https://huggingface.co/docs/transformers/tasks/language_modeling#load-eli5-dataset
**Description**
In the "Language Modeling" guide, the example shows:
```python
from datasets import load_dataset
dataset = load_dataset("eli5_category", split="train") | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40105/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40105/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40104 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40104/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40104/comments | https://api.github.com/repos/huggingface/transformers/issues/40104/events | https://github.com/huggingface/transformers/issues/40104 | 3,313,719,587 | I_kwDOCUB6oc7Fg1kj | 40,104 | router_logits not getting populated on output_router_logits=True for GptOssForCausalLM | {
"login": "qrdlgit",
"id": 129564070,
"node_id": "U_kgDOB7j9pg",
"avatar_url": "https://avatars.githubusercontent.com/u/129564070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qrdlgit",
"html_url": "https://github.com/qrdlgit",
"followers_url": "https://api.github.com/users/qrdlgit/followers",
"following_url": "https://api.github.com/users/qrdlgit/following{/other_user}",
"gists_url": "https://api.github.com/users/qrdlgit/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qrdlgit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qrdlgit/subscriptions",
"organizations_url": "https://api.github.com/users/qrdlgit/orgs",
"repos_url": "https://api.github.com/users/qrdlgit/repos",
"events_url": "https://api.github.com/users/qrdlgit/events{/privacy}",
"received_events_url": "https://api.github.com/users/qrdlgit/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-12T11:02:43 | 2025-10-15T08:03:27 | 2025-10-15T08:03:27 | NONE | null | null | null | null | ### System Info
On Kaggle: https://www.kaggle.com/code/kaggleqrdl/router-logits-visualization?scriptVersionId=255623848 (see https://cookbook.openai.com/articles/gpt-oss/run-transformers and https://github.com/huggingface/transformers/pull/39940)
- `transformers` version: 4.56.0.dev0
- Platform: Linux-6.6.56+-x86_64-with-glibc2.35
- Python version: 3.11.13
- Huggingface_hub version: 0.34.4
- Safetensors version: 0.5.3
- Accelerate version: 1.8.1
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.8.0+cu128 (CUDA)
- Tensorflow version (GPU?): 2.18.0 (True)
- Flax version (CPU?/GPU?/TPU?): 0.10.6 (gpu)
- Jax version: 0.5.2
- JaxLib version: 0.5.1
- Using distributed or parallel set-up in script?: yes
- Using GPU in script?: yes
- GPU type: Tesla T4
### Who can help?
@SunMarc @MekkCyber (tagging as they are in the kernels PR for this model)
### Information
- [x] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
When I pass `output_router_logits=True` to `out = model(...)`, `out.router_logits` is `None`. I believe the issue is a combination of having the kernels package installed and in use, and https://github.com/huggingface/transformers/blob/913c0a8c334afcbb42d8ee74567ff9ed0344b11d/src/transformers/models/gpt_oss/modeling_gpt_oss.py#L394C67-L394C68
Any brief comments on the underlying technical solution to this problem would be greatly appreciated! I have a workaround of sorts in https://www.kaggle.com/code/kaggleqrdl/router-logits-visualization?scriptVersionId=255623848, and comments would be helpful given the tight timeline. Also note the code at https://github.com/huggingface/transformers/blob/913c0a8c334afcbb42d8ee74567ff9ed0344b11d/src/transformers/models/gpt_oss/modeling_gpt_oss.py#L685
I nulled that out since we're not training for this task.
### Expected behavior
`out.router_logits` should be populated via the `OutputRecorder` when the kernels package is used | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40104/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40104/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40103 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40103/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40103/comments | https://api.github.com/repos/huggingface/transformers/issues/40103/events | https://github.com/huggingface/transformers/pull/40103 | 3,313,672,804 | PR_kwDOCUB6oc6jPKOu | 40,103 | [Cohere2Vision] remove unused arg | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T10:49:37 | 2025-08-14T09:10:25 | 2025-08-14T09:10:25 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40103",
"html_url": "https://github.com/huggingface/transformers/pull/40103",
"diff_url": "https://github.com/huggingface/transformers/pull/40103.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40103.patch",
"merged_at": "2025-08-14T09:10:25"
} | # What does this PR do?
Cohere2 Vision doesn't use image patches anymore, but the arg was not removed. The image processor returns `num_patches`, which is always popped during processing.
We need to remove unused args to make sure the vLLM backend works; otherwise an error is raised about missing required args.
cc @hmellor
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40103/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40103/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40102 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40102/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40102/comments | https://api.github.com/repos/huggingface/transformers/issues/40102/events | https://github.com/huggingface/transformers/pull/40102 | 3,313,589,993 | PR_kwDOCUB6oc6jO3_w | 40,102 | 🌐 [i18n-KO] Translated `auto_docstring.md` to Korean | {
"login": "chelsseeey",
"id": 152389483,
"node_id": "U_kgDOCRVHaw",
"avatar_url": "https://avatars.githubusercontent.com/u/152389483?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chelsseeey",
"html_url": "https://github.com/chelsseeey",
"followers_url": "https://api.github.com/users/chelsseeey/followers",
"following_url": "https://api.github.com/users/chelsseeey/following{/other_user}",
"gists_url": "https://api.github.com/users/chelsseeey/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chelsseeey/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chelsseeey/subscriptions",
"organizations_url": "https://api.github.com/users/chelsseeey/orgs",
"repos_url": "https://api.github.com/users/chelsseeey/repos",
"events_url": "https://api.github.com/users/chelsseeey/events{/privacy}",
"received_events_url": "https://api.github.com/users/chelsseeey/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-12T10:25:08 | 2025-08-12T16:43:18 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40102",
"html_url": "https://github.com/huggingface/transformers/pull/40102",
"diff_url": "https://github.com/huggingface/transformers/pull/40102.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40102.patch",
"merged_at": null
} | # What does this PR do?
Translated the `auto_docstring.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [X] Check for missing / redundant translations (번역 누락/중복 검사)
- [X] Grammar Check (맞춤법 검사)
- [X] Review or Add new terms to glossary (용어 확인 및 추가)
- [X] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [X] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
May you please review this PR?
@jungnerd, @luckyvickyricky, @chelsseeey, @skwh54, @amo33, @maximizemaxwell, @D15M4S
<!-- @harheem, @nsbg, @Youngdong2, @xhaktm00, @ssunbear, @ChoHyoungSeo, @judy-choi -->
<!-- @4N3MONE, @Kim-Ju-won, @ahnjj, @FacerAin, @ssum21, @TaskerJang, @HyunZ118 -->
<!-- @yijun-lee, @songi104, @chhaewxn, @AhnJoonSung, @jihyun-0611, @seopp, @pyapyapya -->
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. KREW 팀원들의 리뷰가 끝난 후에 아래 주석을 노출해주세요! -->
<!-- @stevhliu May you please review this PR? --> | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40102/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40102/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40101 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40101/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40101/comments | https://api.github.com/repos/huggingface/transformers/issues/40101/events | https://github.com/huggingface/transformers/issues/40101 | 3,313,491,350 | I_kwDOCUB6oc7Ff92W | 40,101 | [WIP] Add PLDR-LLM | {
"login": "burcgokden",
"id": 50996809,
"node_id": "MDQ6VXNlcjUwOTk2ODA5",
"avatar_url": "https://avatars.githubusercontent.com/u/50996809?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/burcgokden",
"html_url": "https://github.com/burcgokden",
"followers_url": "https://api.github.com/users/burcgokden/followers",
"following_url": "https://api.github.com/users/burcgokden/following{/other_user}",
"gists_url": "https://api.github.com/users/burcgokden/gists{/gist_id}",
"starred_url": "https://api.github.com/users/burcgokden/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/burcgokden/subscriptions",
"organizations_url": "https://api.github.com/users/burcgokden/orgs",
"repos_url": "https://api.github.com/users/burcgokden/repos",
"events_url": "https://api.github.com/users/burcgokden/events{/privacy}",
"received_events_url": "https://api.github.com/users/burcgokden/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-08-12T09:58:51 | 2025-09-01T13:34:59 | 2025-09-01T13:34:59 | NONE | null | null | null | null | ### Model description
[PLDR-LLM](https://huggingface.co/papers/2410.16703) (Large Language Model from Power Law Decoder Representations) is a new foundational model that leverages non-linear and linear transformations through [Power Law Graph Attention](https://huggingface.co/papers/2107.02039) mechanism to generate well-defined deductive and inductive outputs. While the inductive outputs provide next-token prediction output in the same way as LLMs with Scaled Dot-Product Attention, the deductive outputs provide ways to observe and regularize the attention mechanism. A notable characteristic of PLDR-LLM is that it can learn a singular rank-1 matrix which is [a generalizable tensor operator](https://huggingface.co/papers/2502.13502) that can replace its own deep neural net at inference.
### Open source status
- [x] The model implementation is available
- [x] The model weights are available
### Provide useful links for the implementation
**original implementation:**
- https://github.com/burcgokden/PLDR-LLM-with-KVG-cache
**author:**
- @burcgokden
**model weights:**
- https://huggingface.co/fromthesky
**papers:**
- [PLDR-LLMs Learn A Generalizable Tensor Operator That Can Replace Its Own Deep Neural Net At Inference](https://huggingface.co/papers/2502.13502)
- [PLDR-LLM: Large Language Model from Power Law Decoder Representations](https://huggingface.co/papers/2410.16703)
- [Power Law Graph Transformer for Machine Translation and Representation Learning](https://huggingface.co/papers/2107.02039) | {
"login": "burcgokden",
"id": 50996809,
"node_id": "MDQ6VXNlcjUwOTk2ODA5",
"avatar_url": "https://avatars.githubusercontent.com/u/50996809?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/burcgokden",
"html_url": "https://github.com/burcgokden",
"followers_url": "https://api.github.com/users/burcgokden/followers",
"following_url": "https://api.github.com/users/burcgokden/following{/other_user}",
"gists_url": "https://api.github.com/users/burcgokden/gists{/gist_id}",
"starred_url": "https://api.github.com/users/burcgokden/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/burcgokden/subscriptions",
"organizations_url": "https://api.github.com/users/burcgokden/orgs",
"repos_url": "https://api.github.com/users/burcgokden/repos",
"events_url": "https://api.github.com/users/burcgokden/events{/privacy}",
"received_events_url": "https://api.github.com/users/burcgokden/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40101/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40101/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40100 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40100/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40100/comments | https://api.github.com/repos/huggingface/transformers/issues/40100/events | https://github.com/huggingface/transformers/pull/40100 | 3,313,483,066 | PR_kwDOCUB6oc6jOgev | 40,100 | Switch the order of args in StaticCache (for BC and future logic) | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T09:56:42 | 2025-08-12T13:30:47 | 2025-08-12T13:30:45 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40100",
"html_url": "https://github.com/huggingface/transformers/pull/40100",
"diff_url": "https://github.com/huggingface/transformers/pull/40100.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40100.patch",
"merged_at": "2025-08-12T13:30:45"
} | # What does this PR do?
As per the title. | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40100/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40100/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40099 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40099/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40099/comments | https://api.github.com/repos/huggingface/transformers/issues/40099/events | https://github.com/huggingface/transformers/issues/40099 | 3,313,480,201 | I_kwDOCUB6oc7Ff7IJ | 40,099 | QuantoQuantizedCache is not working | {
"login": "maxjeblick",
"id": 24281881,
"node_id": "MDQ6VXNlcjI0MjgxODgx",
"avatar_url": "https://avatars.githubusercontent.com/u/24281881?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/maxjeblick",
"html_url": "https://github.com/maxjeblick",
"followers_url": "https://api.github.com/users/maxjeblick/followers",
"following_url": "https://api.github.com/users/maxjeblick/following{/other_user}",
"gists_url": "https://api.github.com/users/maxjeblick/gists{/gist_id}",
"starred_url": "https://api.github.com/users/maxjeblick/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/maxjeblick/subscriptions",
"organizations_url": "https://api.github.com/users/maxjeblick/orgs",
"repos_url": "https://api.github.com/users/maxjeblick/repos",
"events_url": "https://api.github.com/users/maxjeblick/events{/privacy}",
"received_events_url": "https://api.github.com/users/maxjeblick/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-12T09:55:54 | 2025-08-13T10:23:01 | 2025-08-13T10:23:01 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.56.0.dev0
- Platform: Linux-6.14.0-27-generic-x86_64-with-glibc2.39
- Python version: 3.12.3
- Huggingface_hub version: 0.34.3
- Safetensors version: 0.6.1
- Accelerate version: 1.9.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1+cu126 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: Yes
- GPU type: NVIDIA RTX A6000
### Who can help?
@manueldeprada @SunMarc @MekkCyber
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
I'm using the same script as in the `QuantoQuantizedCache` docstring:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM, QuantoQuantizedCache
model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2-0.5B-Instruct")
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2-0.5B-Instruct")
inputs = tokenizer(text="My name is Qwen2", return_tensors="pt")
past_key_values = QuantoQuantizedCache(config=model.config, nbits=4)
outputs = model(**inputs, past_key_values=past_key_values, use_cache=True)
outputs.past_key_values
```
which yields
```
Traceback (most recent call last):
File "/home/mjeblick/.config/JetBrains/PyCharm2025.2/scratches/scratch_41.py", line 8, in <module>
past_key_values = QuantoQuantizedCache(config=model.config, nbits=4)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/2tb/PyCharmProjects/kvpress/.venv/lib/python3.12/site-packages/transformers/cache_utils.py", line 1305, in __init__
super().__init__("quanto", config, nbits, axis_key, axis_value, q_group_size, residual_length)
File "/mnt/2tb/PyCharmProjects/kvpress/.venv/lib/python3.12/site-packages/transformers/cache_utils.py", line 1257, in __init__
layer_class(nbits, axis_key, axis_value, q_group_size, residual_length)
File "/mnt/2tb/PyCharmProjects/kvpress/.venv/lib/python3.12/site-packages/transformers/cache_utils.py", line 587, in __init__
super().__init__(
File "/mnt/2tb/PyCharmProjects/kvpress/.venv/lib/python3.12/site-packages/transformers/cache_utils.py", line 515, in __init__
super().__init__(self)
TypeError: CacheLayerMixin.__init__() takes 1 positional argument but 2 were given
```
(I'm using optimum_quanto-0.2.7; I don't think the version matters here, though, as it seems to be an inheritance issue.)
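For illustration, the failing pattern from the traceback can be reproduced in isolation with hypothetical minimal classes (the `CacheLayerMixin` below is a stand-in, not the real transformers class): a subclass that passes `self` explicitly to `super().__init__` hands the parent two positional arguments, the implicit `self` plus the explicit one:

```python
class CacheLayerMixin:
    def __init__(self):  # accepts only the implicit self
        self.keys = None


class QuantizedLayer(CacheLayerMixin):
    def __init__(self, nbits):
        # Buggy call: `self` is passed explicitly in addition to the
        # implicit one, so the parent __init__ sees 2 positional arguments.
        super().__init__(self)


try:
    QuantizedLayer(nbits=4)
except TypeError as e:
    print(e)  # ...__init__() takes 1 positional argument but 2 were given
```

The fix on the transformers side is presumably just calling `super().__init__()` without the extra `self`.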
### Expected behavior
`QuantoQuantizedCache` works as expected. | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40099/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40099/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40098 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40098/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40098/comments | https://api.github.com/repos/huggingface/transformers/issues/40098/events | https://github.com/huggingface/transformers/issues/40098 | 3,313,400,953 | I_kwDOCUB6oc7Ffnx5 | 40,098 | Different outputs between using input_embeds and using input_ids | {
"login": "HCRTY",
"id": 70944180,
"node_id": "MDQ6VXNlcjcwOTQ0MTgw",
"avatar_url": "https://avatars.githubusercontent.com/u/70944180?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HCRTY",
"html_url": "https://github.com/HCRTY",
"followers_url": "https://api.github.com/users/HCRTY/followers",
"following_url": "https://api.github.com/users/HCRTY/following{/other_user}",
"gists_url": "https://api.github.com/users/HCRTY/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HCRTY/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HCRTY/subscriptions",
"organizations_url": "https://api.github.com/users/HCRTY/orgs",
"repos_url": "https://api.github.com/users/HCRTY/repos",
"events_url": "https://api.github.com/users/HCRTY/events{/privacy}",
"received_events_url": "https://api.github.com/users/HCRTY/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-12T09:36:15 | 2025-10-24T08:02:44 | null | NONE | null | null | null | null | I trained qwen2.5 on a TTS task. When I do inference, I tried two different input types: input_ids and input_embeds. However, I got different outputs from the same model. I wonder whether this is expected? | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40098/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40098/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/40097 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40097/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40097/comments | https://api.github.com/repos/huggingface/transformers/issues/40097/events | https://github.com/huggingface/transformers/issues/40097 | 3,313,387,443 | I_kwDOCUB6oc7Ffkez | 40,097 | Wrong function signature introduced in PR #40029 breaking flash attention | {
"login": "maxjeblick",
"id": 24281881,
"node_id": "MDQ6VXNlcjI0MjgxODgx",
"avatar_url": "https://avatars.githubusercontent.com/u/24281881?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/maxjeblick",
"html_url": "https://github.com/maxjeblick",
"followers_url": "https://api.github.com/users/maxjeblick/followers",
"following_url": "https://api.github.com/users/maxjeblick/following{/other_user}",
"gists_url": "https://api.github.com/users/maxjeblick/gists{/gist_id}",
"starred_url": "https://api.github.com/users/maxjeblick/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/maxjeblick/subscriptions",
"organizations_url": "https://api.github.com/users/maxjeblick/orgs",
"repos_url": "https://api.github.com/users/maxjeblick/repos",
"events_url": "https://api.github.com/users/maxjeblick/events{/privacy}",
"received_events_url": "https://api.github.com/users/maxjeblick/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-12T09:32:55 | 2025-08-12T15:05:32 | 2025-08-12T15:05:32 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.56.0.dev0
- Platform: Linux-6.14.0-27-generic-x86_64-with-glibc2.39
- Python version: 3.12.3
- Huggingface_hub version: 0.34.3
- Safetensors version: 0.6.1
- Accelerate version: 1.9.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1+cu126 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: Yes, with flash-attention
- GPU type: NVIDIA RTX A6000
### Who can help?
@zucchini-nlp
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
`prepare_fa_kwargs_from_position_ids` returns `(cu_seq_lens_q, cu_seq_lens_k), (max_length_q, max_length_k)` (a tuple of length 2), but `_prepare_from_posids` expects this to be a flat `cu_seq_lens_q, cu_seq_lens_k, max_length_q, max_length_k`.
The bug may have been introduced in #40029.
<img width="2179" height="1850" alt="Image" src="https://github.com/user-attachments/assets/6e464691-0de5-4ff2-a2a9-4595d0fa9513" />
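For illustration, the unpacking mismatch can be reproduced in isolation (the stand-in function below only mimics the return shape; the real helper returns tensors):

```python
# Stand-in mimicking the return shape of prepare_fa_kwargs_from_position_ids:
# a 2-tuple of 2-tuples, ((cu_seq_lens_q, cu_seq_lens_k), (max_length_q, max_length_k)).
def prepare_fa_kwargs():
    return ([0, 3], [0, 3]), (3, 3)


# Matching the nested shape works:
(cu_q, cu_k), (max_q, max_k) = prepare_fa_kwargs()

# A caller expecting a flat 4-tuple, as _prepare_from_posids apparently does,
# fails at unpack time:
try:
    cu_q, cu_k, max_q, max_k = prepare_fa_kwargs()
except ValueError as e:
    print(e)  # not enough values to unpack (expected 4, got 2)
```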
### Expected behavior
No error thrown | {
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/followers",
"following_url": "https://api.github.com/users/vasqu/following{/other_user}",
"gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vasqu/subscriptions",
"organizations_url": "https://api.github.com/users/vasqu/orgs",
"repos_url": "https://api.github.com/users/vasqu/repos",
"events_url": "https://api.github.com/users/vasqu/events{/privacy}",
"received_events_url": "https://api.github.com/users/vasqu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40097/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40097/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40096 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40096/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40096/comments | https://api.github.com/repos/huggingface/transformers/issues/40096/events | https://github.com/huggingface/transformers/pull/40096 | 3,313,345,386 | PR_kwDOCUB6oc6jOCVY | 40,096 | [Update] Zero Shot Object Detection Task | {
"login": "ariG23498",
"id": 36856589,
"node_id": "MDQ6VXNlcjM2ODU2NTg5",
"avatar_url": "https://avatars.githubusercontent.com/u/36856589?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ariG23498",
"html_url": "https://github.com/ariG23498",
"followers_url": "https://api.github.com/users/ariG23498/followers",
"following_url": "https://api.github.com/users/ariG23498/following{/other_user}",
"gists_url": "https://api.github.com/users/ariG23498/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ariG23498/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ariG23498/subscriptions",
"organizations_url": "https://api.github.com/users/ariG23498/orgs",
"repos_url": "https://api.github.com/users/ariG23498/repos",
"events_url": "https://api.github.com/users/ariG23498/events{/privacy}",
"received_events_url": "https://api.github.com/users/ariG23498/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T09:22:05 | 2025-08-12T10:43:38 | 2025-08-12T10:43:38 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40096",
"html_url": "https://github.com/huggingface/transformers/pull/40096",
"diff_url": "https://github.com/huggingface/transformers/pull/40096.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40096.patch",
"merged_at": "2025-08-12T10:43:38"
} | CC: @qubvel | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40096/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40096/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40095 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40095/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40095/comments | https://api.github.com/repos/huggingface/transformers/issues/40095/events | https://github.com/huggingface/transformers/pull/40095 | 3,313,133,495 | PR_kwDOCUB6oc6jNWhq | 40,095 | Add glm4.5&&glm4.5V doc | {
"login": "lambertwjh",
"id": 148857096,
"node_id": "U_kgDOCN9hCA",
"avatar_url": "https://avatars.githubusercontent.com/u/148857096?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lambertwjh",
"html_url": "https://github.com/lambertwjh",
"followers_url": "https://api.github.com/users/lambertwjh/followers",
"following_url": "https://api.github.com/users/lambertwjh/following{/other_user}",
"gists_url": "https://api.github.com/users/lambertwjh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lambertwjh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lambertwjh/subscriptions",
"organizations_url": "https://api.github.com/users/lambertwjh/orgs",
"repos_url": "https://api.github.com/users/lambertwjh/repos",
"events_url": "https://api.github.com/users/lambertwjh/events{/privacy}",
"received_events_url": "https://api.github.com/users/lambertwjh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T08:29:04 | 2025-08-12T11:45:14 | 2025-08-12T11:44:53 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40095",
"html_url": "https://github.com/huggingface/transformers/pull/40095",
"diff_url": "https://github.com/huggingface/transformers/pull/40095.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40095.patch",
"merged_at": "2025-08-12T11:44:53"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40095/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40095/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40094 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40094/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40094/comments | https://api.github.com/repos/huggingface/transformers/issues/40094/events | https://github.com/huggingface/transformers/pull/40094 | 3,313,063,205 | PR_kwDOCUB6oc6jNH4F | 40,094 | Update: add type hints to check_tokenizers.py | {
"login": "ajeet214",
"id": 31981999,
"node_id": "MDQ6VXNlcjMxOTgxOTk5",
"avatar_url": "https://avatars.githubusercontent.com/u/31981999?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ajeet214",
"html_url": "https://github.com/ajeet214",
"followers_url": "https://api.github.com/users/ajeet214/followers",
"following_url": "https://api.github.com/users/ajeet214/following{/other_user}",
"gists_url": "https://api.github.com/users/ajeet214/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ajeet214/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ajeet214/subscriptions",
"organizations_url": "https://api.github.com/users/ajeet214/orgs",
"repos_url": "https://api.github.com/users/ajeet214/repos",
"events_url": "https://api.github.com/users/ajeet214/events{/privacy}",
"received_events_url": "https://api.github.com/users/ajeet214/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T08:09:56 | 2025-08-15T12:41:28 | 2025-08-15T12:41:28 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40094",
"html_url": "https://github.com/huggingface/transformers/pull/40094",
"diff_url": "https://github.com/huggingface/transformers/pull/40094.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40094.patch",
"merged_at": "2025-08-15T12:41:28"
} | chore(typing): add type hints to scripts/check_tokenizers.py and use PreTrainedTokenizerBase
# What does this PR do?
Adds lightweight type hints to `scripts/check_tokenizers.py`:
- Annotate params/returns for helper functions: `check_diff`, `check_LTR_mark`, `check_details`, `test_string`, `test_tokenizer`
- Use `transformers.tokenization_utils_base.PreTrainedTokenizerBase` for `slow`/`fast` instead of `Any`
- Make `check_LTR_mark` return `bool` explicitly (no behavior change)
This improves readability, IDE support, and static analysis (mypy/pyright) without changing runtime behavior.
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
- [x] No new tests are necessary (non-functional typing change).
- [x] This change does not require documentation updates.
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40094/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40094/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40093 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40093/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40093/comments | https://api.github.com/repos/huggingface/transformers/issues/40093/events | https://github.com/huggingface/transformers/pull/40093 | 3,312,452,275 | PR_kwDOCUB6oc6jLFol | 40,093 | fix(modeling_utils): correct initialization of missing and mismatched… | {
"login": "MengAiDev",
"id": 202287492,
"node_id": "U_kgDODA6phA",
"avatar_url": "https://avatars.githubusercontent.com/u/202287492?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MengAiDev",
"html_url": "https://github.com/MengAiDev",
"followers_url": "https://api.github.com/users/MengAiDev/followers",
"following_url": "https://api.github.com/users/MengAiDev/following{/other_user}",
"gists_url": "https://api.github.com/users/MengAiDev/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MengAiDev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MengAiDev/subscriptions",
"organizations_url": "https://api.github.com/users/MengAiDev/orgs",
"repos_url": "https://api.github.com/users/MengAiDev/repos",
"events_url": "https://api.github.com/users/MengAiDev/events{/privacy}",
"received_events_url": "https://api.github.com/users/MengAiDev/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T04:12:13 | 2025-09-06T02:46:00 | 2025-09-06T02:46:00 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40093",
"html_url": "https://github.com/huggingface/transformers/pull/40093",
"diff_url": "https://github.com/huggingface/transformers/pull/40093.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40093.patch",
"merged_at": null
} |
- Update the `_initialize_missing_keys` method to handle both missing and mismatched keys
- Ensure proper initialization of weights when loading pretrained models
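In spirit, the change treats mismatched checkpoint keys like missing ones when deciding which parameters need fresh initialization. A hypothetical sketch of that idea (the helper name and the tuple shape of `mismatched_keys` are illustrative assumptions, not the actual transformers code):

```python
def keys_needing_init(missing_keys, mismatched_keys):
    """Hypothetical helper: both missing and mismatched checkpoint keys
    must be (re)initialized rather than left with stale values."""
    # mismatched_keys may arrive as (key, loaded_shape, model_shape) tuples.
    mismatched_names = {
        entry[0] if isinstance(entry, tuple) else entry
        for entry in mismatched_keys
    }
    return set(missing_keys) | mismatched_names

print(sorted(keys_needing_init(
    ["lm_head.weight"],
    [("embed_tokens.weight", (100, 8), (120, 8))],
)))
```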
Fixes #40001
@ArthurZucker
| {
"login": "MengAiDev",
"id": 202287492,
"node_id": "U_kgDODA6phA",
"avatar_url": "https://avatars.githubusercontent.com/u/202287492?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MengAiDev",
"html_url": "https://github.com/MengAiDev",
"followers_url": "https://api.github.com/users/MengAiDev/followers",
"following_url": "https://api.github.com/users/MengAiDev/following{/other_user}",
"gists_url": "https://api.github.com/users/MengAiDev/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MengAiDev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MengAiDev/subscriptions",
"organizations_url": "https://api.github.com/users/MengAiDev/orgs",
"repos_url": "https://api.github.com/users/MengAiDev/repos",
"events_url": "https://api.github.com/users/MengAiDev/events{/privacy}",
"received_events_url": "https://api.github.com/users/MengAiDev/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40093/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40093/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40092 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40092/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40092/comments | https://api.github.com/repos/huggingface/transformers/issues/40092/events | https://github.com/huggingface/transformers/pull/40092 | 3,312,393,856 | PR_kwDOCUB6oc6jK5N8 | 40,092 | Optimize LlamaAttention by fusing QKV projections | {
"login": "null-pointer-access",
"id": 210762976,
"node_id": "U_kgDODI_84A",
"avatar_url": "https://avatars.githubusercontent.com/u/210762976?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/null-pointer-access",
"html_url": "https://github.com/null-pointer-access",
"followers_url": "https://api.github.com/users/null-pointer-access/followers",
"following_url": "https://api.github.com/users/null-pointer-access/following{/other_user}",
"gists_url": "https://api.github.com/users/null-pointer-access/gists{/gist_id}",
"starred_url": "https://api.github.com/users/null-pointer-access/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/null-pointer-access/subscriptions",
"organizations_url": "https://api.github.com/users/null-pointer-access/orgs",
"repos_url": "https://api.github.com/users/null-pointer-access/repos",
"events_url": "https://api.github.com/users/null-pointer-access/events{/privacy}",
"received_events_url": "https://api.github.com/users/null-pointer-access/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-12T03:34:26 | 2025-08-20T12:27:50 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40092",
"html_url": "https://github.com/huggingface/transformers/pull/40092",
"diff_url": "https://github.com/huggingface/transformers/pull/40092.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40092.patch",
"merged_at": null
} | # What does this PR do?
This PR fuses the q, k, and v projections in `LlamaAttention` into a single qkv projection, which can improve efficiency and GPU utilization when the batch size is small. Specifically, a single prefill with `seq_len=128` on 1xH100 shows a 28% efficiency improvement.
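The fused projection can be sketched roughly as follows (a minimal illustration of the idea, not the PR's actual diff; the module name and the sizes used here are placeholder assumptions):

```python
import torch
import torch.nn as nn

class FusedQKV(nn.Module):
    """Toy sketch: one matmul produces Q, K and V instead of three."""

    def __init__(self, hidden_size: int, num_heads: int, num_kv_heads: int, head_dim: int):
        super().__init__()
        self.q_size = num_heads * head_dim
        self.kv_size = num_kv_heads * head_dim
        # Single linear layer covering all three projections.
        self.qkv_proj = nn.Linear(hidden_size, self.q_size + 2 * self.kv_size, bias=False)

    def forward(self, hidden_states: torch.Tensor):
        qkv = self.qkv_proj(hidden_states)
        # Split the fused output back into the three projections.
        return qkv.split([self.q_size, self.kv_size, self.kv_size], dim=-1)

attn = FusedQKV(hidden_size=32, num_heads=4, num_kv_heads=2, head_dim=8)
q, k, v = attn(torch.randn(1, 5, 32))
print(q.shape, k.shape, v.shape)
```

The benefit comes from launching one larger GEMM instead of three small ones, which keeps the GPU better utilized at low batch sizes.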
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@ArthurZucker
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40092/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40092/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40091 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40091/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40091/comments | https://api.github.com/repos/huggingface/transformers/issues/40091/events | https://github.com/huggingface/transformers/pull/40091 | 3,312,335,041 | PR_kwDOCUB6oc6jKs3Y | 40,091 | Replace `logger.warning` with `logger.warning_once` in `GradientCheckpointingLayer` | {
"login": "qgallouedec",
"id": 45557362,
"node_id": "MDQ6VXNlcjQ1NTU3MzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/45557362?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qgallouedec",
"html_url": "https://github.com/qgallouedec",
"followers_url": "https://api.github.com/users/qgallouedec/followers",
"following_url": "https://api.github.com/users/qgallouedec/following{/other_user}",
"gists_url": "https://api.github.com/users/qgallouedec/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qgallouedec/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qgallouedec/subscriptions",
"organizations_url": "https://api.github.com/users/qgallouedec/orgs",
"repos_url": "https://api.github.com/users/qgallouedec/repos",
"events_url": "https://api.github.com/users/qgallouedec/events{/privacy}",
"received_events_url": "https://api.github.com/users/qgallouedec/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-12T03:01:59 | 2025-08-12T13:26:50 | 2025-08-12T13:26:47 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40091",
"html_url": "https://github.com/huggingface/transformers/pull/40091",
"diff_url": "https://github.com/huggingface/transformers/pull/40091.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40091.patch",
"merged_at": "2025-08-12T13:26:47"
} | You get thousands of these warnings during training otherwise | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40091/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40091/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40090 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40090/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40090/comments | https://api.github.com/repos/huggingface/transformers/issues/40090/events | https://github.com/huggingface/transformers/pull/40090 | 3,311,890,576 | PR_kwDOCUB6oc6jJQUy | 40,090 | Fix RuntimeError when loading quantized models with int8 weights (#39366) | {
"login": "akacmazz",
"id": 32853513,
"node_id": "MDQ6VXNlcjMyODUzNTEz",
"avatar_url": "https://avatars.githubusercontent.com/u/32853513?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/akacmazz",
"html_url": "https://github.com/akacmazz",
"followers_url": "https://api.github.com/users/akacmazz/followers",
"following_url": "https://api.github.com/users/akacmazz/following{/other_user}",
"gists_url": "https://api.github.com/users/akacmazz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/akacmazz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/akacmazz/subscriptions",
"organizations_url": "https://api.github.com/users/akacmazz/orgs",
"repos_url": "https://api.github.com/users/akacmazz/repos",
"events_url": "https://api.github.com/users/akacmazz/events{/privacy}",
"received_events_url": "https://api.github.com/users/akacmazz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-11T22:52:42 | 2025-08-29T03:08:03 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40090",
"html_url": "https://github.com/huggingface/transformers/pull/40090",
"diff_url": "https://github.com/huggingface/transformers/pull/40090.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40090.patch",
"merged_at": null
} | Skip weight initialization for int8/uint8 quantized weights in the `_init_weights` method.
`normal_()` only works on floating-point tensors, but quantized
models contain int8/uint8 weights that should preserve their loaded values.
Fixes #39366
- Add dtype check before calling normal_() on weights
- Skip initialization for int8/uint8 weights and biases
- Add debug logging when skipping quantized weights
- Add comprehensive tests for quantized weight handling
- Maintain backward compatibility with existing models
# What does this PR do?
Fixes a RuntimeError that occurs when loading llmcompressor W8A8 quantized models. The issue happens because the `_init_weights` method attempts to apply `normal_()` distribution to int8 tensors, which PyTorch doesn't support.
## Before & After
### Before (❌)
```python
model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
    "RedHatAI/Qwen2.5-VL-7B-Instruct-quantized.w8a8"
)
# RuntimeError: expected a floating-point or complex dtype, but got dtype=torch.int8
```
### After (✅)
```python
model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
    "RedHatAI/Qwen2.5-VL-7B-Instruct-quantized.w8a8"
)
# Model loads successfully
```
## Root Cause
The `PreTrainedModel._init_weights()` method in `modeling_utils.py` calls `module.weight.data.normal_()` on all linear and embedding layers. However, quantized models have int8/uint8 weights that:
1. Cannot use normal_() (PyTorch limitation)
2. Should preserve their quantized values anyway
3. Don't require re-initialization
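A guard along these lines makes the behavior concrete (a simplified sketch, not the exact patch; the real `_init_weights` handles many more module types):

```python
import torch
import torch.nn as nn

def init_weight_safely(module: nn.Module, std: float = 0.02) -> bool:
    """Apply normal_() init only to floating-point weights.

    Returns True if the weight was (re)initialized, False if it was
    skipped because it holds integer (e.g. quantized int8) data.
    """
    weight = getattr(module, "weight", None)
    if weight is None:
        return False
    if weight.dtype in (torch.int8, torch.uint8):
        # Quantized weights must keep their loaded values;
        # normal_() would also raise a RuntimeError on int tensors.
        return False
    weight.data.normal_(mean=0.0, std=std)
    return True

float_layer = nn.Linear(4, 4)
int8_layer = nn.Linear(4, 4)
int8_layer.weight = nn.Parameter(
    torch.zeros(4, 4, dtype=torch.int8), requires_grad=False
)

print(init_weight_safely(float_layer))  # True
print(init_weight_safely(int8_layer))   # False
```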
## Changes
- ✅ Add dtype checking before calling normal_()
- ✅ Skip initialization for int8/uint8 weights and biases
- ✅ Preserve quantized values as loaded from model files
- ✅ Add debug logging when skipping quantized layers
- ✅ Maintain full backward compatibility
## Testing
- ✅ Reproduced original error with unmodified code
- ✅ Verified fix works with real quantized model
- ✅ Confirmed 196 quantized layers load correctly
- ✅ Added comprehensive tests for both int8 and float32 scenarios
- ✅ Validated backward compatibility with existing models
## Impact
This fix enables loading of:
- llmcompressor W8A8 quantized models
- Other int8/uint8 quantization formats
- Future compressed-tensors quantized models
Affects: all models inheriting from `PreTrainedModel` that carry int8/uint8 quantized weights
Benefits: llmcompressor and other int8/uint8 quantized checkpoints now load without errors
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40090/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40090/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40089 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40089/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40089/comments | https://api.github.com/repos/huggingface/transformers/issues/40089/events | https://github.com/huggingface/transformers/issues/40089 | 3,311,754,069 | I_kwDOCUB6oc7FZVtV | 40,089 | Could not import module 'AutoTokenizer'. Are this object's requirements defined correctly? | {
"login": "octavianBordeanu",
"id": 27905313,
"node_id": "MDQ6VXNlcjI3OTA1MzEz",
"avatar_url": "https://avatars.githubusercontent.com/u/27905313?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/octavianBordeanu",
"html_url": "https://github.com/octavianBordeanu",
"followers_url": "https://api.github.com/users/octavianBordeanu/followers",
"following_url": "https://api.github.com/users/octavianBordeanu/following{/other_user}",
"gists_url": "https://api.github.com/users/octavianBordeanu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/octavianBordeanu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/octavianBordeanu/subscriptions",
"organizations_url": "https://api.github.com/users/octavianBordeanu/orgs",
"repos_url": "https://api.github.com/users/octavianBordeanu/repos",
"events_url": "https://api.github.com/users/octavianBordeanu/events{/privacy}",
"received_events_url": "https://api.github.com/users/octavianBordeanu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-11T21:44:05 | 2025-09-08T03:09:11 | 2025-08-12T13:17:14 | NONE | null | null | null | null | ### System Info
- torch @ https://download.pytorch.org/whl/cu124/torch-2.6.0%2Bcu124-cp310-cp310-linux_x86_64.whl
- torchaudio @ https://download.pytorch.org/whl/cu124/torchaudio-2.6.0%2Bcu124-cp310-cp310-linux_x86_64.whl
- torchvision @ https://download.pytorch.org/whl/cu124/torchvision-0.21.0%2Bcu124-cp310-cp310-linux_x86_64.whl
- unsloth==2025.6.12
- unsloth_zoo==2025.6.8
- accelerate==1.8.1
- bitsandbytes==0.46.0
- pydantic==2.11.7
- pydantic_core==2.33.2
- tokenizers==0.21.2
- transformers==4.52.4
- treelite==4.4.1
- treescope==0.1.9
- triton==3.2.0
- trl==0.19.0
- xformers==0.0.29.post3
- sympy==1.13.1
- cut-cross-entropy==25.1.1
- Python 3.10.16
- NVIDIA A10G (CUDA Version: 12.5)
- Ubuntu 24.04.2 LTS
### Who can help?
@ArthurZucker @itazap
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
from transformers import AutoTokenizer
---------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
File ~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2045, in _LazyModule.__getattr__(self, name)
2044 try:
-> 2045 module = self._get_module(self._class_to_module[name])
2046 value = getattr(module, name)
File ~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2075, in _LazyModule._get_module(self, module_name)
2074 except Exception as e:
-> 2075 raise e
File ~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2073, in _LazyModule._get_module(self, module_name)
2072 try:
-> 2073 return importlib.import_module("." + module_name, self.__name__)
2074 except Exception as e:
File /usr/local/lib/python3.10/importlib/__init__.py:126, in import_module(name, package)
125 level += 1
--> 126 return _bootstrap._gcd_import(name[level:], package, level)
File <frozen importlib._bootstrap>:1050, in _gcd_import(name, package, level)
File <frozen importlib._bootstrap>:1027, in _find_and_load(name, import_)
File <frozen importlib._bootstrap>:992, in _find_and_load_unlocked(name, import_)
File <frozen importlib._bootstrap>:241, in _call_with_frames_removed(f, *args, **kwds)
File <frozen importlib._bootstrap>:1050, in _gcd_import(name, package, level)
File <frozen importlib._bootstrap>:1027, in _find_and_load(name, import_)
File <frozen importlib._bootstrap>:1004, in _find_and_load_unlocked(name, import_)
ModuleNotFoundError: No module named 'transformers.models.ipynb_checkpoints'
The above exception was the direct cause of the following exception:
ModuleNotFoundError Traceback (most recent call last)
File ~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2045, in _LazyModule.__getattr__(self, name)
2044 try:
-> 2045 module = self._get_module(self._class_to_module[name])
2046 value = getattr(module, name)
File ~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2075, in _LazyModule._get_module(self, module_name)
2074 except Exception as e:
-> 2075 raise e
File ~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2073, in _LazyModule._get_module(self, module_name)
2072 try:
-> 2073 return importlib.import_module("." + module_name, self.__name__)
2074 except Exception as e:
File /usr/local/lib/python3.10/importlib/__init__.py:126, in import_module(name, package)
125 level += 1
--> 126 return _bootstrap._gcd_import(name[level:], package, level)
File <frozen importlib._bootstrap>:1050, in _gcd_import(name, package, level)
File <frozen importlib._bootstrap>:1027, in _find_and_load(name, import_)
File <frozen importlib._bootstrap>:1006, in _find_and_load_unlocked(name, import_)
File <frozen importlib._bootstrap>:688, in _load_unlocked(spec)
File <frozen importlib._bootstrap_external>:883, in exec_module(self, module)
File <frozen importlib._bootstrap>:241, in _call_with_frames_removed(f, *args, **kwds)
File ~/.local/lib/python3.10/site-packages/transformers/models/encoder_decoder/configuration_encoder_decoder.py:20
19 from ...utils import logging
---> 20 from ..auto import AutoConfig
23 logger = logging.get_logger(__name__)
File ~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2048, in _LazyModule.__getattr__(self, name)
2047 except (ModuleNotFoundError, RuntimeError) as e:
-> 2048 raise ModuleNotFoundError(
2049 f"Could not import module '{name}'. Are this object's requirements defined correctly?"
2050 ) from e
2052 elif name in self._modules:
ModuleNotFoundError: Could not import module 'AutoConfig'. Are this object's requirements defined correctly?
The above exception was the direct cause of the following exception:
ModuleNotFoundError Traceback (most recent call last)
File ~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2045, in _LazyModule.__getattr__(self, name)
2044 try:
-> 2045 module = self._get_module(self._class_to_module[name])
2046 value = getattr(module, name)
File ~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2075, in _LazyModule._get_module(self, module_name)
2074 except Exception as e:
-> 2075 raise e
File ~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2073, in _LazyModule._get_module(self, module_name)
2072 try:
-> 2073 return importlib.import_module("." + module_name, self.__name__)
2074 except Exception as e:
File /usr/local/lib/python3.10/importlib/__init__.py:126, in import_module(name, package)
125 level += 1
--> 126 return _bootstrap._gcd_import(name[level:], package, level)
File <frozen importlib._bootstrap>:1050, in _gcd_import(name, package, level)
File <frozen importlib._bootstrap>:1027, in _find_and_load(name, import_)
File <frozen importlib._bootstrap>:1006, in _find_and_load_unlocked(name, import_)
File <frozen importlib._bootstrap>:688, in _load_unlocked(spec)
File <frozen importlib._bootstrap_external>:883, in exec_module(self, module)
File <frozen importlib._bootstrap>:241, in _call_with_frames_removed(f, *args, **kwds)
File [~/.local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:37](https://25rhl5xt9dz0f5sq.ml-c7564e33-277.cdpv2-pr.uf1v-9d9i.cloudera.site/lab/tree/project/.local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py#line=36)
29 from ...utils import (
30 cached_file,
31 extract_commit_hash,
(...)
35 logging,
36 )
---> 37 from ..encoder_decoder import EncoderDecoderConfig
38 from .auto_factory import _LazyAutoMapping
File [~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2048](https://25rhl5xt9dz0f5sq.ml-c7564e33-277.cdpv2-pr.uf1v-9d9i.cloudera.site/lab/tree/project/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py#line=2047), in _LazyModule.__getattr__(self, name)
2047 except (ModuleNotFoundError, RuntimeError) as e:
-> 2048 raise ModuleNotFoundError(
2049 f"Could not import module '{name}'. Are this object's requirements defined correctly?"
2050 ) from e
2052 elif name in self._modules:
ModuleNotFoundError: Could not import module 'EncoderDecoderConfig'. Are this object's requirements defined correctly?
The above exception was the direct cause of the following exception:
ModuleNotFoundError Traceback (most recent call last)
Cell In[1], line 1
----> 1 from transformers import AutoTokenizer, AutoModelForCausalLM
File [~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2048](https://25rhl5xt9dz0f5sq.ml-c7564e33-277.cdpv2-pr.uf1v-9d9i.cloudera.site/lab/tree/project/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py#line=2047), in _LazyModule.__getattr__(self, name)
2046 value = getattr(module, name)
2047 except (ModuleNotFoundError, RuntimeError) as e:
-> 2048 raise ModuleNotFoundError(
2049 f"Could not import module '{name}'. Are this object's requirements defined correctly?"
2050 ) from e
2052 elif name in self._modules:
2053 try:
ModuleNotFoundError: Could not import module 'AutoTokenizer'. Are this object's requirements defined correctly?
### Expected behavior
The same error arises even if transformers is upgraded to the latest version. Importing transformers by itself does not throw any errors. | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40089/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40089/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40088 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40088/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40088/comments | https://api.github.com/repos/huggingface/transformers/issues/40088/events | https://github.com/huggingface/transformers/issues/40088 | 3,311,611,705 | I_kwDOCUB6oc7FYy85 | 40,088 | Default behavior of llama tokenizers breaks text by removing spaces (round trip is not identity function) | {
"login": "keenanpepper",
"id": 10012545,
"node_id": "MDQ6VXNlcjEwMDEyNTQ1",
"avatar_url": "https://avatars.githubusercontent.com/u/10012545?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/keenanpepper",
"html_url": "https://github.com/keenanpepper",
"followers_url": "https://api.github.com/users/keenanpepper/followers",
"following_url": "https://api.github.com/users/keenanpepper/following{/other_user}",
"gists_url": "https://api.github.com/users/keenanpepper/gists{/gist_id}",
"starred_url": "https://api.github.com/users/keenanpepper/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/keenanpepper/subscriptions",
"organizations_url": "https://api.github.com/users/keenanpepper/orgs",
"repos_url": "https://api.github.com/users/keenanpepper/repos",
"events_url": "https://api.github.com/users/keenanpepper/events{/privacy}",
"received_events_url": "https://api.github.com/users/keenanpepper/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
},
{
"id": 9105758243,
"node_id": "LA_kwDOCUB6oc8AAAACHr7YIw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for_v5?",
"name": "for_v5?",
"color": "35BC94",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-08-11T20:44:15 | 2025-08-21T09:58:11 | 2025-08-13T17:33:42 | NONE | null | null | null | null | ### System Info
transformers version 4.55.0
Python version 3.11.13
Google Colab notebook
### Who can help?
@ArthurZucker and @itazap
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
example text that gets corrupted:
"the word 'social'"
code to reproduce:
```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-LLama-3.1-8B-Instruct")
tokenizer.decode(tokenizer.encode("the word 'social'"), skip_special_tokens=True)
```
The result of running the above code is "the word'social'" with a space removed. This is broken behavior.
I notice that the behavior is correct if the argument `clean_up_tokenization_spaces=False` is passed, but round-tripping should just work by default; it should not require an obscure parameter to fix.
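For reference, the dropped space comes from the cleanup pass applied after decoding. The standalone sketch below (an approximation of the heuristic in `PreTrainedTokenizerBase.clean_up_tokenization`, reproduced here only for illustration) shows why the quoted word loses its leading space:

```python
# Standalone sketch of the cleanup heuristic (assumption: this mirrors
# transformers' PreTrainedTokenizerBase.clean_up_tokenization).
def clean_up_tokenization(out_string: str) -> str:
    # Collapses spaces before punctuation and English contractions.
    return (
        out_string.replace(" .", ".")
        .replace(" ?", "?")
        .replace(" !", "!")
        .replace(" ,", ",")
        .replace(" ' ", "'")
        .replace(" n't", "n't")
        .replace(" 'm", "'m")
        .replace(" 's", "'s")
        .replace(" 've", "'ve")
        .replace(" 're", "'re")
    )

# "the word 'social'" contains " 's" (space + apostrophe + s), so the
# contraction rule eats the space:
print(clean_up_tokenization("the word 'social'"))  # the word'social'
```

The `" 's"` rule is meant to re-attach possessives like `word 's`, but it also fires on any single-quoted word starting with `s`, which is exactly the corruption reported above.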
### Expected behavior
The result of encoding and then decoding "the word 'social'" with the default settings should be that same string with no spaces removed. | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40088/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40088/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40087 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40087/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40087/comments | https://api.github.com/repos/huggingface/transformers/issues/40087/events | https://github.com/huggingface/transformers/pull/40087 | 3,311,112,521 | PR_kwDOCUB6oc6jGp2q | 40,087 | DOCS: Add missing space in SECURITY.md | {
"login": "shivaheidari",
"id": 68950798,
"node_id": "MDQ6VXNlcjY4OTUwNzk4",
"avatar_url": "https://avatars.githubusercontent.com/u/68950798?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/shivaheidari",
"html_url": "https://github.com/shivaheidari",
"followers_url": "https://api.github.com/users/shivaheidari/followers",
"following_url": "https://api.github.com/users/shivaheidari/following{/other_user}",
"gists_url": "https://api.github.com/users/shivaheidari/gists{/gist_id}",
"starred_url": "https://api.github.com/users/shivaheidari/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shivaheidari/subscriptions",
"organizations_url": "https://api.github.com/users/shivaheidari/orgs",
"repos_url": "https://api.github.com/users/shivaheidari/repos",
"events_url": "https://api.github.com/users/shivaheidari/events{/privacy}",
"received_events_url": "https://api.github.com/users/shivaheidari/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-11T18:09:36 | 2025-08-13T12:58:24 | 2025-08-13T12:57:37 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40087",
"html_url": "https://github.com/huggingface/transformers/pull/40087",
"diff_url": "https://github.com/huggingface/transformers/pull/40087.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40087.patch",
"merged_at": "2025-08-13T12:57:37"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40087/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40087/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40086 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40086/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40086/comments | https://api.github.com/repos/huggingface/transformers/issues/40086/events | https://github.com/huggingface/transformers/pull/40086 | 3,310,996,626 | PR_kwDOCUB6oc6jGRjo | 40,086 | add general hub test for Fast Image Processors in test_image_processing_utils | {
"login": "namgyu-youn",
"id": 152387005,
"node_id": "U_kgDOCRU9vQ",
"avatar_url": "https://avatars.githubusercontent.com/u/152387005?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/namgyu-youn",
"html_url": "https://github.com/namgyu-youn",
"followers_url": "https://api.github.com/users/namgyu-youn/followers",
"following_url": "https://api.github.com/users/namgyu-youn/following{/other_user}",
"gists_url": "https://api.github.com/users/namgyu-youn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/namgyu-youn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/namgyu-youn/subscriptions",
"organizations_url": "https://api.github.com/users/namgyu-youn/orgs",
"repos_url": "https://api.github.com/users/namgyu-youn/repos",
"events_url": "https://api.github.com/users/namgyu-youn/events{/privacy}",
"received_events_url": "https://api.github.com/users/namgyu-youn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-11T17:36:13 | 2025-09-11T14:33:51 | 2025-09-11T14:31:37 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40086",
"html_url": "https://github.com/huggingface/transformers/pull/40086",
"diff_url": "https://github.com/huggingface/transformers/pull/40086.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40086.patch",
"merged_at": "2025-09-11T14:31:37"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Although `ViTImageProcessorFast` was already implemented, it had no test module. To resolve that issue, this PR adds a test module for `ViTImageProcessorFast`.
Related Issue/PR: https://github.com/huggingface/transformers/issues/36978
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
cc @amyeroberts, @qubvel
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40086/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40086/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40085 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40085/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40085/comments | https://api.github.com/repos/huggingface/transformers/issues/40085/events | https://github.com/huggingface/transformers/pull/40085 | 3,310,971,647 | PR_kwDOCUB6oc6jGMR8 | 40,085 | `decoding_method` argument in generate | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-11T17:28:24 | 2025-08-13T12:45:51 | 2025-08-13T12:45:51 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40085",
"html_url": "https://github.com/huggingface/transformers/pull/40085",
"diff_url": "https://github.com/huggingface/transformers/pull/40085.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40085.patch",
"merged_at": "2025-08-13T12:45:51"
} | `custom_generate` already allows a Hub/local repository to override `generate()` with a custom implementation.
This PR extends the feature to also accept **custom decoding functions** directly.
When a function is provided, GenerationMixin.generate still handles all input preparation and configuration, and the function only needs to implement the decoding loop.
This enables moving built-in decoding methods such as DoLa and contrastive decoding to the Hub, and power users to easily plug in their own custom loops.
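To illustrate the division of labor, the callable only has to implement a loop over next-token choices. Everything below is a toy sketch, not the transformers API: `toy_model` and the function signature are invented stand-ins for illustration.

```python
# Toy sketch: the custom callable implements only the decoding loop;
# input preparation would already have been done by generate().
def toy_model(ids):
    # Deterministic stand-in for a forward pass: next = (last + 1) mod 5.
    return (ids[-1] + 1) % 5

def greedy_decode(model, ids, max_new_tokens, eos_id):
    # Minimal greedy loop: append tokens until EOS or the budget runs out.
    for _ in range(max_new_tokens):
        next_id = model(ids)
        ids.append(next_id)
        if next_id == eos_id:
            break
    return ids

print(greedy_decode(toy_model, [0], max_new_tokens=3, eos_id=4))  # [0, 1, 2, 3]
```

A real decoding callable would additionally receive the prepared `generation_config`, logits processors, and stopping criteria from `generate`, but the core responsibility is the same loop shape.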
[Docs preview of the new feature here](https://moon-ci-docs.huggingface.co/docs/transformers/pr_40085/en/generation_strategies#reusing-generate-s-preparation-steps-by-passing-a-callable) | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40085/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40085/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40084 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40084/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40084/comments | https://api.github.com/repos/huggingface/transformers/issues/40084/events | https://github.com/huggingface/transformers/pull/40084 | 3,310,809,384 | PR_kwDOCUB6oc6jFqv5 | 40,084 | remove sequence parallel in llama4 | {
"login": "3outeille",
"id": 47445085,
"node_id": "MDQ6VXNlcjQ3NDQ1MDg1",
"avatar_url": "https://avatars.githubusercontent.com/u/47445085?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/3outeille",
"html_url": "https://github.com/3outeille",
"followers_url": "https://api.github.com/users/3outeille/followers",
"following_url": "https://api.github.com/users/3outeille/following{/other_user}",
"gists_url": "https://api.github.com/users/3outeille/gists{/gist_id}",
"starred_url": "https://api.github.com/users/3outeille/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/3outeille/subscriptions",
"organizations_url": "https://api.github.com/users/3outeille/orgs",
"repos_url": "https://api.github.com/users/3outeille/repos",
"events_url": "https://api.github.com/users/3outeille/events{/privacy}",
"received_events_url": "https://api.github.com/users/3outeille/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-11T16:36:44 | 2025-08-12T22:12:48 | 2025-08-12T22:12:46 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40084",
"html_url": "https://github.com/huggingface/transformers/pull/40084",
"diff_url": "https://github.com/huggingface/transformers/pull/40084.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40084.patch",
"merged_at": "2025-08-12T22:12:45"
} | # What does this PR do?
This PR fixes issue #39835, where a `RuntimeError` was triggered by mixing a regular tensor with a `DTensor`. The root cause is `_prepare_output_fn` being forced to return a tensor instead of a `DTensor`:
```python
@staticmethod
def _prepare_output_fn(output_layouts, use_local_output, mod, outputs, device_mesh):
    outputs = outputs.redistribute(
        placements=(Replicate(),), async_op=True
    )  # maybe we have to replicate? because the next layer is not sharded
    return outputs.to_local()
```
An easy fix would be to simply return a `DTensor` every time. However, after some discussion, it is better to remove the use of `SequenceParallel` for now, as it slows down overall generation.
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "3outeille",
"id": 47445085,
"node_id": "MDQ6VXNlcjQ3NDQ1MDg1",
"avatar_url": "https://avatars.githubusercontent.com/u/47445085?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/3outeille",
"html_url": "https://github.com/3outeille",
"followers_url": "https://api.github.com/users/3outeille/followers",
"following_url": "https://api.github.com/users/3outeille/following{/other_user}",
"gists_url": "https://api.github.com/users/3outeille/gists{/gist_id}",
"starred_url": "https://api.github.com/users/3outeille/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/3outeille/subscriptions",
"organizations_url": "https://api.github.com/users/3outeille/orgs",
"repos_url": "https://api.github.com/users/3outeille/repos",
"events_url": "https://api.github.com/users/3outeille/events{/privacy}",
"received_events_url": "https://api.github.com/users/3outeille/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40084/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40084/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40083 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40083/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40083/comments | https://api.github.com/repos/huggingface/transformers/issues/40083/events | https://github.com/huggingface/transformers/pull/40083 | 3,310,773,169 | PR_kwDOCUB6oc6jFi5b | 40,083 | Fix regression in mllama vision encoder | {
"login": "Isotr0py",
"id": 41363108,
"node_id": "MDQ6VXNlcjQxMzYzMTA4",
"avatar_url": "https://avatars.githubusercontent.com/u/41363108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Isotr0py",
"html_url": "https://github.com/Isotr0py",
"followers_url": "https://api.github.com/users/Isotr0py/followers",
"following_url": "https://api.github.com/users/Isotr0py/following{/other_user}",
"gists_url": "https://api.github.com/users/Isotr0py/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Isotr0py/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Isotr0py/subscriptions",
"organizations_url": "https://api.github.com/users/Isotr0py/orgs",
"repos_url": "https://api.github.com/users/Isotr0py/repos",
"events_url": "https://api.github.com/users/Isotr0py/events{/privacy}",
"received_events_url": "https://api.github.com/users/Isotr0py/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-11T16:26:30 | 2025-08-12T13:52:19 | 2025-08-12T13:29:46 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40083",
"html_url": "https://github.com/huggingface/transformers/pull/40083",
"diff_url": "https://github.com/huggingface/transformers/pull/40083.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40083.patch",
"merged_at": "2025-08-12T13:29:46"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
- Related issue: https://github.com/vllm-project/vllm/issues/22559
- #39643 introduced a regression in mllama because of an incorrect vision encoder implementation
- This PR corrects the `all_intermediate_hidden_states` used in the vision encoder to fix it
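For context, the general shape of the intermediate-hidden-states collection this fix touches can be sketched as follows. This is an illustrative stand-in, not the actual mllama code: the function name `encoder_forward`, the plain-Python "layers", and the record-before-layer ordering are all assumptions made for the sketch.

```python
# Hypothetical sketch: collect the hidden state entering each encoder layer,
# then select the layers that downstream consumers ask for. If the recording
# order drifts (the class of regression described above), the selected indices
# return features from the wrong layers.

def encoder_forward(hidden_state, layers, intermediate_layers_indices):
    all_intermediate_hidden_states = []
    for layer in layers:
        # Record before running the layer, so index i means "input to layer i".
        all_intermediate_hidden_states.append(hidden_state)
        hidden_state = layer(hidden_state)
    selected = [all_intermediate_hidden_states[i] for i in intermediate_layers_indices]
    return hidden_state, selected
```

With three increment-by-one stand-in layers and indices `[0, 2]`, the selected states are the inputs to layers 0 and 2, while the final state has passed through all three layers.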
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
cc @ArthurZucker @itazap
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40083/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40083/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40082 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40082/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40082/comments | https://api.github.com/repos/huggingface/transformers/issues/40082/events | https://github.com/huggingface/transformers/pull/40082 | 3,310,501,582 | PR_kwDOCUB6oc6jEoHF | 40,082 | 🚨 Remove DoLa decoding strategy | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-11T15:13:49 | 2025-08-25T14:34:32 | 2025-08-25T14:33:27 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40082",
"html_url": "https://github.com/huggingface/transformers/pull/40082",
"diff_url": "https://github.com/huggingface/transformers/pull/40082.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40082.patch",
"merged_at": "2025-08-25T14:33:27"
} | Removes the Decoding by Contrasting Layers (DoLa) generation strategy from the codebase and directs users to the `transformers-community/dola` repository.
It has emitted a deprecation warning for a few releases; now `trust_remote_code=True` is required to run DoLa generation. | {
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40082/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40082/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40081 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40081/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40081/comments | https://api.github.com/repos/huggingface/transformers/issues/40081/events | https://github.com/huggingface/transformers/pull/40081 | 3,310,352,832 | PR_kwDOCUB6oc6jEIdt | 40,081 | Fix `time_spent ` in `notification_service.py`. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-11T14:34:28 | 2025-08-11T16:31:00 | 2025-08-11T16:30:58 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40081",
"html_url": "https://github.com/huggingface/transformers/pull/40081",
"diff_url": "https://github.com/huggingface/transformers/pull/40081.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40081.patch",
"merged_at": "2025-08-11T16:30:58"
} | # What does this PR do?
Related to #40037, which has a bug: the parsed duration could appear as
`'71.60s', '(0:01:11)', '====\n'`
or
`'in', '35.01s', '================\n'`. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40081/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40081/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40080 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40080/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40080/comments | https://api.github.com/repos/huggingface/transformers/issues/40080/events | https://github.com/huggingface/transformers/pull/40080 | 3,310,205,137 | PR_kwDOCUB6oc6jDqQ- | 40,080 | Collated reports | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-11T14:00:49 | 2025-08-13T12:48:17 | 2025-08-13T12:48:15 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40080",
"html_url": "https://github.com/huggingface/transformers/pull/40080",
"diff_url": "https://github.com/huggingface/transformers/pull/40080.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40080.patch",
"merged_at": "2025-08-13T12:48:15"
} | Generates a report of reports for easy access to all results of a CI run
```json
{
"gpu_name": "MI300",
"machine_type": "multi-gpu",
"commit_hash": "1234567",
"total_status_count": {
"passed": 99999,
"failed": 420,
"skipped": 69,
"error": 0,
"null": 0
},
"results": [
{
"model": "aimv2",
"results": [
{
"status": "passed",
"line": "tests/models/aimv2/test_modeling_aimv2.py::Aimv2VisionModelTest::test_can_load_with_global_device_set",
"count": 1
},
... snip ...
``` | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40080/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40080/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40079 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40079/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40079/comments | https://api.github.com/repos/huggingface/transformers/issues/40079/events | https://github.com/huggingface/transformers/issues/40079 | 3,310,151,202 | I_kwDOCUB6oc7FTOYi | 40,079 | TypeError in DogeDecoderLayer with MoE Configuration when using dropout() | {
"login": "LoserCheems",
"id": 124847097,
"node_id": "U_kgDOB3ED-Q",
"avatar_url": "https://avatars.githubusercontent.com/u/124847097?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LoserCheems",
"html_url": "https://github.com/LoserCheems",
"followers_url": "https://api.github.com/users/LoserCheems/followers",
"following_url": "https://api.github.com/users/LoserCheems/following{/other_user}",
"gists_url": "https://api.github.com/users/LoserCheems/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LoserCheems/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LoserCheems/subscriptions",
"organizations_url": "https://api.github.com/users/LoserCheems/orgs",
"repos_url": "https://api.github.com/users/LoserCheems/repos",
"events_url": "https://api.github.com/users/LoserCheems/events{/privacy}",
"received_events_url": "https://api.github.com/users/LoserCheems/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-11T13:48:00 | 2025-09-19T08:02:20 | 2025-09-19T08:02:20 | CONTRIBUTOR | null | null | null | null | ### System Info
Copy-and-paste the text below in your GitHub issue and FILL OUT the two last points.
- `transformers` version: 4.55.0
- Platform: Linux-6.6.87.2-microsoft-standard-WSL2-x86_64-with-glibc2.39
- Python version: 3.12.3
- Huggingface_hub version: 0.34.2
- Safetensors version: 0.5.3
- Accelerate version: 1.8.1
- Accelerate config: not found
- DeepSpeed version: 0.17.4
- PyTorch version (accelerator?): 2.8.0a0+5228986c39.nv25.05 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA GeForce RTX 4090
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
There is a TypeError in `DogeDecoderLayer` when using a Mixture of Experts (MoE) configuration. Specifically, when running the model with an MoE configuration, `model.generate()` fails with the following error:
```python
hidden_states = self.mlp(hidden_states)
if isinstance(hidden_states, tuple):
hidden_states, _ = hidden_states
# This handling is correct, but there's no similar handling in the first dropout call
```
```
TypeError: dropout(): argument 'input' (position 1) must be Tensor, not tuple
```
### Expected behavior
In the forward method of `DogeDecoderLayer`, the hidden states after MLP processing are not correctly handled before being passed to the dropout operation.
When configured in MoE mode, the MLP layer (specifically `DogeCDMoE`) returns a tuple containing `(hidden_states, router_logits)` rather than a single tensor. However, the subsequent dropout operation expects to receive a tensor, not a tuple.
The current code on L484-485 has partial handling, but it occurs after the MLP call rather than before the first dropout call (see the snippet above).
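The fix pattern can be sketched in isolation with plain-Python stand-ins. The names `moe_mlp`, `dropout`, and `decoder_layer_forward` below are illustrative, not the real `DogeCDMoE` or `torch.nn.functional.dropout`; the point is only the ordering of the unpacking relative to dropout.

```python
# Stand-in for an MoE MLP: returns (hidden_states, router_logits), like the
# DogeCDMoE path described above.
def moe_mlp(hidden_states):
    router_logits = [0.1, 0.9]  # placeholder for the router output
    return hidden_states, router_logits

# Stand-in for dropout: rejects tuples, mimicking torch's functional dropout.
def dropout(x):
    if isinstance(x, tuple):
        raise TypeError("dropout(): argument 'input' (position 1) must be Tensor, not tuple")
    return x  # identity, as in eval mode

def decoder_layer_forward(hidden_states):
    out = moe_mlp(hidden_states)
    # Fix: unpack the tuple BEFORE any call that expects a bare tensor.
    if isinstance(out, tuple):
        out, _router_logits = out
    return dropout(out)
```

Without the `isinstance` unpacking before `dropout`, this sketch raises exactly the TypeError reported above.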
This issue causes:
- The `model.generate()` method to fail when using MoE configuration
- Any inference with models using MoE configuration to fail | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40079/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/40079/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40078 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40078/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40078/comments | https://api.github.com/repos/huggingface/transformers/issues/40078/events | https://github.com/huggingface/transformers/pull/40078 | 3,310,049,897 | PR_kwDOCUB6oc6jDJ6J | 40,078 | Update notification service MI325 | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-11T13:23:48 | 2025-08-12T08:22:53 | 2025-08-12T08:22:52 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40078",
"html_url": "https://github.com/huggingface/transformers/pull/40078",
"diff_url": "https://github.com/huggingface/transformers/pull/40078.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40078.patch",
"merged_at": "2025-08-12T08:22:52"
} | null | {
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40078/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40078/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40077 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40077/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40077/comments | https://api.github.com/repos/huggingface/transformers/issues/40077/events | https://github.com/huggingface/transformers/pull/40077 | 3,309,920,872 | PR_kwDOCUB6oc6jCuHY | 40,077 | Fix repo consistency | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-11T12:55:40 | 2025-08-12T08:31:54 | 2025-08-11T13:26:22 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40077",
"html_url": "https://github.com/huggingface/transformers/pull/40077",
"diff_url": "https://github.com/huggingface/transformers/pull/40077.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40077.patch",
"merged_at": "2025-08-11T13:26:22"
} | # What does this PR do?
As per title, I merged a PR without rebasing on `main` | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40077/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40077/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40076 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40076/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40076/comments | https://api.github.com/repos/huggingface/transformers/issues/40076/events | https://github.com/huggingface/transformers/pull/40076 | 3,308,941,235 | PR_kwDOCUB6oc6i_g9q | 40,076 | rm pytorch-triton dependency | {
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-11T08:15:08 | 2025-10-07T14:41:34 | 2025-10-07T14:41:34 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40076",
"html_url": "https://github.com/huggingface/transformers/pull/40076",
"diff_url": "https://github.com/huggingface/transformers/pull/40076.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40076.patch",
"merged_at": null
} | # What does this PR do?
After the `triton == 3.4.0` release, the `pytorch-triton` dependency is no longer needed by the PyTorch team | {
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40076/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40076/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40075 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40075/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40075/comments | https://api.github.com/repos/huggingface/transformers/issues/40075/events | https://github.com/huggingface/transformers/pull/40075 | 3,308,472,315 | PR_kwDOCUB6oc6i-BFo | 40,075 | Skipping pytree registration in case fsdp is enabled | {
"login": "romitjain",
"id": 11757603,
"node_id": "MDQ6VXNlcjExNzU3NjAz",
"avatar_url": "https://avatars.githubusercontent.com/u/11757603?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/romitjain",
"html_url": "https://github.com/romitjain",
"followers_url": "https://api.github.com/users/romitjain/followers",
"following_url": "https://api.github.com/users/romitjain/following{/other_user}",
"gists_url": "https://api.github.com/users/romitjain/gists{/gist_id}",
"starred_url": "https://api.github.com/users/romitjain/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/romitjain/subscriptions",
"organizations_url": "https://api.github.com/users/romitjain/orgs",
"repos_url": "https://api.github.com/users/romitjain/repos",
"events_url": "https://api.github.com/users/romitjain/events{/privacy}",
"received_events_url": "https://api.github.com/users/romitjain/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-11T05:03:13 | 2025-08-20T08:46:27 | 2025-08-19T09:58:05 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40075",
"html_url": "https://github.com/huggingface/transformers/pull/40075",
"diff_url": "https://github.com/huggingface/transformers/pull/40075.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40075.patch",
"merged_at": "2025-08-19T09:58:05"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes #39795
Pytree registration for Dynamic KV Cache prevents FSDP from sharding the Q, K, V cache matrices correctly. This leads to a memory explosion during training (see the attached issue).
This PR fixes it by moving the pytree registration of the Dynamic KV Cache from `cache_utils` to `integrations.executorch`, where it is only executed when `torch.export` is called. This is an improvement over the previous version, where this registration happened globally.
As a side effect of this PR, I have also moved the `is_fsdp_enabled` helper function to a more suitable location -> `integrations.fsdp` (from `modeling_utils`)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. #39795
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@Cyrilvallez @SunMarc @gante | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40075/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40075/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40074 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40074/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40074/comments | https://api.github.com/repos/huggingface/transformers/issues/40074/events | https://github.com/huggingface/transformers/pull/40074 | 3,308,238,879 | PR_kwDOCUB6oc6i9T4c | 40,074 | Model card for NLLB | {
"login": "sahil-kabir",
"id": 66221472,
"node_id": "MDQ6VXNlcjY2MjIxNDcy",
"avatar_url": "https://avatars.githubusercontent.com/u/66221472?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sahil-kabir",
"html_url": "https://github.com/sahil-kabir",
"followers_url": "https://api.github.com/users/sahil-kabir/followers",
"following_url": "https://api.github.com/users/sahil-kabir/following{/other_user}",
"gists_url": "https://api.github.com/users/sahil-kabir/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sahil-kabir/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sahil-kabir/subscriptions",
"organizations_url": "https://api.github.com/users/sahil-kabir/orgs",
"repos_url": "https://api.github.com/users/sahil-kabir/repos",
"events_url": "https://api.github.com/users/sahil-kabir/events{/privacy}",
"received_events_url": "https://api.github.com/users/sahil-kabir/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-11T02:13:16 | 2025-08-31T16:23:56 | 2025-08-18T17:05:59 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40074",
"html_url": "https://github.com/huggingface/transformers/pull/40074",
"diff_url": "https://github.com/huggingface/transformers/pull/40074.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40074.patch",
"merged_at": "2025-08-18T17:05:59"
} | # What does this PR do?
<!-- Remove if not applicable -->
issue #336979 -> model card for nllb
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@stevhliu | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40074/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40074/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40073 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40073/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40073/comments | https://api.github.com/repos/huggingface/transformers/issues/40073/events | https://github.com/huggingface/transformers/issues/40073 | 3,308,154,628 | I_kwDOCUB6oc7FLm8E | 40,073 | gpt_oss inference activates *all* experts for every token | {
"login": "shivak",
"id": 22960,
"node_id": "MDQ6VXNlcjIyOTYw",
"avatar_url": "https://avatars.githubusercontent.com/u/22960?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/shivak",
"html_url": "https://github.com/shivak",
"followers_url": "https://api.github.com/users/shivak/followers",
"following_url": "https://api.github.com/users/shivak/following{/other_user}",
"gists_url": "https://api.github.com/users/shivak/gists{/gist_id}",
"starred_url": "https://api.github.com/users/shivak/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shivak/subscriptions",
"organizations_url": "https://api.github.com/users/shivak/orgs",
"repos_url": "https://api.github.com/users/shivak/repos",
"events_url": "https://api.github.com/users/shivak/events{/privacy}",
"received_events_url": "https://api.github.com/users/shivak/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-11T00:58:21 | 2025-09-19T08:02:22 | 2025-09-19T08:02:22 | NONE | null | null | null | null | ### System Info
N/A
### Who can help?
@SunMarc
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
N/A
### Expected behavior
OpenAI [describes](https://openai.com/index/introducing-gpt-oss/) the gpt-oss models as sparsely activated:
> Each model is a Transformer which leverages mixture-of-experts (MoE[2]) to reduce the number of active parameters needed to process input. gpt-oss-120b activates 5.1B parameters per token, while gpt-oss-20b activates 3.6B. The models have 117b and 21b total parameters respectively.
While reading this comment in the gpt-oss implementation, I noticed that "all" unfortunately doesn't refer to just the top k:
https://github.com/huggingface/transformers/blob/99c747539e07b2e141cb5f13961b72108e9dc864/src/transformers/models/gpt_oss/modeling_gpt_oss.py#L82
The implementation indeed involves the weights of *all* experts (not just top k) for every token during inference:
https://github.com/huggingface/transformers/blob/99c747539e07b2e141cb5f13961b72108e9dc864/src/transformers/models/gpt_oss/modeling_gpt_oss.py#L119
(Recall that gpt-oss-120b's [configuration](https://huggingface.co/openai/gpt-oss-120b/blob/main/config.json) has `num_experts_per_tok=4` and `num_local_experts=128`)
https://github.com/huggingface/transformers/blob/99c747539e07b2e141cb5f13961b72108e9dc864/src/transformers/models/gpt_oss/modeling_gpt_oss.py#L67-L71
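To illustrate the difference numerically, here is a hedged, toy-sized sketch (hypothetical shapes and names, not the actual modeling code) contrasting the dense path, where every expert's weights touch every token, with sparse top-k routing, where only k experts would need to run per token:

```python
import numpy as np

# Toy illustration of dense expert mixing vs. sparse top-k routing in a
# mixture-of-experts layer (hypothetical shapes, not the real model).
rng = np.random.default_rng(0)
tokens, hidden, n_experts, k = 4, 8, 16, 2

x = rng.standard_normal((tokens, hidden))                    # token activations
router_logits = rng.standard_normal((tokens, n_experts))     # router scores
expert_w = rng.standard_normal((n_experts, hidden, hidden))  # one matrix per expert

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Dense path: all experts are applied to all tokens, mixed with the full
# softmax over every expert (what the inference branch above does).
dense_probs = softmax(router_logits)
dense_out = np.einsum("te,ehd,td->th", dense_probs, expert_w, x)

# Sparse path: keep only the top-k router scores per token, renormalize,
# and (conceptually) never run the remaining n_experts - k experts at all.
topk_idx = np.argsort(router_logits, axis=-1)[:, -k:]
mask = np.zeros_like(router_logits)
np.put_along_axis(mask, topk_idx, 1.0, axis=-1)
sparse_probs = softmax(np.where(mask > 0, router_logits, -np.inf))
sparse_out = np.einsum("te,ehd,td->th", sparse_probs, expert_w, x)
```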
The training branch does route only to the top k experts, as in Mixtral's sparse MoE implementation (which was carried over to Qwen2 MoE and Qwen3 MoE).
https://github.com/huggingface/transformers/blob/99c747539e07b2e141cb5f13961b72108e9dc864/src/transformers/models/gpt_oss/modeling_gpt_oss.py#L97-L104 | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40073/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40073/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40072 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40072/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40072/comments | https://api.github.com/repos/huggingface/transformers/issues/40072/events | https://github.com/huggingface/transformers/pull/40072 | 3,307,817,641 | PR_kwDOCUB6oc6i8Lj1 | 40,072 | unpin `torchcodec==0.5.0` and use `torch 2.8` on daily CI | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-10T17:53:17 | 2025-08-10T20:27:40 | 2025-08-10T20:27:39 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40072",
"html_url": "https://github.com/huggingface/transformers/pull/40072",
"diff_url": "https://github.com/huggingface/transformers/pull/40072.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40072.patch",
"merged_at": "2025-08-10T20:27:39"
} | Reverts huggingface/transformers#40013 | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40072/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40072/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40071 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40071/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40071/comments | https://api.github.com/repos/huggingface/transformers/issues/40071/events | https://github.com/huggingface/transformers/issues/40071 | 3,307,779,128 | I_kwDOCUB6oc7FKLQ4 | 40,071 | Issue running model from ImageSegmentationPipeline | {
"login": "LuSrodri",
"id": 70177902,
"node_id": "MDQ6VXNlcjcwMTc3OTAy",
"avatar_url": "https://avatars.githubusercontent.com/u/70177902?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LuSrodri",
"html_url": "https://github.com/LuSrodri",
"followers_url": "https://api.github.com/users/LuSrodri/followers",
"following_url": "https://api.github.com/users/LuSrodri/following{/other_user}",
"gists_url": "https://api.github.com/users/LuSrodri/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LuSrodri/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LuSrodri/subscriptions",
"organizations_url": "https://api.github.com/users/LuSrodri/orgs",
"repos_url": "https://api.github.com/users/LuSrodri/repos",
"events_url": "https://api.github.com/users/LuSrodri/events{/privacy}",
"received_events_url": "https://api.github.com/users/LuSrodri/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-10T17:06:14 | 2025-09-19T08:02:24 | 2025-09-19T08:02:24 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.55.0
- Platform: Windows-10-10.0.26100-SP0
- Python version: 3.10.7
- Huggingface_hub version: 0.34.4
- Safetensors version: 0.5.3
- Accelerate version: not installed
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1+cpu (NA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
### Who can help?
@amyeroberts, @qubvel @Rocketknight1
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
I was trying to run the briaai/RMBG-2.0 model with the ImageSegmentationPipeline on Transformers 4.55.0.
Code to run:
```py
from transformers import pipeline
pipeline(
"image-segmentation",
model="briaai/RMBG-2.0",
trust_remote_code=True
)
```
Setup commands to run:
```shell
pip install -U transformers
pip install -U "huggingface_hub[cli]"
hf auth login --token $HF_TOKEN
py .\script.py
```
Error:
```shell
RuntimeError: This model introduces a custom pipeline without specifying its implementation.
```
### Expected behavior
Just running the selected model. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40071/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40071/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40070 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40070/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40070/comments | https://api.github.com/repos/huggingface/transformers/issues/40070/events | https://github.com/huggingface/transformers/issues/40070 | 3,307,563,945 | I_kwDOCUB6oc7FJWup | 40,070 | Transformer GGUF support philosophy / naive question | {
"login": "luke14free",
"id": 166602,
"node_id": "MDQ6VXNlcjE2NjYwMg==",
"avatar_url": "https://avatars.githubusercontent.com/u/166602?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/luke14free",
"html_url": "https://github.com/luke14free",
"followers_url": "https://api.github.com/users/luke14free/followers",
"following_url": "https://api.github.com/users/luke14free/following{/other_user}",
"gists_url": "https://api.github.com/users/luke14free/gists{/gist_id}",
"starred_url": "https://api.github.com/users/luke14free/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/luke14free/subscriptions",
"organizations_url": "https://api.github.com/users/luke14free/orgs",
"repos_url": "https://api.github.com/users/luke14free/repos",
"events_url": "https://api.github.com/users/luke14free/events{/privacy}",
"received_events_url": "https://api.github.com/users/luke14free/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-10T13:14:42 | 2025-09-19T08:02:25 | 2025-09-19T08:02:25 | NONE | null | null | null | null | Hey there, I am a huge user of both transformers and diffusers and really love the work of the teams at HF. However something is not entirely clear to me regarding the GGUF support by transformers.
GGUF's main idea is to be a format that allows running big models on machines with limited capabilities.
With this in mind, in diffusers (or in comfyui and llama.cpp) I can load gguf files natively and they mostly just work using less vram than the original model.
In transformers, however, while many models are supported in GGUF format, they are loaded as GGUF but then immediately dequantized back to fp16/32, which takes a lot of time and ultimately has the same VRAM requirements as the full model. So I don't understand why someone would ever want to do it (at that point you are probably better off loading the full model and not waiting for dequantization).
It feels like I am missing something very obvious here, so I wanted to ask for guidance to the community 🙏
It's probably a stupid question but thanks a lot for taking the time to answer me :) | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40070/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40070/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40069 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40069/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40069/comments | https://api.github.com/repos/huggingface/transformers/issues/40069/events | https://github.com/huggingface/transformers/pull/40069 | 3,307,441,804 | PR_kwDOCUB6oc6i7Qa4 | 40,069 | Remove _prepare_flash_attention_from_position_ids | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-10T10:39:18 | 2025-08-15T13:55:22 | 2025-08-15T12:35:03 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40069",
"html_url": "https://github.com/huggingface/transformers/pull/40069",
"diff_url": "https://github.com/huggingface/transformers/pull/40069.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40069.patch",
"merged_at": "2025-08-15T12:35:03"
} | # What does this PR do?
This function is deprecated and no uses of it can be found. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40069/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40069/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40068 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40068/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40068/comments | https://api.github.com/repos/huggingface/transformers/issues/40068/events | https://github.com/huggingface/transformers/pull/40068 | 3,307,437,188 | PR_kwDOCUB6oc6i7Pq4 | 40,068 | Add missing arguments to class constructors | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-10T10:33:03 | 2025-08-21T14:51:01 | 2025-08-21T10:22:39 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40068",
"html_url": "https://github.com/huggingface/transformers/pull/40068",
"diff_url": "https://github.com/huggingface/transformers/pull/40068.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40068.patch",
"merged_at": "2025-08-21T10:22:39"
} | # What does this PR do?
Add missing arguments to constructors.
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40068/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40068/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40067 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40067/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40067/comments | https://api.github.com/repos/huggingface/transformers/issues/40067/events | https://github.com/huggingface/transformers/issues/40067 | 3,307,352,711 | I_kwDOCUB6oc7FIjKH | 40,067 | [BUG] No umt5 config for GGUF. This is not supported configuration. | {
"login": "SlimRG",
"id": 39348033,
"node_id": "MDQ6VXNlcjM5MzQ4MDMz",
"avatar_url": "https://avatars.githubusercontent.com/u/39348033?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SlimRG",
"html_url": "https://github.com/SlimRG",
"followers_url": "https://api.github.com/users/SlimRG/followers",
"following_url": "https://api.github.com/users/SlimRG/following{/other_user}",
"gists_url": "https://api.github.com/users/SlimRG/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SlimRG/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SlimRG/subscriptions",
"organizations_url": "https://api.github.com/users/SlimRG/orgs",
"repos_url": "https://api.github.com/users/SlimRG/repos",
"events_url": "https://api.github.com/users/SlimRG/events{/privacy}",
"received_events_url": "https://api.github.com/users/SlimRG/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2392046359,
"node_id": "MDU6TGFiZWwyMzkyMDQ2MzU5",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20Second%20Issue",
"name": "Good Second Issue",
"color": "dd935a",
"default": false,
"description": "Issues that are more difficult to do than \"Good First\" issues - give it a try if you want!"
},
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-10T08:53:47 | 2025-09-17T09:15:56 | 2025-09-17T09:15:56 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.54.1
- Platform: Windows-11-10.0.26100-SP0
- Python version: 3.12.10
- Huggingface_hub version: 0.34.3
- Safetensors version: 0.5.3
- Accelerate version: 1.1.0
- Accelerate config: - compute_environment: LOCAL_MACHINE
- distributed_type: NO
- mixed_precision: bf16
- use_cpu: False
- debug: False
- num_processes: 1
- machine_rank: 0
- num_machines: 1
- gpu_ids: all
- rdzv_backend: static
- same_network: True
- main_training_function: main
- enable_cpu_affinity: True
- downcast_bf16: no
- tpu_use_cluster: False
- tpu_use_sudo: False
- tpu_env: []
- dynamo_config: {'dynamo_backend': 'INDUCTOR'}
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1+cu128 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA GeForce RTX 4090
### Who can help?
@ArthurZucker
@ArthurZucker and @itazap
@SunMarc @MekkCyber
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
```
# --- VAE ---
vae = AutoencoderKLWan.from_pretrained(
"Wan-AI/Wan2.1-VACE-14B-diffusers",
subfolder="vae",
torch_dtype=torch.float32,
)
# --- Text Encoder ---
text_encoder = UMT5EncoderModel.from_pretrained(
"city96/umt5-xxl-encoder-gguf",
gguf_file="umt5-xxl-encoder-Q8_0.gguf",
torch_dtype=torch.float16,
)
# --- Transformer ---
transformer = WanVACETransformer3DModel.from_single_file(
"https://huggingface.co/QuantStack/Wan2.1_T2V_14B_FusionX_VACE-GGUF/blob/main/Wan2.1_T2V_14B_FusionX_VACE-Q6_K.gguf",
quantization_config=GGUFQuantizationConfig(
compute_dtype=torch.float16
),
torch_dtype=torch.float16,
)
# --- Pipeline assembly ---
pipe = WanVACEPipeline.from_pretrained(
"Wan-AI/Wan2.1-VACE-14B-diffusers",
vae=vae,
text_encoder=text_encoder,
transformer=transformer,
torch_dtype=torch.float16
)
# --- Scheduler ---
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config, flow_shift=flow_shift)
```
### Expected behavior
[WARNING|configuration_utils.py:622] 2025-08-10 11:46:45,092 >> You are using a model of type t5 to instantiate a model of type umt5. This is not supported for all configurations of models and can yield errors.
I want to convert into umt5, not plain t5 | {
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40067/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40067/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40066 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40066/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40066/comments | https://api.github.com/repos/huggingface/transformers/issues/40066/events | https://github.com/huggingface/transformers/pull/40066 | 3,307,343,577 | PR_kwDOCUB6oc6i6_Lt | 40,066 | Change Qwen2RMSNorm to RMSNorm from PyTorch | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-10T08:43:09 | 2025-08-21T14:51:08 | 2025-08-21T09:58:36 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40066",
"html_url": "https://github.com/huggingface/transformers/pull/40066",
"diff_url": "https://github.com/huggingface/transformers/pull/40066.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40066.patch",
"merged_at": "2025-08-21T09:58:35"
} | # What does this PR do?
Use RMSNorm from PyTorch 2.3+.
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40066/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40066/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40065 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40065/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40065/comments | https://api.github.com/repos/huggingface/transformers/issues/40065/events | https://github.com/huggingface/transformers/pull/40065 | 3,307,288,295 | PR_kwDOCUB6oc6i60kj | 40,065 | Delay float32 upcast in ForCausalLMLoss after filtering ignore_index | {
"login": "starcatmeow",
"id": 19618963,
"node_id": "MDQ6VXNlcjE5NjE4OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/19618963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/starcatmeow",
"html_url": "https://github.com/starcatmeow",
"followers_url": "https://api.github.com/users/starcatmeow/followers",
"following_url": "https://api.github.com/users/starcatmeow/following{/other_user}",
"gists_url": "https://api.github.com/users/starcatmeow/gists{/gist_id}",
"starred_url": "https://api.github.com/users/starcatmeow/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/starcatmeow/subscriptions",
"organizations_url": "https://api.github.com/users/starcatmeow/orgs",
"repos_url": "https://api.github.com/users/starcatmeow/repos",
"events_url": "https://api.github.com/users/starcatmeow/events{/privacy}",
"received_events_url": "https://api.github.com/users/starcatmeow/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-10T07:45:02 | 2025-08-14T22:26:58 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40065",
"html_url": "https://github.com/huggingface/transformers/pull/40065",
"diff_url": "https://github.com/huggingface/transformers/pull/40065.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40065.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
This PR implements the optimization discussed in #38452, originally proposed by @harshit2997.
Thanks for the original suggestion and discussion.
- Move the float32 upcast in `ForCausalLMLoss` to after filtering out `ignore_index` labels.
- Ensures only relevant logits are upcasted, reducing VRAM usage without affecting correctness.
Fixes #38452
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. #38452
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
- @ArthurZucker
- @Rocketknight1
- @harshit2997
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40065/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40065/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40064 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40064/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40064/comments | https://api.github.com/repos/huggingface/transformers/issues/40064/events | https://github.com/huggingface/transformers/pull/40064 | 3,307,268,089 | PR_kwDOCUB6oc6i6xOE | 40,064 | 🌐 [i18n-KO] Translated `videomae.md` to Korean | {
"login": "jihyun-0611",
"id": 78160653,
"node_id": "MDQ6VXNlcjc4MTYwNjUz",
"avatar_url": "https://avatars.githubusercontent.com/u/78160653?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jihyun-0611",
"html_url": "https://github.com/jihyun-0611",
"followers_url": "https://api.github.com/users/jihyun-0611/followers",
"following_url": "https://api.github.com/users/jihyun-0611/following{/other_user}",
"gists_url": "https://api.github.com/users/jihyun-0611/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jihyun-0611/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jihyun-0611/subscriptions",
"organizations_url": "https://api.github.com/users/jihyun-0611/orgs",
"repos_url": "https://api.github.com/users/jihyun-0611/repos",
"events_url": "https://api.github.com/users/jihyun-0611/events{/privacy}",
"received_events_url": "https://api.github.com/users/jihyun-0611/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-10T07:18:34 | 2025-08-11T16:25:21 | null | NONE | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40064",
"html_url": "https://github.com/huggingface/transformers/pull/40064",
"diff_url": "https://github.com/huggingface/transformers/pull/40064.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40064.patch",
"merged_at": null
} | <!-- Please title this PR "🌐 [i18n-KO] Translated `videomae.md` to Korean" -->
# What does this PR do?
Translated the `videomae.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations
- [x] Grammar check
- [x] Review or add new terms to the glossary
- [x] Check inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check the live preview for gotchas
## Who can review? (Initial)
<!-- 1. Only after all the checks above are complete, reveal the comment below to request a review from the KREW team members! -->
May you please review this PR?
<!-- @jungnerd, @luckyvickyricky, @chelsseeey, @skwh54, @amo33, @maximizemaxwell, @D15M4S -->
<!-- @harheem, @nsbg, @Youngdong2, @xhaktm00, @ssunbear, @ChoHyoungSeo, @judy-choi -->
<!-- @4N3MONE, @Kim-Ju-won, @ahnjj, @FacerAin, @ssum21, @TaskerJang, @HyunZ118 -->
@yijun-lee, @songi104, @chhaewxn, @AhnJoonSung, @jihyun-0611, @seopp, @pyapyapya
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. After the KREW team members have finished reviewing, reveal the comment below! -->
<!-- @stevhliu May you please review this PR? --> | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40064/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40064/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40063 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40063/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40063/comments | https://api.github.com/repos/huggingface/transformers/issues/40063/events | https://github.com/huggingface/transformers/pull/40063 | 3,307,253,517 | PR_kwDOCUB6oc6i6upU | 40,063 | fix: move super().__init__ after vision_config init in Mistral3Config | {
"login": "starcatmeow",
"id": 19618963,
"node_id": "MDQ6VXNlcjE5NjE4OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/19618963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/starcatmeow",
"html_url": "https://github.com/starcatmeow",
"followers_url": "https://api.github.com/users/starcatmeow/followers",
"following_url": "https://api.github.com/users/starcatmeow/following{/other_user}",
"gists_url": "https://api.github.com/users/starcatmeow/gists{/gist_id}",
"starred_url": "https://api.github.com/users/starcatmeow/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/starcatmeow/subscriptions",
"organizations_url": "https://api.github.com/users/starcatmeow/orgs",
"repos_url": "https://api.github.com/users/starcatmeow/repos",
"events_url": "https://api.github.com/users/starcatmeow/events{/privacy}",
"received_events_url": "https://api.github.com/users/starcatmeow/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-10T06:57:39 | 2025-08-11T07:21:54 | 2025-08-11T07:21:54 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40063",
"html_url": "https://github.com/huggingface/transformers/pull/40063",
"diff_url": "https://github.com/huggingface/transformers/pull/40063.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40063.patch",
"merged_at": "2025-08-11T07:21:54"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes #40062
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. #40062
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
- @ArthurZucker
- @zucchini-nlp
- @Cyrilvallez
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40063/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40063/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40062 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40062/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40062/comments | https://api.github.com/repos/huggingface/transformers/issues/40062/events | https://github.com/huggingface/transformers/issues/40062 | 3,307,237,748 | I_kwDOCUB6oc7FIHF0 | 40,062 | [Mistral3] attn_implementation not applied to vision_tower.config in Mistral3Config due to init order | {
"login": "starcatmeow",
"id": 19618963,
"node_id": "MDQ6VXNlcjE5NjE4OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/19618963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/starcatmeow",
"html_url": "https://github.com/starcatmeow",
"followers_url": "https://api.github.com/users/starcatmeow/followers",
"following_url": "https://api.github.com/users/starcatmeow/following{/other_user}",
"gists_url": "https://api.github.com/users/starcatmeow/gists{/gist_id}",
"starred_url": "https://api.github.com/users/starcatmeow/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/starcatmeow/subscriptions",
"organizations_url": "https://api.github.com/users/starcatmeow/orgs",
"repos_url": "https://api.github.com/users/starcatmeow/repos",
"events_url": "https://api.github.com/users/starcatmeow/events{/privacy}",
"received_events_url": "https://api.github.com/users/starcatmeow/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-10T06:35:40 | 2025-08-11T07:21:55 | 2025-08-11T07:21:55 | CONTRIBUTOR | null | null | null | null | ### System Info
- `transformers` version: 4.55.0
- Platform: Linux-6.8.0-60-generic-x86_64-with-glibc2.35
- Python version: 3.13.6
- Huggingface_hub version: 0.34.3
- Safetensors version: 0.6.1
- Accelerate version: 1.9.0
- Accelerate config: not found
- DeepSpeed version: 0.17.4
- PyTorch version (accelerator?): 2.7.1+cu126 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: No
- Using GPU in script?: Yes
- GPU type: NVIDIA H100 80GB HBM3
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```python
from transformers import AutoModelForImageTextToText
model = AutoModelForImageTextToText.from_pretrained(
"mistralai/Mistral-Small-3.1-24B-Instruct-2503",
torch_dtype="bfloat16",
attn_implementation="flash_attention_2",
)
print(model.config._attn_implementation) # 'flash_attention_2'
print(model.vision_tower.config._attn_implementation) # 'sdpa'
```
### Expected behavior
Both `model.config._attn_implementation` and `model.vision_tower.config._attn_implementation` should match the passed `attn_implementation` argument.
### Cause
In `Mistral3Config`, `super().__init__` is called before `self.vision_config` is initialized.
https://github.com/huggingface/transformers/blob/f4d57f2f0cdff0f63ee74a1f16f442dfaf525231/src/transformers/models/mistral3/configuration_mistral3.py#L88
The `super().__init__` call triggers the `_attn_implementation` setter, which attempts to update the vision config — but at this point vision_config does not exist yet, so the update is skipped.
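The ordering pitfall can be reproduced with a minimal, self-contained sketch (hypothetical classes, not the actual transformers code): a setter that propagates a value to a sub-config silently does nothing when the sub-config has not been created yet.

```python
class BaseConfig:
    """Mimics a config whose attn-implementation setter propagates to sub-configs."""

    def __init__(self, attn_implementation="sdpa"):
        self._attn_implementation = attn_implementation  # triggers the setter below

    @property
    def _attn_implementation(self):
        return self._attn_impl

    @_attn_implementation.setter
    def _attn_implementation(self, value):
        self._attn_impl = value
        # Propagation is silently skipped if the sub-config does not exist yet.
        if hasattr(self, "vision_config"):
            self.vision_config._attn_implementation = value


class VisionConfig:
    def __init__(self):
        self._attn_implementation = "sdpa"


class BuggyComposite(BaseConfig):
    def __init__(self, attn_implementation="sdpa"):
        super().__init__(attn_implementation)  # setter fires, vision_config missing
        self.vision_config = VisionConfig()    # created too late -> stays 'sdpa'


class FixedComposite(BaseConfig):
    def __init__(self, attn_implementation="sdpa"):
        self.vision_config = VisionConfig()    # create the sub-config first
        super().__init__(attn_implementation)  # setter now propagates


buggy = BuggyComposite("flash_attention_2")
fixed = FixedComposite("flash_attention_2")
print(buggy.vision_config._attn_implementation)  # 'sdpa'  (the reported bug)
print(fixed.vision_config._attn_implementation)  # 'flash_attention_2'
```

Reordering the two statements in `__init__` (or re-applying the setting after the sub-config exists) would make the propagation take effect.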
https://github.com/huggingface/transformers/blob/f4d57f2f0cdff0f63ee74a1f16f442dfaf525231/src/transformers/configuration_utils.py#L419 | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40062/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40062/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40061 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40061/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40061/comments | https://api.github.com/repos/huggingface/transformers/issues/40061/events | https://github.com/huggingface/transformers/pull/40061 | 3,307,222,889 | PR_kwDOCUB6oc6i6pn8 | 40,061 | 🌐 [i18n-KO] Translated `vitdet.md` to Korean | {
"login": "jihyun-0611",
"id": 78160653,
"node_id": "MDQ6VXNlcjc4MTYwNjUz",
"avatar_url": "https://avatars.githubusercontent.com/u/78160653?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jihyun-0611",
"html_url": "https://github.com/jihyun-0611",
"followers_url": "https://api.github.com/users/jihyun-0611/followers",
"following_url": "https://api.github.com/users/jihyun-0611/following{/other_user}",
"gists_url": "https://api.github.com/users/jihyun-0611/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jihyun-0611/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jihyun-0611/subscriptions",
"organizations_url": "https://api.github.com/users/jihyun-0611/orgs",
"repos_url": "https://api.github.com/users/jihyun-0611/repos",
"events_url": "https://api.github.com/users/jihyun-0611/events{/privacy}",
"received_events_url": "https://api.github.com/users/jihyun-0611/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-10T06:14:27 | 2025-08-10T06:16:34 | null | NONE | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40061",
"html_url": "https://github.com/huggingface/transformers/pull/40061",
"diff_url": "https://github.com/huggingface/transformers/pull/40061.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40061.patch",
"merged_at": null
} | <!-- Please title the PR "🌐 [i18n-KO] Translated `<your_file>.md` to Korean" -->
# What does this PR do?
Translated the `vitdet.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations
- [x] Grammar Check
- [x] Review or Add new terms to glossary
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas
## Who can review? (Initial)
<!-- 1. Please reveal the comment below requesting review from KREW team members only after all the checks above are complete! -->
May you please review this PR?
<!-- @jungnerd, @luckyvickyricky, @chelsseeey, @skwh54, @amo33, @maximizemaxwell, @D15M4S -->
<!-- @harheem, @nsbg, @Youngdong2, @xhaktm00, @ssunbear, @ChoHyoungSeo, @judy-choi -->
<!-- @4N3MONE, @Kim-Ju-won, @ahnjj, @FacerAin, @ssum21, @TaskerJang, @HyunZ118 -->
@yijun-lee, @songi104, @chhaewxn, @AhnJoonSung, @jihyun-0611, @seopp, @pyapyapya
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. Please reveal the comment below after the KREW team members' review is complete! -->
<!-- @stevhliu May you please review this PR? --> | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40061/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40061/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40060 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40060/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40060/comments | https://api.github.com/repos/huggingface/transformers/issues/40060/events | https://github.com/huggingface/transformers/pull/40060 | 3,307,198,145 | PR_kwDOCUB6oc6i6llt | 40,060 | Avoid CUDA stream sync | {
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-10T05:38:04 | 2025-08-19T13:13:44 | 2025-08-15T12:37:15 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40060",
"html_url": "https://github.com/huggingface/transformers/pull/40060",
"diff_url": "https://github.com/huggingface/transformers/pull/40060.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40060.patch",
"merged_at": "2025-08-15T12:37:15"
} | # What does this PR do?
Remove more of the reported CUDA operations that force a stream synchronization. | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40060/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40060/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40059 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40059/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40059/comments | https://api.github.com/repos/huggingface/transformers/issues/40059/events | https://github.com/huggingface/transformers/pull/40059 | 3,306,816,972 | PR_kwDOCUB6oc6i5eF9 | 40,059 | Fix Inefficient GELU implementation in GPT2 | {
"login": "null-pointer-access",
"id": 210762976,
"node_id": "U_kgDODI_84A",
"avatar_url": "https://avatars.githubusercontent.com/u/210762976?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/null-pointer-access",
"html_url": "https://github.com/null-pointer-access",
"followers_url": "https://api.github.com/users/null-pointer-access/followers",
"following_url": "https://api.github.com/users/null-pointer-access/following{/other_user}",
"gists_url": "https://api.github.com/users/null-pointer-access/gists{/gist_id}",
"starred_url": "https://api.github.com/users/null-pointer-access/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/null-pointer-access/subscriptions",
"organizations_url": "https://api.github.com/users/null-pointer-access/orgs",
"repos_url": "https://api.github.com/users/null-pointer-access/repos",
"events_url": "https://api.github.com/users/null-pointer-access/events{/privacy}",
"received_events_url": "https://api.github.com/users/null-pointer-access/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-09T20:49:24 | 2025-08-12T13:00:12 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40059",
"html_url": "https://github.com/huggingface/transformers/pull/40059",
"diff_url": "https://github.com/huggingface/transformers/pull/40059.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40059.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes #39073 by using fused GELU instead of the custom implementation. This can improve the e2e efficiency by up to 12%.
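For context (a numerical sanity check, not the PR's actual diff): GPT-2's hand-written tanh GELU agrees closely with the exact erf-based GELU that a single fused library call (e.g. `torch.nn.functional.gelu`) computes, so swapping implementations mainly changes speed, not outputs. A pure-Python comparison:

```python
import math


def gelu_gpt2(x: float) -> float:
    # GPT-2's hand-written tanh approximation: several elementwise ops,
    # which on GPU means several small kernels and extra memory traffic.
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))


def gelu_exact(x: float) -> float:
    # Exact GELU via erf -- what a single fused library call computes.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))


# The two agree to within ~1e-3 over a typical activation range, so the
# swap is numerically safe while cutting kernel launches.
max_diff = max(abs(gelu_gpt2(x / 10) - gelu_exact(x / 10)) for x in range(-50, 51))
print(max_diff < 1e-3)  # True
```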
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. #39073
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
cc @ArthurZucker
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40059/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40059/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40058 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40058/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40058/comments | https://api.github.com/repos/huggingface/transformers/issues/40058/events | https://github.com/huggingface/transformers/pull/40058 | 3,306,816,053 | PR_kwDOCUB6oc6i5d7v | 40,058 | GGUF Qwen2VL | {
"login": "RevanthGundala",
"id": 93841932,
"node_id": "U_kgDOBZfqDA",
"avatar_url": "https://avatars.githubusercontent.com/u/93841932?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RevanthGundala",
"html_url": "https://github.com/RevanthGundala",
"followers_url": "https://api.github.com/users/RevanthGundala/followers",
"following_url": "https://api.github.com/users/RevanthGundala/following{/other_user}",
"gists_url": "https://api.github.com/users/RevanthGundala/gists{/gist_id}",
"starred_url": "https://api.github.com/users/RevanthGundala/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RevanthGundala/subscriptions",
"organizations_url": "https://api.github.com/users/RevanthGundala/orgs",
"repos_url": "https://api.github.com/users/RevanthGundala/repos",
"events_url": "https://api.github.com/users/RevanthGundala/events{/privacy}",
"received_events_url": "https://api.github.com/users/RevanthGundala/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-09T20:48:02 | 2025-08-12T03:30:12 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40058",
"html_url": "https://github.com/huggingface/transformers/pull/40058",
"diff_url": "https://github.com/huggingface/transformers/pull/40058.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40058.patch",
"merged_at": null
} | # What does this PR do?
<!-- Remove if not applicable -->
Fixes #40049
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [X] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40058/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40058/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40057 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40057/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40057/comments | https://api.github.com/repos/huggingface/transformers/issues/40057/events | https://github.com/huggingface/transformers/pull/40057 | 3,306,605,923 | PR_kwDOCUB6oc6i45jS | 40,057 | updated visualBERT modelcard | {
"login": "Anil-Red",
"id": 104975106,
"node_id": "U_kgDOBkHLAg",
"avatar_url": "https://avatars.githubusercontent.com/u/104975106?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Anil-Red",
"html_url": "https://github.com/Anil-Red",
"followers_url": "https://api.github.com/users/Anil-Red/followers",
"following_url": "https://api.github.com/users/Anil-Red/following{/other_user}",
"gists_url": "https://api.github.com/users/Anil-Red/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Anil-Red/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Anil-Red/subscriptions",
"organizations_url": "https://api.github.com/users/Anil-Red/orgs",
"repos_url": "https://api.github.com/users/Anil-Red/repos",
"events_url": "https://api.github.com/users/Anil-Red/events{/privacy}",
"received_events_url": "https://api.github.com/users/Anil-Red/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-09T17:17:09 | 2025-08-13T19:47:32 | 2025-08-13T19:47:32 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40057",
"html_url": "https://github.com/huggingface/transformers/pull/40057",
"diff_url": "https://github.com/huggingface/transformers/pull/40057.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40057.patch",
"merged_at": "2025-08-13T19:47:32"
} | # What does this PR do?
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
@stevhliu would you please review my PR?
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40057/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40057/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40056 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40056/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40056/comments | https://api.github.com/repos/huggingface/transformers/issues/40056/events | https://github.com/huggingface/transformers/issues/40056 | 3,306,568,152 | I_kwDOCUB6oc7FFjnY | 40,056 | Question: How to write a custom tokenizer from scratch | {
"login": "obadx",
"id": 16362655,
"node_id": "MDQ6VXNlcjE2MzYyNjU1",
"avatar_url": "https://avatars.githubusercontent.com/u/16362655?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/obadx",
"html_url": "https://github.com/obadx",
"followers_url": "https://api.github.com/users/obadx/followers",
"following_url": "https://api.github.com/users/obadx/following{/other_user}",
"gists_url": "https://api.github.com/users/obadx/gists{/gist_id}",
"starred_url": "https://api.github.com/users/obadx/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/obadx/subscriptions",
"organizations_url": "https://api.github.com/users/obadx/orgs",
"repos_url": "https://api.github.com/users/obadx/repos",
"events_url": "https://api.github.com/users/obadx/events{/privacy}",
"received_events_url": "https://api.github.com/users/obadx/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-09T16:39:19 | 2025-09-24T08:03:02 | 2025-09-24T08:03:02 | NONE | null | null | null | null | In this guide you introduced how to write a custom model and custom model configuration: [here](https://huggingface.co/docs/transformers/main/en/custom_models), IN addition I want to create a custom tokenizer form scratch why ?
I have a problem of multilevel transcription: the model takes an input utterance and output a 12 multilingual transcript simultaneously . So I want to design a tokenzier such that it take the whole 12 languages as a dict:
```python
{
"lang1": "text text",
"lang2": "text text",
"lang3": "text text",
}
```
and after tokenization
```python
{
"input_ids":
{
"lang1": "ids of lang 1",
"lang2": "ids of lang 2",
        "lang3": "ids of lang 3",
}
}
```
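(For illustration only, a toy dict-in/dict-out tokenizer with that interface could look like the sketch below. The class and its whitespace splitting are hypothetical, not a transformers API; in practice one would likely subclass `PreTrainedTokenizer` and override its encoding methods.)

```python
class MultiLangTokenizer:
    """Toy dict-in/dict-out tokenizer; one growable vocabulary per language."""

    def __init__(self, languages):
        self.vocabs = {lang: {} for lang in languages}

    def __call__(self, texts):
        input_ids = {}
        for lang, text in texts.items():
            vocab = self.vocabs[lang]
            # naive whitespace split; assign a new id the first time a token is seen
            input_ids[lang] = [vocab.setdefault(tok, len(vocab)) for tok in text.split()]
        return {"input_ids": input_ids}


tok = MultiLangTokenizer(["lang1", "lang2"])
out = tok({"lang1": "text text", "lang2": "other text"})
print(out)  # {'input_ids': {'lang1': [0, 0], 'lang2': [0, 1]}}
```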
How can I do this? I cannot find docs on building such a custom tokenizer from scratch. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40056/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40056/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40055 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40055/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40055/comments | https://api.github.com/repos/huggingface/transformers/issues/40055/events | https://github.com/huggingface/transformers/pull/40055 | 3,306,160,513 | PR_kwDOCUB6oc6i3uL1 | 40,055 | Auto-log parallelism info to wandb.config using HF Accelerate | {
"login": "WoosungMyung",
"id": 115716986,
"node_id": "U_kgDOBuWzeg",
"avatar_url": "https://avatars.githubusercontent.com/u/115716986?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/WoosungMyung",
"html_url": "https://github.com/WoosungMyung",
"followers_url": "https://api.github.com/users/WoosungMyung/followers",
"following_url": "https://api.github.com/users/WoosungMyung/following{/other_user}",
"gists_url": "https://api.github.com/users/WoosungMyung/gists{/gist_id}",
"starred_url": "https://api.github.com/users/WoosungMyung/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/WoosungMyung/subscriptions",
"organizations_url": "https://api.github.com/users/WoosungMyung/orgs",
"repos_url": "https://api.github.com/users/WoosungMyung/repos",
"events_url": "https://api.github.com/users/WoosungMyung/events{/privacy}",
"received_events_url": "https://api.github.com/users/WoosungMyung/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-09T09:28:22 | 2025-08-26T12:05:12 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40055",
"html_url": "https://github.com/huggingface/transformers/pull/40055",
"diff_url": "https://github.com/huggingface/transformers/pull/40055.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40055.patch",
"merged_at": null
} | ### What
This PR adds parallelism info non-intrusively to `WandbCallback` so that, **when Hugging Face Accelerate is in use**, the callback automatically logs the parallelism sizes to `wandb.config` at train start.
### Why
Users frequently want to record their distributed setup (e.g., `tp_size`, `dp_replicate_size`) in Weights & Biases for reproducibility and experiment analysis. Today, this is typically done manually on the user side. This change standardizes and automates that step.
This PR is from Issue #39882
@Rocketknight1 @MekkCyber
Thank you for your time reviewing this PR.
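As a sketch, the logged payload is built roughly like this (attribute names such as `tp_size` are illustrative — the exact fields depend on Accelerate's parallelism config):

```python
def parallelism_wandb_config(parallelism_config):
    """Collect parallelism sizes into a plain dict for wandb.config.update().

    `parallelism_config` is expected to expose size attributes (the names
    below are illustrative); attributes that are missing or None are skipped.
    """
    candidate_keys = ("tp_size", "dp_replicate_size", "dp_shard_size", "pp_size", "cp_size")
    return {
        key: getattr(parallelism_config, key)
        for key in candidate_keys
        if getattr(parallelism_config, key, None) is not None
    }
```
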
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40055/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40055/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40054 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40054/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40054/comments | https://api.github.com/repos/huggingface/transformers/issues/40054/events | https://github.com/huggingface/transformers/issues/40054 | 3,306,009,796 | I_kwDOCUB6oc7FDbTE | 40,054 | Whisper transcription accuracy improves when last 1600 samples of input audio are muted | {
"login": "jozefchutka",
"id": 750041,
"node_id": "MDQ6VXNlcjc1MDA0MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/750041?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jozefchutka",
"html_url": "https://github.com/jozefchutka",
"followers_url": "https://api.github.com/users/jozefchutka/followers",
"following_url": "https://api.github.com/users/jozefchutka/following{/other_user}",
"gists_url": "https://api.github.com/users/jozefchutka/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jozefchutka/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jozefchutka/subscriptions",
"organizations_url": "https://api.github.com/users/jozefchutka/orgs",
"repos_url": "https://api.github.com/users/jozefchutka/repos",
"events_url": "https://api.github.com/users/jozefchutka/events{/privacy}",
"received_events_url": "https://api.github.com/users/jozefchutka/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
},
{
"id": 6470596964,
"node_id": "LA_kwDOCUB6oc8AAAABga15ZA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Audio",
"name": "Audio",
"color": "760453",
"default": false,
"description": ""
},
{
"id": 7377881103,
"node_id": "LA_kwDOCUB6oc8AAAABt8GIDw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Whisper",
"name": "Whisper",
"color": "83303E",
"default": false,
"description": ""
}
] | open | false | null | [] | null | [] | 2025-08-09T06:53:37 | 2025-10-27T15:01:12 | null | NONE | null | null | null | null | ### System Info
`transformers` version: 4.55.0
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [x] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
When using a Whisper model in transformers, the transcription differs significantly depending on whether the last 1600 samples (~0.1s at 16 kHz) of the input audio are muted (set to 0.0).
With the original audio, some content is missing from the transcript. When the last 1600 samples are muted, the transcript is complete and accurate.
This suggests that Whisper in transformers might mishandle small trailing noise or non-speech at the very end of an audio file.
```py
import torch
import numpy as np
from transformers import pipeline
asr = pipeline(
task="automatic-speech-recognition",
model="openai/whisper-base",
torch_dtype=torch.float32,
device=-1
)
audio = np.fromfile("ch.pcm", dtype=np.float32)
#audio[-1600:] = 0.0
result = asr(
audio,
generate_kwargs={"language": "en"}
)
print(result)
```
Transcript with the original audio:
```
The schoolbooks say it can't be here again chocolate rain.
```
Transcript when muting last samples `audio[-1600:] = 0.0` :
```
Chocolate Rain Some stay dry and others feel a pain chocolate rain Our baby porn will die before the same chocolate rain The school books say it can't
be here again chocolate rain The prison
```
I am using the first 30 seconds of the following audio as input https://www.youtube.com/watch?v=EwTZ2xpQwpA , extracting it as PCM using ffmpeg (attached [ch.zip](https://github.com/user-attachments/files/21696489/ch.zip) ) :
```sh
ffmpeg -ss 0 -t 30 -i ch.mp4 -filter_complex:a "[0:1]aformat=channel_layouts=mono,aresample=16000[aout]" -map "[aout]" -c:a pcm_f32le -f data ch.pcm -y
```
### Expected behavior
Small trailing silence/noise at the end of the audio should not cause the model to omit words from the transcript.
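For reference, the muting workaround from the repro snippet can be expressed as a tiny self-contained helper (pure Python, names are mine; at 16 kHz, 1600 samples is the final ~0.1 s):

```python
def mute_tail(samples, n_tail=1600):
    """Return a copy of `samples` with the last `n_tail` values set to 0.0."""
    out = list(samples)
    k = min(n_tail, len(out))
    if k:
        out[-k:] = [0.0] * k
    return out
```

With a numpy array, the equivalent in-place form is `audio[-1600:] = 0.0`, as in the script above.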
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40054/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40054/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/40053 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40053/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40053/comments | https://api.github.com/repos/huggingface/transformers/issues/40053/events | https://github.com/huggingface/transformers/pull/40053 | 3,305,981,152 | PR_kwDOCUB6oc6i3PVZ | 40,053 | Fix Inefficient default GELU implementation in GPT2(#39073) | {
"login": "wenboqian",
"id": 42508752,
"node_id": "MDQ6VXNlcjQyNTA4NzUy",
"avatar_url": "https://avatars.githubusercontent.com/u/42508752?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wenboqian",
"html_url": "https://github.com/wenboqian",
"followers_url": "https://api.github.com/users/wenboqian/followers",
"following_url": "https://api.github.com/users/wenboqian/following{/other_user}",
"gists_url": "https://api.github.com/users/wenboqian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wenboqian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wenboqian/subscriptions",
"organizations_url": "https://api.github.com/users/wenboqian/orgs",
"repos_url": "https://api.github.com/users/wenboqian/repos",
"events_url": "https://api.github.com/users/wenboqian/events{/privacy}",
"received_events_url": "https://api.github.com/users/wenboqian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-09T06:20:23 | 2025-08-09T06:21:59 | 2025-08-09T06:21:59 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40053",
"html_url": "https://github.com/huggingface/transformers/pull/40053",
"diff_url": "https://github.com/huggingface/transformers/pull/40053.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40053.patch",
"merged_at": null
} | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/39073
### Environment
- CUDA 12.8
- Ubuntu 20.04
- torch 2.8.0
- transformers 4.55.0
## Fix
Change the GPT-2 configuration default `activation_function` from `"gelu_new"` (uses NewGELUActivation) to `"gelu"` (uses GELUActivation).
Location: `src/transformers/models/gpt2/configuration_gpt2.py`, line 146
```python
# Before
activation_function="gelu_new",
# After
activation_function="gelu",
```
### Performance Improvement
### Inference per iteration (Before vs After)
- throughput: +12.0% (1.12x)
- latency: -10.6% (1.12x faster)
- Activation
- Before: gelu_new
- After: gelu
### Benchmark summary (comparison)
| Metric | Before | After | Change |
| --- | ---: | ---: | ---: |
| Mean time (ms) | 1.307 | 1.168 | -10.6% (1.12x faster) |
| Throughput (tokens/s) | 6,265,653.57 | 7,014,827.81 | +12.0% (1.12x) |
| Tokens/iter | 8,192 | 8,192 | — |
## Testing
### Configuration
- Model: sshleifer/tiny-gpt2
- batch_size: 16
- seq_len: 512
- iterations: 100
#### config.json:
```json
{
"architectures": [
"GPT2LMHeadModel"
],
"attn_pdrop": 0.1,
"bos_token_id": 50256,
"embd_pdrop": 0.1,
"eos_token_id": 50256,
"initializer_range": 0.02,
"layer_norm_epsilon": 1e-05,
"model_type": "gpt2",
"n_ctx": 1024,
"n_embd": 2,
"n_head": 2,
"n_layer": 2,
"n_positions": 1024,
"resid_pdrop": 0.1,
"summary_activation": null,
"summary_first_dropout": 0.1,
"summary_proj_to_labels": true,
"summary_type": "cls_index",
"summary_use_proj": true,
"task_specific_params": {
"text-generation": {
"do_sample": true,
"max_length": 50
}
},
"vocab_size": 50257
}
```
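Since this swaps the activation rather than merely speeding it up, outputs are not bit-identical — `gelu_new` is the tanh approximation of the exact erf-based GELU. A quick pure-Python sanity check (standard formulas, no torch required) shows how close the two are:

```python
import math

def gelu_exact(x):
    """Exact GELU: 0.5 * x * (1 + erf(x / sqrt(2))) -- what "gelu" computes."""
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x):
    """Tanh approximation -- what "gelu_new" computes."""
    return 0.5 * x * (
        1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3))
    )

# Maximum absolute gap between the two curves on a grid over [-5, 5].
max_gap = max(abs(gelu_exact(x) - gelu_tanh(x)) for x in [i / 10 - 5 for i in range(101)])
```

The gap stays well below 1e-2 over the sampled range, so for most downstream uses the swap is numerically benign, but it is a behavior change, not a pure optimization.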
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| {
"login": "wenboqian",
"id": 42508752,
"node_id": "MDQ6VXNlcjQyNTA4NzUy",
"avatar_url": "https://avatars.githubusercontent.com/u/42508752?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wenboqian",
"html_url": "https://github.com/wenboqian",
"followers_url": "https://api.github.com/users/wenboqian/followers",
"following_url": "https://api.github.com/users/wenboqian/following{/other_user}",
"gists_url": "https://api.github.com/users/wenboqian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wenboqian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wenboqian/subscriptions",
"organizations_url": "https://api.github.com/users/wenboqian/orgs",
"repos_url": "https://api.github.com/users/wenboqian/repos",
"events_url": "https://api.github.com/users/wenboqian/events{/privacy}",
"received_events_url": "https://api.github.com/users/wenboqian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40053/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40053/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40052 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40052/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40052/comments | https://api.github.com/repos/huggingface/transformers/issues/40052/events | https://github.com/huggingface/transformers/issues/40052 | 3,305,964,413 | I_kwDOCUB6oc7FDQN9 | 40,052 | Previous PRs introduced a bug on Accumulated Gradients Losses | {
"login": "w32zhong",
"id": 1407530,
"node_id": "MDQ6VXNlcjE0MDc1MzA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1407530?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/w32zhong",
"html_url": "https://github.com/w32zhong",
"followers_url": "https://api.github.com/users/w32zhong/followers",
"following_url": "https://api.github.com/users/w32zhong/following{/other_user}",
"gists_url": "https://api.github.com/users/w32zhong/gists{/gist_id}",
"starred_url": "https://api.github.com/users/w32zhong/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/w32zhong/subscriptions",
"organizations_url": "https://api.github.com/users/w32zhong/orgs",
"repos_url": "https://api.github.com/users/w32zhong/repos",
"events_url": "https://api.github.com/users/w32zhong/events{/privacy}",
"received_events_url": "https://api.github.com/users/w32zhong/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-09T06:02:16 | 2025-08-09T19:20:13 | 2025-08-09T19:20:13 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.54.1
- Platform: Linux-5.15.0-131-generic-x86_64-with-glibc2.39
- Python version: 3.12.3
- Huggingface_hub version: 0.34.3
- Safetensors version: 0.5.3
- Accelerate version: 1.10.0
- Accelerate config: not found
- DeepSpeed version: 0.17.4
- PyTorch version (accelerator?): 2.8.0a0+5228986c39.nv25.06 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA H100 80GB HBM3
### Who can help?
Previous PRs from: https://github.com/huggingface/transformers/pull/35207 and https://github.com/huggingface/transformers/pull/34511
It causes `backward()` to be called after rescaling. This creates a double rescaling, both here and in Accelerate:
https://github.com/huggingface/accelerate/blob/23cf4ef8a3b58f016f63eeb158b4aa2c3e79fe6f/src/accelerate/accelerator.py#L2724
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
When:
* gradient_accumulation_steps > 1
* not using deepspeed
* `num_items_in_batch` is None and `self.compute_loss_func` is None (i.e., when the user ignores the GA loss bug)
The final loss is rescaled twice:
```
loss = loss / gradient_accumulation_steps
```
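Numerically, the double scaling compounds — a pure-Python sketch of the two scaling sites (function names are illustrative, not the actual library code):

```python
GRADIENT_ACCUMULATION_STEPS = 4

def trainer_scale(loss, gas=GRADIENT_ACCUMULATION_STEPS):
    # Scaling applied in the Trainer before calling backward().
    return loss / gas

def accelerate_scale(loss, gas=GRADIENT_ACCUMULATION_STEPS):
    # Scaling applied again inside Accelerator.backward().
    return loss / gas

raw_loss = 1.0
# The loss passes through both sites, so the effective scale is 1/gas**2
# instead of the intended 1/gas.
effective = accelerate_scale(trainer_scale(raw_loss))
```
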
### Expected behavior
It should be rescaled only once. | {
"login": "w32zhong",
"id": 1407530,
"node_id": "MDQ6VXNlcjE0MDc1MzA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1407530?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/w32zhong",
"html_url": "https://github.com/w32zhong",
"followers_url": "https://api.github.com/users/w32zhong/followers",
"following_url": "https://api.github.com/users/w32zhong/following{/other_user}",
"gists_url": "https://api.github.com/users/w32zhong/gists{/gist_id}",
"starred_url": "https://api.github.com/users/w32zhong/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/w32zhong/subscriptions",
"organizations_url": "https://api.github.com/users/w32zhong/orgs",
"repos_url": "https://api.github.com/users/w32zhong/repos",
"events_url": "https://api.github.com/users/w32zhong/events{/privacy}",
"received_events_url": "https://api.github.com/users/w32zhong/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40052/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40052/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40051 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40051/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40051/comments | https://api.github.com/repos/huggingface/transformers/issues/40051/events | https://github.com/huggingface/transformers/pull/40051 | 3,305,882,910 | PR_kwDOCUB6oc6i29lv | 40,051 | Standardize BARTpho model card: badges, new examples, fixed broken im… | {
"login": "eshwanthkartitr",
"id": 111058542,
"node_id": "U_kgDOBp6ebg",
"avatar_url": "https://avatars.githubusercontent.com/u/111058542?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eshwanthkartitr",
"html_url": "https://github.com/eshwanthkartitr",
"followers_url": "https://api.github.com/users/eshwanthkartitr/followers",
"following_url": "https://api.github.com/users/eshwanthkartitr/following{/other_user}",
"gists_url": "https://api.github.com/users/eshwanthkartitr/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eshwanthkartitr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eshwanthkartitr/subscriptions",
"organizations_url": "https://api.github.com/users/eshwanthkartitr/orgs",
"repos_url": "https://api.github.com/users/eshwanthkartitr/repos",
"events_url": "https://api.github.com/users/eshwanthkartitr/events{/privacy}",
"received_events_url": "https://api.github.com/users/eshwanthkartitr/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-09T05:04:45 | 2025-08-14T16:55:28 | 2025-08-14T16:55:28 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40051",
"html_url": "https://github.com/huggingface/transformers/pull/40051",
"diff_url": "https://github.com/huggingface/transformers/pull/40051.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40051.patch",
"merged_at": "2025-08-14T16:55:28"
} | Updated bartpho.md
# What does this PR do?
Fixes # (issue)
Standardize BARTpho model card
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
Please review this @stevhliu
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40051/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40051/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40050 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40050/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40050/comments | https://api.github.com/repos/huggingface/transformers/issues/40050/events | https://github.com/huggingface/transformers/issues/40050 | 3,305,660,455 | I_kwDOCUB6oc7FCGAn | 40,050 | Support text classification with GPT-OSS models | {
"login": "zyfedward",
"id": 5227392,
"node_id": "MDQ6VXNlcjUyMjczOTI=",
"avatar_url": "https://avatars.githubusercontent.com/u/5227392?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zyfedward",
"html_url": "https://github.com/zyfedward",
"followers_url": "https://api.github.com/users/zyfedward/followers",
"following_url": "https://api.github.com/users/zyfedward/following{/other_user}",
"gists_url": "https://api.github.com/users/zyfedward/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zyfedward/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zyfedward/subscriptions",
"organizations_url": "https://api.github.com/users/zyfedward/orgs",
"repos_url": "https://api.github.com/users/zyfedward/repos",
"events_url": "https://api.github.com/users/zyfedward/events{/privacy}",
"received_events_url": "https://api.github.com/users/zyfedward/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | closed | false | null | [] | null | [] | 2025-08-09T03:04:57 | 2025-10-22T19:45:39 | 2025-08-14T17:52:37 | CONTRIBUTOR | null | null | null | null | ### Feature request
Support text classification with GPT-OSS models
### Motivation
Let GPT-OSS models dealing with classification problems more efficiently
### Your contribution
PR submission | {
"login": "zyfedward",
"id": 5227392,
"node_id": "MDQ6VXNlcjUyMjczOTI=",
"avatar_url": "https://avatars.githubusercontent.com/u/5227392?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zyfedward",
"html_url": "https://github.com/zyfedward",
"followers_url": "https://api.github.com/users/zyfedward/followers",
"following_url": "https://api.github.com/users/zyfedward/following{/other_user}",
"gists_url": "https://api.github.com/users/zyfedward/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zyfedward/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zyfedward/subscriptions",
"organizations_url": "https://api.github.com/users/zyfedward/orgs",
"repos_url": "https://api.github.com/users/zyfedward/repos",
"events_url": "https://api.github.com/users/zyfedward/events{/privacy}",
"received_events_url": "https://api.github.com/users/zyfedward/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40050/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40050/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40049 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40049/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40049/comments | https://api.github.com/repos/huggingface/transformers/issues/40049/events | https://github.com/huggingface/transformers/issues/40049 | 3,305,596,817 | I_kwDOCUB6oc7FB2eR | 40,049 | Please support loading Qwen 2.5 VL from GGUF | {
"login": "ihendley",
"id": 2399522,
"node_id": "MDQ6VXNlcjIzOTk1MjI=",
"avatar_url": "https://avatars.githubusercontent.com/u/2399522?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ihendley",
"html_url": "https://github.com/ihendley",
"followers_url": "https://api.github.com/users/ihendley/followers",
"following_url": "https://api.github.com/users/ihendley/following{/other_user}",
"gists_url": "https://api.github.com/users/ihendley/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ihendley/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ihendley/subscriptions",
"organizations_url": "https://api.github.com/users/ihendley/orgs",
"repos_url": "https://api.github.com/users/ihendley/repos",
"events_url": "https://api.github.com/users/ihendley/events{/privacy}",
"received_events_url": "https://api.github.com/users/ihendley/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | null | [] | 2025-08-09T02:32:13 | 2025-08-09T02:32:13 | null | NONE | null | null | null | null | ### Feature request
The new [Qwen Image](https://huggingface.co/Qwen/Qwen-Image) uses Qwen 2.5 VL 7B as a text encoder. Given memory constraints, some users may want to load a quantized image model and text encoder for a diffusers QwenImagePipeline, for example:
```
from diffusers import QwenImagePipeline, QwenImageTransformer2DModel, GGUFQuantizationConfig
import torch
from transformers import AutoModelForCausalLM
transformer = QwenImageTransformer2DModel.from_single_file(
"https://huggingface.co/QuantStack/Qwen-Image-GGUF/blob/main/Qwen_Image-Q4_K_M.gguf",
quantization_config=GGUFQuantizationConfig(compute_dtype=torch.bfloat16),
torch_dtype=torch.bfloat16,
config="Qwen/Qwen-Image",
subfolder="transformer",
)
text_encoder = AutoModelForCausalLM.from_pretrained(
"unsloth/Qwen2.5-VL-7B-Instruct-GGUF",
gguf_file="Qwen2.5-VL-7B-Instruct-Q4_K_M.gguf",
torch_dtype=torch.bfloat16,
)
pipe = QwenImagePipeline.from_pretrained(
"Qwen/Qwen-Image",
transformer=transformer,
text_encoder=text_encoder,
torch_dtype=torch.bfloat16,
)
```
However, this currently fails with the error:
`ValueError: GGUF model with architecture qwen2vl is not supported yet.`
### Motivation
As described above, Qwen 2.5 VL 7B is the text encoder for the new state-of-the-art Qwen Image model. diffusers will either download and load the full unquantized Qwen 2.5 VL 7B (~15GB), or it will accept a transformers model as the text_encoder argument, so being able to load a GGUF model here would save a lot of memory.
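As a rough back-of-envelope illustration of the savings (the parameter count and bits-per-weight below are my own approximations, not numbers from this issue):

```python
# Back-of-envelope only: parameter count and bits/weight are rough assumptions.
params = 8.3e9                    # approx. total parameters of a "7B" VL model
bf16_gb = params * 2 / 2**30      # bf16: 2 bytes per weight
q4_gb = params * 4.5 / 8 / 2**30  # Q4_K_M: roughly 4.5 bits per weight on average

print(f"bf16: ~{bf16_gb:.1f} GB, Q4_K_M: ~{q4_gb:.1f} GB")  # → bf16: ~15.5 GB, Q4_K_M: ~4.3 GB
```

That is roughly a 3-4x reduction for the text encoder alone, which is why GGUF loading would matter here.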
### Your contribution
With some help getting started and support along the way I could make an attempt at a PR. However it might be quicker if someone with more experience takes the lead. | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40049/reactions",
"total_count": 5,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 5,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40049/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/40048 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40048/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40048/comments | https://api.github.com/repos/huggingface/transformers/issues/40048/events | https://github.com/huggingface/transformers/pull/40048 | 3,305,490,441 | PR_kwDOCUB6oc6i1uf3 | 40,048 | Fix Inefficient default GELU implementation in GPT2 #39073 | {
"login": "wenboqian",
"id": 42508752,
"node_id": "MDQ6VXNlcjQyNTA4NzUy",
"avatar_url": "https://avatars.githubusercontent.com/u/42508752?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wenboqian",
"html_url": "https://github.com/wenboqian",
"followers_url": "https://api.github.com/users/wenboqian/followers",
"following_url": "https://api.github.com/users/wenboqian/following{/other_user}",
"gists_url": "https://api.github.com/users/wenboqian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wenboqian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wenboqian/subscriptions",
"organizations_url": "https://api.github.com/users/wenboqian/orgs",
"repos_url": "https://api.github.com/users/wenboqian/repos",
"events_url": "https://api.github.com/users/wenboqian/events{/privacy}",
"received_events_url": "https://api.github.com/users/wenboqian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-09T00:30:37 | 2025-08-09T00:39:24 | 2025-08-09T00:38:52 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40048",
"html_url": "https://github.com/huggingface/transformers/pull/40048",
"diff_url": "https://github.com/huggingface/transformers/pull/40048.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40048.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes #39073 ([issue](https://github.com/huggingface/transformers/issues/39073))
### Environment
- CUDA 12.8
- Ubuntu 20.04
- torch 2.8.0
- transformers 4.55.0
## Fix
Switch `NewGELUActivation.forward` to `nn.functional.gelu(input, approximate="tanh")`.
Location: `src/transformers/activations.py`, line 47
```python
# Before
return 0.5 * input * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (input + 0.044715 * torch.pow(input, 3.0))))
# After
return nn.functional.gelu(input, approximate="tanh")
```
## Testing
### Configuration
- Input: `torch.randn(batch_size, seq_len, hidden_dim)`
- batch_size: 16
- seq_len: 512
- hidden_dim: 768
- num_iterations: 100
### Performance
- Speedup: 7.65x (86.9%)
| Implementation | Time (ms)|
| --- | --- |
| Original (NewGELU) | 0.154 |
| Fused GELU (tanh) | 0.020 |
### Memory
- Original (NewGELU): Peak 679.47 MB, Used 96.00 MB
- Fused GELU (tanh): Peak 631.47 MB, Used 48.00 MB
### Numerical Accuracy
- Max absolute difference: 2.38e-07
- Mean absolute difference: 3.11e-10
- Mean relative error: 2.39e-09
- Conclusion: Numerically equivalent (diff < 1e-5)
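As an aside, the tanh approximation itself (the same formula both the old hand-written code and the fused `approximate="tanh"` kernel compute) can be sanity-checked against exact GELU with stdlib-only Python — a rough illustration, separate from the torch benchmark above:

```python
import math

def gelu_exact(x):
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x):
    # The tanh approximation computed by both the old formula and the fused kernel
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)))

# The two stay within ~1e-3 of each other across a wide input range
max_diff = max(abs(gelu_exact(t / 100) - gelu_tanh(t / 100)) for t in range(-600, 601))
print(max_diff)
```

The ~1e-7 differences measured above are just float precision between two evaluations of the same approximation; the approximation's distance to exact GELU is larger but still small.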
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "wenboqian",
"id": 42508752,
"node_id": "MDQ6VXNlcjQyNTA4NzUy",
"avatar_url": "https://avatars.githubusercontent.com/u/42508752?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wenboqian",
"html_url": "https://github.com/wenboqian",
"followers_url": "https://api.github.com/users/wenboqian/followers",
"following_url": "https://api.github.com/users/wenboqian/following{/other_user}",
"gists_url": "https://api.github.com/users/wenboqian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wenboqian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wenboqian/subscriptions",
"organizations_url": "https://api.github.com/users/wenboqian/orgs",
"repos_url": "https://api.github.com/users/wenboqian/repos",
"events_url": "https://api.github.com/users/wenboqian/events{/privacy}",
"received_events_url": "https://api.github.com/users/wenboqian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40048/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40048/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40047 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40047/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40047/comments | https://api.github.com/repos/huggingface/transformers/issues/40047/events | https://github.com/huggingface/transformers/pull/40047 | 3,305,452,946 | PR_kwDOCUB6oc6i1n5v | 40,047 | Update wavlm.md to match new model card template | {
"login": "reedrya",
"id": 157441470,
"node_id": "U_kgDOCWJdvg",
"avatar_url": "https://avatars.githubusercontent.com/u/157441470?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/reedrya",
"html_url": "https://github.com/reedrya",
"followers_url": "https://api.github.com/users/reedrya/followers",
"following_url": "https://api.github.com/users/reedrya/following{/other_user}",
"gists_url": "https://api.github.com/users/reedrya/gists{/gist_id}",
"starred_url": "https://api.github.com/users/reedrya/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/reedrya/subscriptions",
"organizations_url": "https://api.github.com/users/reedrya/orgs",
"repos_url": "https://api.github.com/users/reedrya/repos",
"events_url": "https://api.github.com/users/reedrya/events{/privacy}",
"received_events_url": "https://api.github.com/users/reedrya/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-08-08T23:51:50 | 2025-08-11T16:26:51 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40047",
"html_url": "https://github.com/huggingface/transformers/pull/40047",
"diff_url": "https://github.com/huggingface/transformers/pull/40047.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40047.patch",
"merged_at": null
} | # What does this PR do?
This PR updates the WavLM model card to comply with the format introduced in #36979.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
## Who can review?
@stevhliu
## Notes
- I did not include the AttentionMaskVisualizer section since I'm unfamiliar with it. Please advise if it should be added.
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40047/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40047/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/40046 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40046/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40046/comments | https://api.github.com/repos/huggingface/transformers/issues/40046/events | https://github.com/huggingface/transformers/issues/40046 | 3,305,376,335 | I_kwDOCUB6oc7FBApP | 40,046 | Recent releases break backwards-compatibility with key_cache | {
"login": "ntenenz",
"id": 8411908,
"node_id": "MDQ6VXNlcjg0MTE5MDg=",
"avatar_url": "https://avatars.githubusercontent.com/u/8411908?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ntenenz",
"html_url": "https://github.com/ntenenz",
"followers_url": "https://api.github.com/users/ntenenz/followers",
"following_url": "https://api.github.com/users/ntenenz/following{/other_user}",
"gists_url": "https://api.github.com/users/ntenenz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ntenenz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ntenenz/subscriptions",
"organizations_url": "https://api.github.com/users/ntenenz/orgs",
"repos_url": "https://api.github.com/users/ntenenz/repos",
"events_url": "https://api.github.com/users/ntenenz/events{/privacy}",
"received_events_url": "https://api.github.com/users/ntenenz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-08T22:39:53 | 2025-09-17T08:02:33 | 2025-09-17T08:02:33 | NONE | null | null | null | null | ### System Info
transformers 4.54.0+
### Who can help?
@ArthurZucker
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
DEVICE = "cuda:0"
MODEL_NAME = "ai21labs/Jamba-v0.1" # or any model that calls cache.key_cache = ...
tok = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME, torch_dtype=torch.bfloat16).to(DEVICE)
inp = tok("Mary had a little", return_tensors="pt")["input_ids"].to(DEVICE)
model.generate(inp, max_new_tokens=64)  # this will fail because key_cache is a property w/o a setter
```
See [here](https://huggingface.co/ai21labs/Jamba-v0.1/blob/main/modeling_jamba.py#L251) for the line in Jamba where the exception is raised. Jamba is onboarded directly in `transformers` now, but there are likely countless other models that are not and that rely on this prior behavior.
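The failure mode can be shown with a minimal stdlib-only sketch (not the real `transformers` cache classes): assigning to a read-only property raises `AttributeError`, and re-exposing the property with a setter restores the old assignment behavior:

```python
class Cache:
    """Minimal stand-in for a cache whose key_cache became a read-only property."""
    def __init__(self):
        self._key_cache = []

    @property
    def key_cache(self):
        return self._key_cache

c = Cache()
try:
    c.key_cache = ["new_key"]  # what older custom modeling code does
    raised = False
except AttributeError:
    raised = True
print("assignment raised:", raised)  # → assignment raised: True

class CompatCache(Cache):
    # Re-expose the property with a setter to restore backwards compatibility
    @Cache.key_cache.setter
    def key_cache(self, value):
        self._key_cache = value

c2 = CompatCache()
c2.key_cache = ["new_key"]
print(c2.key_cache)  # → ['new_key']
```

A setter like the one in `CompatCache` (or an equivalent shim in the library) would let remote-code models keep assigning to `key_cache` as before.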
### Expected behavior
The model should generate without raising an exception. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40046/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40046/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40045 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40045/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40045/comments | https://api.github.com/repos/huggingface/transformers/issues/40045/events | https://github.com/huggingface/transformers/pull/40045 | 3,305,213,362 | PR_kwDOCUB6oc6i046o | 40,045 | (small) fix conditional for input_ids and input_embeds in marian | {
"login": "cyntqliu",
"id": 13006944,
"node_id": "MDQ6VXNlcjEzMDA2OTQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/13006944?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyntqliu",
"html_url": "https://github.com/cyntqliu",
"followers_url": "https://api.github.com/users/cyntqliu/followers",
"following_url": "https://api.github.com/users/cyntqliu/following{/other_user}",
"gists_url": "https://api.github.com/users/cyntqliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyntqliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyntqliu/subscriptions",
"organizations_url": "https://api.github.com/users/cyntqliu/orgs",
"repos_url": "https://api.github.com/users/cyntqliu/repos",
"events_url": "https://api.github.com/users/cyntqliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyntqliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-08T21:05:25 | 2025-08-21T13:13:15 | 2025-08-21T13:13:15 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40045",
"html_url": "https://github.com/huggingface/transformers/pull/40045",
"diff_url": "https://github.com/huggingface/transformers/pull/40045.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40045.patch",
"merged_at": "2025-08-21T13:13:14"
} | # What does this PR do?
This PR fixes a small bug in a conditional in `modeling_marian.py`. In issue #39542, the reporter noticed that both `input_ids` and `input_embeds` were None, yet execution still entered the branch guarded by the check that both are not None.
Related to but does not fix #39542
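A hypothetical sketch of the two conditionals (illustrative only, not the actual Marian diff): the buggy form only rejects the both-provided case, while the usual transformers-style XOR check requires exactly one input:

```python
def check_buggy(input_ids, inputs_embeds):
    # Only rejects the both-provided case; (None, None) slips through
    if input_ids is not None and inputs_embeds is not None:
        raise ValueError("You cannot specify both input_ids and inputs_embeds")

def check_fixed(input_ids, inputs_embeds):
    # XOR form: raises when both are given or both are missing
    if (input_ids is None) ^ (inputs_embeds is not None):
        raise ValueError("You must specify exactly one of input_ids or inputs_embeds")

check_buggy(None, None)  # silently passes: the situation reported in #39542
try:
    check_fixed(None, None)
    fixed_raised = False
except ValueError:
    fixed_raised = True
print(fixed_raised)  # → True
```

The XOR form covers all four input combinations with one condition, which is why several transformers models use it.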
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
Ran `pytest tests/models/marian/test_modeling_marian.py` to check for regressions.
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@ArthurZucker @Rocketknight1 | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40045/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40045/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40044 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40044/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40044/comments | https://api.github.com/repos/huggingface/transformers/issues/40044/events | https://github.com/huggingface/transformers/pull/40044 | 3,305,132,441 | PR_kwDOCUB6oc6i0o4R | 40,044 | Revert "fix `notification_service.py` about `time_spent`" | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-08T20:23:07 | 2025-08-08T20:36:50 | 2025-08-08T20:32:24 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40044",
"html_url": "https://github.com/huggingface/transformers/pull/40044",
"diff_url": "https://github.com/huggingface/transformers/pull/40044.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40044.patch",
"merged_at": "2025-08-08T20:32:24"
} | Reverts huggingface/transformers#40037
There were some edge-case errors that were not detected until a full run was triggered:
https://github.com/huggingface/transformers/actions/runs/16833364235/job/47693598035
```
File "/home/runner/work/transformers/transformers/utils/notification_service.py", line 206, in time
hours, minutes, seconds = int(time_parts[0]), int(time_parts[1]), float(time_parts[2])
ValueError: invalid literal for int() with base 10: '(0'
```
Reverting for now to avoid the next daily CI run failing to produce reports.
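The traceback suggests a stray suffix like `(0 ...` reached a naive split; a defensive parse might look like this hypothetical sketch (not the actual fix):

```python
import re

def parse_time_spent(s):
    """Extract the first h:mm:ss token instead of blindly splitting the string."""
    m = re.search(r"(\d+):(\d+):(\d+(?:\.\d+)?)", s)
    if m is None:
        return None
    hours, minutes, seconds = int(m.group(1)), int(m.group(2)), float(m.group(3))
    return hours * 3600 + minutes * 60 + seconds

print(parse_time_spent("0:01:23.5 (0 failed)"))  # → 83.5
print(parse_time_spent("(0"))                    # → None
```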
Will check and fix later. | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40044/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40044/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40043 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40043/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40043/comments | https://api.github.com/repos/huggingface/transformers/issues/40043/events | https://github.com/huggingface/transformers/pull/40043 | 3,305,079,239 | PR_kwDOCUB6oc6i0eEj | 40,043 | Add GptOssForSequenceClassification for GPT-OSS models | {
"login": "zyfedward",
"id": 5227392,
"node_id": "MDQ6VXNlcjUyMjczOTI=",
"avatar_url": "https://avatars.githubusercontent.com/u/5227392?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zyfedward",
"html_url": "https://github.com/zyfedward",
"followers_url": "https://api.github.com/users/zyfedward/followers",
"following_url": "https://api.github.com/users/zyfedward/following{/other_user}",
"gists_url": "https://api.github.com/users/zyfedward/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zyfedward/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zyfedward/subscriptions",
"organizations_url": "https://api.github.com/users/zyfedward/orgs",
"repos_url": "https://api.github.com/users/zyfedward/repos",
"events_url": "https://api.github.com/users/zyfedward/events{/privacy}",
"received_events_url": "https://api.github.com/users/zyfedward/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-08T19:59:11 | 2025-09-24T02:52:43 | 2025-08-14T16:32:15 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40043",
"html_url": "https://github.com/huggingface/transformers/pull/40043",
"diff_url": "https://github.com/huggingface/transformers/pull/40043.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40043.patch",
"merged_at": "2025-08-14T16:32:14"
} | # What does this PR do?
Add GptOssForSequenceClassification for GPT-OSS models following generic practice for #40050
Testing:
- Tested with single-node (8×A100) and multi-node (16×A100) setups; the loss curve and AUC are as expected with no issues.
- All tests passed
- `make fixup` passed
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@ArthurZucker @Rocketknight1
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40043/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40043/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40042 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40042/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40042/comments | https://api.github.com/repos/huggingface/transformers/issues/40042/events | https://github.com/huggingface/transformers/issues/40042 | 3,304,882,234 | I_kwDOCUB6oc7E_IA6 | 40,042 | Support loading glm4moe GGUF | {
"login": "adonishong",
"id": 9166212,
"node_id": "MDQ6VXNlcjkxNjYyMTI=",
"avatar_url": "https://avatars.githubusercontent.com/u/9166212?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/adonishong",
"html_url": "https://github.com/adonishong",
"followers_url": "https://api.github.com/users/adonishong/followers",
"following_url": "https://api.github.com/users/adonishong/following{/other_user}",
"gists_url": "https://api.github.com/users/adonishong/gists{/gist_id}",
"starred_url": "https://api.github.com/users/adonishong/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/adonishong/subscriptions",
"organizations_url": "https://api.github.com/users/adonishong/orgs",
"repos_url": "https://api.github.com/users/adonishong/repos",
"events_url": "https://api.github.com/users/adonishong/events{/privacy}",
"received_events_url": "https://api.github.com/users/adonishong/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2392046359,
"node_id": "MDU6TGFiZWwyMzkyMDQ2MzU5",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20Second%20Issue",
"name": "Good Second Issue",
"color": "dd935a",
"default": false,
"description": "Issues that are more difficult to do than \"Good First\" issues - give it a try if you want!"
},
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | null | [] | 2025-08-08T18:27:44 | 2025-10-01T22:22:39 | null | NONE | null | null | null | null | ### Feature request
Currently, loading GGUF versions of GLM-4.5 series MoE models raises a "GGUF model with architecture glm4moe is not supported yet" error.
### Motivation
GLM-4.5 series MoE GGUF models would run successfully. It is a fantastic model ...
### Your contribution
nope ... | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40042/reactions",
"total_count": 3,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 3,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40042/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/40041 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40041/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40041/comments | https://api.github.com/repos/huggingface/transformers/issues/40041/events | https://github.com/huggingface/transformers/pull/40041 | 3,304,824,943 | PR_kwDOCUB6oc6izqFx | 40,041 | [`GPT Big Code`] Fix attention scaling | {
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/followers",
"following_url": "https://api.github.com/users/vasqu/following{/other_user}",
"gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vasqu/subscriptions",
"organizations_url": "https://api.github.com/users/vasqu/orgs",
"repos_url": "https://api.github.com/users/vasqu/repos",
"events_url": "https://api.github.com/users/vasqu/events{/privacy}",
"received_events_url": "https://api.github.com/users/vasqu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] | closed | false | null | [] | null | [] | 2025-08-08T18:02:33 | 2025-08-11T19:02:20 | 2025-08-11T19:01:31 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40041",
"html_url": "https://github.com/huggingface/transformers/pull/40041",
"diff_url": "https://github.com/huggingface/transformers/pull/40041.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40041.patch",
"merged_at": "2025-08-11T19:01:31"
} | Reported internally cc @hmellor @zucchini-nlp | {
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/followers",
"following_url": "https://api.github.com/users/vasqu/following{/other_user}",
"gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vasqu/subscriptions",
"organizations_url": "https://api.github.com/users/vasqu/orgs",
"repos_url": "https://api.github.com/users/vasqu/repos",
"events_url": "https://api.github.com/users/vasqu/events{/privacy}",
"received_events_url": "https://api.github.com/users/vasqu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40041/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40041/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/40040 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40040/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40040/comments | https://api.github.com/repos/huggingface/transformers/issues/40040/events | https://github.com/huggingface/transformers/issues/40040 | 3,304,573,256 | I_kwDOCUB6oc7E98lI | 40,040 | Error when loading gguf file | {
"login": "ZJY0516",
"id": 86695626,
"node_id": "MDQ6VXNlcjg2Njk1NjI2",
"avatar_url": "https://avatars.githubusercontent.com/u/86695626?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ZJY0516",
"html_url": "https://github.com/ZJY0516",
"followers_url": "https://api.github.com/users/ZJY0516/followers",
"following_url": "https://api.github.com/users/ZJY0516/following{/other_user}",
"gists_url": "https://api.github.com/users/ZJY0516/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ZJY0516/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ZJY0516/subscriptions",
"organizations_url": "https://api.github.com/users/ZJY0516/orgs",
"repos_url": "https://api.github.com/users/ZJY0516/repos",
"events_url": "https://api.github.com/users/ZJY0516/events{/privacy}",
"received_events_url": "https://api.github.com/users/ZJY0516/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-08-08T16:19:45 | 2025-08-09T06:36:55 | 2025-08-09T06:36:55 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.55.0
- Platform: Linux-6.8.0-53-generic-x86_64-with-glibc2.39
- Python version: 3.13.5
- Huggingface_hub version: 0.34.4
- Safetensors version: 0.6.2
- Accelerate version: not installed
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.8.0+cu128 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA L40
### Who can help?
@SunMarc @MekkCyber
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
model_id = "google/gemma-3-27b-it"
filename = "/data/datasets/models-hf/gemma-3-27b-it-GGUF/gemma-3-27b-it-Q8_0.gguf"
tokenizer = AutoTokenizer.from_pretrained(model_id, gguf_file=filename)
model = AutoModelForCausalLM.from_pretrained(model_id, gguf_file=filename)
```
### Expected behavior
error log
```
Traceback (most recent call last):
File "/home/zjy/code/vllm-src/test_transfomer.py", line 6, in <module>
tokenizer = AutoTokenizer.from_pretrained(model_id, gguf_file=filename)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/zjy/code/vllm-src/.venv/lib/python3.12/site-packages/transformers/models/auto/tokenization_auto.py", line 1116, in from_pretrained
return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/zjy/code/vllm-src/.venv/lib/python3.12/site-packages/transformers/tokenization_utils_base.py", line 2069, in from_pretrained
return cls._from_pretrained(
^^^^^^^^^^^^^^^^^^^^^
File "/home/zjy/code/vllm-src/.venv/lib/python3.12/site-packages/transformers/tokenization_utils_base.py", line 2315, in _from_pretrained
tokenizer = cls(*init_inputs, **init_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/zjy/code/vllm-src/.venv/lib/python3.12/site-packages/transformers/models/gemma/tokenization_gemma_fast.py", line 100, in __init__
super().__init__(
File "/home/zjy/code/vllm-src/.venv/lib/python3.12/site-packages/transformers/tokenization_utils_fast.py", line 123, in __init__
gguf_param = load_gguf_checkpoint(kwargs.get("vocab_file"))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/zjy/code/vllm-src/.venv/lib/python3.12/site-packages/transformers/modeling_gguf_pytorch_utils.py", line 367, in load_gguf_checkpoint
reader = GGUFReader(gguf_checkpoint_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/zjy/code/vllm-src/.venv/lib/python3.12/site-packages/gguf/gguf_reader.py", line 133, in __init__
self.data = np.memmap(path, mode = mode)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/zjy/code/vllm-src/.venv/lib/python3.12/site-packages/numpy/_core/memmap.py", line 235, in __new__
os.fspath(filename),
^^^^^^^^^^^^^^^^^^^
TypeError: expected str, bytes or os.PathLike object, not NoneType
```
| {
"login": "ZJY0516",
"id": 86695626,
"node_id": "MDQ6VXNlcjg2Njk1NjI2",
"avatar_url": "https://avatars.githubusercontent.com/u/86695626?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ZJY0516",
"html_url": "https://github.com/ZJY0516",
"followers_url": "https://api.github.com/users/ZJY0516/followers",
"following_url": "https://api.github.com/users/ZJY0516/following{/other_user}",
"gists_url": "https://api.github.com/users/ZJY0516/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ZJY0516/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ZJY0516/subscriptions",
"organizations_url": "https://api.github.com/users/ZJY0516/orgs",
"repos_url": "https://api.github.com/users/ZJY0516/repos",
"events_url": "https://api.github.com/users/ZJY0516/events{/privacy}",
"received_events_url": "https://api.github.com/users/ZJY0516/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40040/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40040/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/40039 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/40039/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/40039/comments | https://api.github.com/repos/huggingface/transformers/issues/40039/events | https://github.com/huggingface/transformers/pull/40039 | 3,304,494,597 | PR_kwDOCUB6oc6iykQq | 40,039 | New DynamicSlidingWindowLayer & associated Cache | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-08-08T15:54:39 | 2025-08-12T12:50:49 | 2025-08-12T12:09:52 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40039",
"html_url": "https://github.com/huggingface/transformers/pull/40039",
"diff_url": "https://github.com/huggingface/transformers/pull/40039.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40039.patch",
"merged_at": "2025-08-12T12:09:52"
} | # What does this PR do?
As per the title: this avoids wasting memory for models with a sliding window. Since I don't want to reintroduce static hybrid caches by default (to avoid all the pitfalls of automatic compilation), but also don't want to waste that memory, this is definitely the way to go.
The only change needed is to pass the `config` to `DynamicCache`, so that it can parse `sliding_window`/`layer_types`. If we don't, the behavior is exactly the same as before.
See the following figures for an illustration:
- top: Mistral 7B, all layers are sliding, so the cache stops growing after reaching the window size of 4096
- bottom: Gemma 2 9B, 1 out of 2 layers is sliding, so the cache grows "sublinearly" after reaching the window size of 4096
<img width="569" height="431" alt="Screenshot 2025-08-11 at 19 48 52" src="https://github.com/user-attachments/assets/e7fb1288-7713-4140-a2b2-1af0a723f76a" />
<img width="565" height="431" alt="Screenshot 2025-08-11 at 19 55 49" src="https://github.com/user-attachments/assets/22dbffb8-2e35-4ebf-a5da-47d8587e169c" />
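The growth curves above can be reproduced with a back-of-the-envelope estimate. The helper below is a hypothetical sketch (not the actual `DynamicCache` accounting): each sliding layer stores at most `window` tokens of keys and values, while a full-attention layer stores the whole sequence, so an all-sliding model plateaus at the window size and a hybrid model grows sublinearly past it.

```python
def kv_cache_bytes(seq_len, n_full_layers, n_sliding_layers, window,
                   n_kv_heads=8, head_dim=128, dtype_bytes=2):
    """Approximate KV cache size in bytes for a hybrid model.

    Per cached token and layer we store keys + values, each of shape
    (n_kv_heads, head_dim), hence the factor of 2.
    """
    per_token = 2 * n_kv_heads * head_dim * dtype_bytes
    full = n_full_layers * seq_len * per_token                    # grows linearly
    sliding = n_sliding_layers * min(seq_len, window) * per_token  # capped at window
    return full + sliding

# All-sliding (Mistral-like): cache size is identical at 4096 and 8000 tokens.
print(kv_cache_bytes(8000, 0, 32, 4096) == kv_cache_bytes(4096, 0, 32, 4096))
# Hybrid with a tiny window: roughly half the memory of an all-full model.
print(kv_cache_bytes(8000, 16, 16, 128) / kv_cache_bytes(8000, 32, 0, 128))
```

With a half-sliding model and `window=128`, the ratio comes out just above 0.5, matching the "divide memory by 2" behavior described for GPT OSS above.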
Bonus:
- GPT OSS 20B: 1 out of 2 layers is sliding with window_size=128, so memory requirements are basically divided by 2 across the whole range (except for very small input sizes < 128, of course). The absolute cache size is lower because the model has only 24 layers and head_dim=64.
<img width="574" height="431" alt="Screenshot 2025-08-12 at 14 15 47" src="https://github.com/user-attachments/assets/44e39532-532e-40bc-8f67-a52517c47350" />
Adding the benchmark script for posterity:
```python
from transformers import AutoModelForCausalLM, DynamicCache
import torch
from tqdm import tqdm
model_name = "mistralai/Mistral-7B-v0.1"
# model_name = "google/gemma-2-9b-it"
# model_name = "openai/gpt-oss-20b"
device = 0
model = AutoModelForCausalLM.from_pretrained(model_name, device_map=device, torch_dtype=torch.bfloat16)
input_sizes = torch.linspace(50, 8000, 50).tolist()
old_sizes = []
new_sizes = []
for size in tqdm(input_sizes):
with torch.no_grad():
input = torch.randint(1000, 3000, (1, int(size)), device=device)
# initializing DynamicCache without config will use only full layers
old_output = model(input, past_key_values=DynamicCache(), logits_to_keep=1)
cache = old_output.past_key_values
tot = sum([layer.keys.numel() * 2 * layer.keys.element_size() for layer in cache.layers])
old_sizes.append(tot / 1024**3)
# Initializing it with the config will infer and use the sliding window/hybrid structure
new_output = model(input, past_key_values=DynamicCache(config=model.config), logits_to_keep=1)
cache = new_output.past_key_values
tot = sum([layer.keys.numel() * 2 * layer.keys.element_size() for layer in cache.layers])
new_sizes.append(tot / 1024**3)
import matplotlib.pyplot as plt
plt.figure()
plt.plot(input_sizes, old_sizes, "r", label="before")
plt.plot(input_sizes, new_sizes, "b", label="now")
plt.xlabel("Cache size [tokens]")
plt.ylabel("Cache memory usage [GiB]")
plt.grid()
plt.legend()
plt.show()
``` | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/40039/reactions",
"total_count": 6,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 6,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/40039/timeline | null | null | null | null | true | true |