url (string) | repository_url (string) | labels_url (string) | comments_url (string) | events_url (string) | html_url (string) | id (int64) | node_id (string) | number (int64) | title (string) | user (dict) | labels (list) | state (string) | locked (bool) | assignee (dict) | assignees (list) | milestone (null) | comments (list) | created_at (timestamp[ms]) | updated_at (timestamp[ms]) | closed_at (timestamp[ms]) | author_association (string) | type (dict) | active_lock_reason (null) | draft (bool) | pull_request (dict) | body (string) | closed_by (dict) | reactions (dict) | timeline_url (string) | performed_via_github_app (null) | state_reason (string) | sub_issues_summary (dict) | issue_dependencies_summary (dict) | is_pull_request (bool) | is_closed (bool)
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/huggingface/transformers/issues/40940
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40940/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40940/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40940/events
|
https://github.com/huggingface/transformers/pull/40940
| 3,426,395,700
|
PR_kwDOCUB6oc6pFML6
| 40,940
|
Remove nested import logic for torchvision
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T13:47:14
| 2025-09-17T17:34:30
| 2025-09-17T17:34:30
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40940",
"html_url": "https://github.com/huggingface/transformers/pull/40940",
"diff_url": "https://github.com/huggingface/transformers/pull/40940.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40940.patch",
"merged_at": "2025-09-17T17:34:30"
}
|
As the title says: remove the nested import logic, as it was causing issues when used with modular.
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40940/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40940/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40939
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40939/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40939/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40939/events
|
https://github.com/huggingface/transformers/pull/40939
| 3,426,263,085
|
PR_kwDOCUB6oc6pEvbR
| 40,939
|
[t5gemma] fix `get_text_config` and related fixes
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T13:12:22
| 2025-10-01T14:55:31
| 2025-10-01T14:55:27
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40939",
"html_url": "https://github.com/huggingface/transformers/pull/40939",
"diff_url": "https://github.com/huggingface/transformers/pull/40939.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40939.patch",
"merged_at": "2025-10-01T14:55:27"
}
|
# What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/40874, #41073, #41239
Follow-up to https://github.com/huggingface/transformers/pull/40454 and #40903
Fixes how we retrieve sub configs in `t5gemma`, and also subsequent bugs. More details in the diff :)
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40939/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40939/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40938
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40938/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40938/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40938/events
|
https://github.com/huggingface/transformers/issues/40938
| 3,426,084,950
|
I_kwDOCUB6oc7MNehW
| 40,938
|
RFC for `tokenization` in v5
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 9105758243,
"node_id": "LA_kwDOCUB6oc8AAAACHr7YIw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for_v5?",
"name": "for_v5?",
"color": "35BC94",
"default": false,
"description": ""
}
] |
open
| false
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "itazap",
"id": 31893021,
"node_id": "MDQ6VXNlcjMxODkzMDIx",
"avatar_url": "https://avatars.githubusercontent.com/u/31893021?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/itazap",
"html_url": "https://github.com/itazap",
"followers_url": "https://api.github.com/users/itazap/followers",
"following_url": "https://api.github.com/users/itazap/following{/other_user}",
"gists_url": "https://api.github.com/users/itazap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/itazap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/itazap/subscriptions",
"organizations_url": "https://api.github.com/users/itazap/orgs",
"repos_url": "https://api.github.com/users/itazap/repos",
"events_url": "https://api.github.com/users/itazap/events{/privacy}",
"received_events_url": "https://api.github.com/users/itazap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
},
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null |
[] | 2025-09-17T12:27:17
| 2025-10-13T02:00:49
| null |
COLLABORATOR
| null | null | null | null |
Sharing our plans for v5!
Right now, the distinction between tokenizers (e.g. Bart, Albert) isn't explicit: we don't know the actual algorithm (Unigram, WordPiece, etc.), and the current ConvertSlow mechanism hides this detail. Instead of relying on "convert slow," we want to make tokenizer definitions explicit (use `tokenizers` the way we use `torch` for model definitions), providing a single source of truth.
:white_check_mark: Plan for v5
1. Remove legacy artifacts
- Remove saving `special_tokens_map.json` and `added_tokens.json`
- Remove `_eventually_correct_t5_max_length` and related old code
- Drop old warnings, legacy flags, and unused TF / JAX support
- Trim unnecessary deps (protobuf, sentencepiece) and imports from core files (the added-tokens decoder etc. is only needed for sentencepiece)
2. Clean up outdated models
Remove redundant tokenizer definitions (starting with those in `convert_slow_tokenizer.py`) and use modular to isolate diffs. Use `tokenizers` for explicit definitions. `LlamaTokenizer` becomes an out-of-the-box trainable tokenizer that behaves like Llama: pass empty vocab/merges and you get a fresh one:
```python
from tokenizers import Regex, Tokenizer, decoders, normalizers, processors
from tokenizers.models import BPE

class LlamaTokenizer(PreTrainedTokenizerFast):
    def __init__(self, vocab=None, merges=None, unk_token="<u>", bos_token="<b>", eos_token="<e>"):
        self._tokenizer = Tokenizer(BPE(
            vocab,
            merges,
            unk_token=unk_token,
            fuse_unk=True,
            byte_fallback=True,
            dropout=None,
        ))
        self._tokenizer.normalizer = normalizers.Sequence([
            normalizers.Strip(left=False, right=True),
            normalizers.Replace(Regex(" {2,}"), "▁"),
        ])
        self._tokenizer.decoder = decoders.Metaspace(replacement="▁", prepend_scheme="always")
        self._tokenizer.post_processor = processors.TemplateProcessing(
            single="<b>:0 $A:0 <e>:0",
            pair="<b>:0 $A:0 <e>:0 <b>:1 $B:1 <e>:1",
            special_tokens=[
                ("<b>", self._tokenizer.token_to_id("<b>")),
                ("<e>", self._tokenizer.token_to_id("<e>")),
            ],
        )
```
The `PreTrainedTokenizerFast` will have all the nice logic for `add_eos_token`, `add_bos_token` etc.
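As a rough illustration of keeping that shared logic in one base class, here is a minimal, hypothetical sketch; `ToyFastTokenizer`, its whitespace "tokenization", and the hard-coded ids are stand-ins, not the actual transformers implementation:

```python
# Hypothetical sketch: apply `add_bos_token` / `add_eos_token` in one shared
# place on top of a bare encoder, instead of duplicating it per subclass.
class ToyFastTokenizer:
    def __init__(self, vocab, bos_id=1, eos_id=2, add_bos_token=True, add_eos_token=False):
        self.vocab = vocab                  # token -> id, toy stand-in for the BPE model
        self.bos_id = bos_id
        self.eos_id = eos_id
        self.add_bos_token = add_bos_token
        self.add_eos_token = add_eos_token

    def encode(self, text):
        ids = [self.vocab[tok] for tok in text.split()]  # toy "tokenization"
        if self.add_bos_token:
            ids = [self.bos_id] + ids       # prepend bos only when the flag says so
        if self.add_eos_token:
            ids = ids + [self.eos_id]       # same for eos
        return ids

vocab = {"hello": 10, "world": 11}
tok = ToyFastTokenizer(vocab, add_bos_token=True, add_eos_token=True)
print(tok.encode("hello world"))  # [1, 10, 11, 2]
```

Flipping either flag just removes the corresponding special token; no per-model override is needed.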
3. Simplify call hierarchy
Current flow:
`__call__ -> _call_one -> encode -> batch_encode_plus | encode_plus -> _encode_plus (slow) -> tokenize + convert_tokens_to_ids -> prepare_for_model (slow)`
This is too complex; simplify it to be reusable and maintainable, and make it friendlier for simple tokenizers (e.g. blt).
- Batch encoding/decoding: `encode` already supports batching. Update `decode` to support batch decoding.
- Reduce bloat in loading tokenizers: today we have many calls and rely on the config etc. Make it simpler.
4. Update tests: freeze integration tests, then fully rewrite & simplify, given that we expect tokenizers to "work".
5. Migration guide: provide clear instructions for moving to v5 (conversion).
6. Docs: more and better docs about training tokenizers!
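The flattened call path and batch-aware `decode` described in the plan can be sketched as follows. This is a minimal, hypothetical illustration (`MiniTokenizer` and its whitespace "tokenization" are stand-ins, not the real transformers API):

```python
# Hypothetical sketch: one dispatch point in __call__ instead of the
# _call_one / encode_plus / batch_encode_plus chain, plus a decode that
# accepts either a single sequence of ids or a batch of them.
class MiniTokenizer:
    def __init__(self, vocab):
        self.vocab = vocab
        self.ids_to_tokens = {i: t for t, i in vocab.items()}

    def __call__(self, text):
        # str -> one sequence of ids; list[str] -> list of sequences.
        if isinstance(text, str):
            return [self.vocab[t] for t in text.split()]
        return [self(t) for t in text]

    def decode(self, ids):
        # A list of lists decodes element-wise (batch decoding).
        if ids and isinstance(ids[0], list):
            return [self.decode(seq) for seq in ids]
        return " ".join(self.ids_to_tokens[i] for i in ids)

tok = MiniTokenizer({"hi": 3, "there": 4})
print(tok(["hi", "there hi"]))   # [[3], [4, 3]]
print(tok.decode([[3], [4]]))    # ['hi', 'there']
```

The point is that batching is handled by type dispatch at the entry points rather than by a parallel set of `batch_*` methods.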
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40938/reactions",
"total_count": 22,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 16,
"confused": 0,
"heart": 6,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40938/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/40937
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40937/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40937/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40937/events
|
https://github.com/huggingface/transformers/pull/40937
| 3,426,076,147
|
PR_kwDOCUB6oc6pEHBd
| 40,937
|
Remove repeated import
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T12:24:40
| 2025-09-22T13:02:58
| 2025-09-22T12:57:13
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40937",
"html_url": "https://github.com/huggingface/transformers/pull/40937",
"diff_url": "https://github.com/huggingface/transformers/pull/40937.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40937.patch",
"merged_at": "2025-09-22T12:57:13"
}
|
# What does this PR do?
As the title says.
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40937/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40937/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40936
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40936/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40936/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40936/events
|
https://github.com/huggingface/transformers/pull/40936
| 3,425,973,764
|
PR_kwDOCUB6oc6pDwsA
| 40,936
|
rm slow tokenizers
|
{
"login": "itazap",
"id": 31893021,
"node_id": "MDQ6VXNlcjMxODkzMDIx",
"avatar_url": "https://avatars.githubusercontent.com/u/31893021?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/itazap",
"html_url": "https://github.com/itazap",
"followers_url": "https://api.github.com/users/itazap/followers",
"following_url": "https://api.github.com/users/itazap/following{/other_user}",
"gists_url": "https://api.github.com/users/itazap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/itazap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/itazap/subscriptions",
"organizations_url": "https://api.github.com/users/itazap/orgs",
"repos_url": "https://api.github.com/users/itazap/repos",
"events_url": "https://api.github.com/users/itazap/events{/privacy}",
"received_events_url": "https://api.github.com/users/itazap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-09-17T11:56:18
| 2025-10-29T14:28:08
| null |
COLLABORATOR
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40936",
"html_url": "https://github.com/huggingface/transformers/pull/40936",
"diff_url": "https://github.com/huggingface/transformers/pull/40936.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40936.patch",
"merged_at": null
}
|
Llama POC for simplifying tokenizers (https://github.com/huggingface/transformers/issues/40938)
- no slow tokenizer
- `TokenizerFast` becomes just `Tokenizer`
- 2 options for loading a tokenizer: 1) from `tokenizer.json`, 2) a trainable new "blank" tokenizer
TODO:
- for llama we have legacy behaviour in tokenizer.json, support?
- organize different backends (spm, tiktoken, etc.)
Tests will be redone, but for now I'm keeping the tester mixin and removed:
- comparing slow and fast tokenizers when we already test integration
- tests for slow-only methods like `prepare_for_model` and `build_inputs_with_special_tokens` (done by the fast template processor)
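The two loading options can be sketched minimally, under the assumption that the presence of a `tokenizer.json` file decides the path (`ToyTokenizer` and this `from_pretrained` are illustrative stand-ins, not the actual API):

```python
import json
import os

# Hypothetical sketch: load a serialized tokenizer.json if one exists,
# otherwise return a fresh "blank" tokenizer ready to be trained.
class ToyTokenizer:
    def __init__(self, vocab=None):
        # An empty vocab means a blank, trainable tokenizer.
        self.vocab = vocab if vocab is not None else {}

    @classmethod
    def from_pretrained(cls, directory):
        path = os.path.join(directory, "tokenizer.json")
        if os.path.exists(path):
            # Option 1: deserialize the explicit definition from tokenizer.json.
            with open(path) as f:
                return cls(vocab=json.load(f)["vocab"])
        # Option 2: no file found -> blank tokenizer.
        return cls()
```

Both paths go through the same constructor, so there is no separate "slow" loading branch to maintain.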
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40936/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40936/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/40935
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40935/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40935/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40935/events
|
https://github.com/huggingface/transformers/pull/40935
| 3,425,842,454
|
PR_kwDOCUB6oc6pDUfQ
| 40,935
|
[i18n-bn] Add Bengali language README file
|
{
"login": "saidurpulok",
"id": 59414463,
"node_id": "MDQ6VXNlcjU5NDE0NDYz",
"avatar_url": "https://avatars.githubusercontent.com/u/59414463?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/saidurpulok",
"html_url": "https://github.com/saidurpulok",
"followers_url": "https://api.github.com/users/saidurpulok/followers",
"following_url": "https://api.github.com/users/saidurpulok/following{/other_user}",
"gists_url": "https://api.github.com/users/saidurpulok/gists{/gist_id}",
"starred_url": "https://api.github.com/users/saidurpulok/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saidurpulok/subscriptions",
"organizations_url": "https://api.github.com/users/saidurpulok/orgs",
"repos_url": "https://api.github.com/users/saidurpulok/repos",
"events_url": "https://api.github.com/users/saidurpulok/events{/privacy}",
"received_events_url": "https://api.github.com/users/saidurpulok/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T11:16:41
| 2025-09-22T16:51:40
| 2025-09-22T16:51:39
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40935",
"html_url": "https://github.com/huggingface/transformers/pull/40935",
"diff_url": "https://github.com/huggingface/transformers/pull/40935.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40935.patch",
"merged_at": "2025-09-22T16:51:39"
}
|
Added Bengali (বাংলা) localization:
- New file README_bn.md (concise translation of the main README structure)
- Added a বাংলা link to the root README language selector
Scope: docs-only; no code or tests impacted, no dependencies.
Motivation: improve accessibility for the Bengali-speaking community.
Reviewer suggestion: @stevhliu (docs).
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40935/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40935/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40934
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40934/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40934/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40934/events
|
https://github.com/huggingface/transformers/pull/40934
| 3,425,803,510
|
PR_kwDOCUB6oc6pDL8-
| 40,934
|
[models] remove unused `import torch.utils.checkpoint`
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T11:04:40
| 2025-09-17T15:38:01
| 2025-09-17T15:37:56
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40934",
"html_url": "https://github.com/huggingface/transformers/pull/40934",
"diff_url": "https://github.com/huggingface/transformers/pull/40934.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40934.patch",
"merged_at": "2025-09-17T15:37:56"
}
|
# What does this PR do?
(See title)
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40934/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40934/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40933
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40933/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40933/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40933/events
|
https://github.com/huggingface/transformers/issues/40933
| 3,425,776,920
|
I_kwDOCUB6oc7MMTUY
| 40,933
|
`UserWarning: `seed_generator` is deprecated and will be removed in a future version.`
|
{
"login": "khteh",
"id": 3871483,
"node_id": "MDQ6VXNlcjM4NzE0ODM=",
"avatar_url": "https://avatars.githubusercontent.com/u/3871483?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/khteh",
"html_url": "https://github.com/khteh",
"followers_url": "https://api.github.com/users/khteh/followers",
"following_url": "https://api.github.com/users/khteh/following{/other_user}",
"gists_url": "https://api.github.com/users/khteh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/khteh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/khteh/subscriptions",
"organizations_url": "https://api.github.com/users/khteh/orgs",
"repos_url": "https://api.github.com/users/khteh/repos",
"events_url": "https://api.github.com/users/khteh/events{/privacy}",
"received_events_url": "https://api.github.com/users/khteh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T10:55:51
| 2025-09-17T12:28:51
| 2025-09-17T12:28:51
|
NONE
| null | null | null | null |
`transformers==4.56.1`
```
/home/khteh/.local/share/virtualenvs/pAIthon-GaqEDHQT/lib/python3.13/site-packages/transformers/generation/tf_utils.py:465: UserWarning: `seed_generator` is deprecated and will be removed in a future version.
warnings.warn("`seed_generator` is deprecated and will be removed in a future version.", UserWarning)
```
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40933/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40933/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40932
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40932/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40932/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40932/events
|
https://github.com/huggingface/transformers/issues/40932
| 3,425,733,859
|
I_kwDOCUB6oc7MMIzj
| 40,932
|
Inconsistent handling of tokenizer bos_token
|
{
"login": "fxmarty-amd",
"id": 180171742,
"node_id": "U_kgDOCr0z3g",
"avatar_url": "https://avatars.githubusercontent.com/u/180171742?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fxmarty-amd",
"html_url": "https://github.com/fxmarty-amd",
"followers_url": "https://api.github.com/users/fxmarty-amd/followers",
"following_url": "https://api.github.com/users/fxmarty-amd/following{/other_user}",
"gists_url": "https://api.github.com/users/fxmarty-amd/gists{/gist_id}",
"starred_url": "https://api.github.com/users/fxmarty-amd/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fxmarty-amd/subscriptions",
"organizations_url": "https://api.github.com/users/fxmarty-amd/orgs",
"repos_url": "https://api.github.com/users/fxmarty-amd/repos",
"events_url": "https://api.github.com/users/fxmarty-amd/events{/privacy}",
"received_events_url": "https://api.github.com/users/fxmarty-amd/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 1834056635,
"node_id": "MDU6TGFiZWwxODM0MDU2NjM1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Core:%20Tokenization",
"name": "Core: Tokenization",
"color": "FF4446",
"default": false,
"description": "Internals of the library; Tokenization."
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
},
{
"id": 9105758243,
"node_id": "LA_kwDOCUB6oc8AAAACHr7YIw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for_v5?",
"name": "for_v5?",
"color": "35BC94",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T10:43:21
| 2025-10-26T08:02:20
| 2025-10-26T08:02:20
|
CONTRIBUTOR
| null | null | null | null |
### System Info
```
- `transformers` version: 4.55.4
- Platform: Linux-6.8.0-78-generic-x86_64-with-glibc2.39
- Python version: 3.12.11
- Huggingface_hub version: 0.34.6
- Safetensors version: 0.6.2
- Accelerate version: 1.10.1
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.8.0+rocm6.4 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: AMD Instinct MI300X
```
### Who can help?
@ArthurZucker @itazap
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Hi,
I noticed that `tokenizer.add_bos_token` is handled inconsistently across architectures in Transformers. Specifically, some architectures do not have an attribute `tokenizer.add_bos_token`, yet they may or may not add a bos_token (e.g. `meta-llama/Llama-3.1-70B-Instruct` and `openai/gpt-oss-20b` below).
Moreover, for the tokenizers that do have the attribute, the `add_bos_token` key is not always present in tokenizer_config.json (e.g. `HuggingFaceTB/SmolLM-135M` below).
<img width="2098" height="425" alt="Image" src="https://github.com/user-attachments/assets/5b64bb2b-b298-4988-8199-32ab57b5a048" />
This is an issue as AFAIK we are currently not able to tell from the tokenizer_config.json or from a loaded tokenizer whether a model was trained with bos tokens and whether bos tokens are effectively used by default during inference in Transformers (e.g. llama & gpt-oss inconsistency).
Related: https://github.com/EleutherAI/lm-evaluation-harness/issues/3295
The issue also exists in lighteval: https://github.com/huggingface/lighteval/blob/16318bb4cc736cfb29d72eb31bd19c55d0283565/src/lighteval/models/nanotron/nanotron_model.py#L342-L343
Thank you!
```python
import json

import huggingface_hub
import torch
from transformers import AutoTokenizer

model_ids = [
    "Qwen/Qwen1.5-MoE-A2.7B-Chat",
    "meta-llama/Llama-3.1-70B-Instruct",
    "HuggingFaceTB/SmolLM-135M",
    "Qwen/Qwen2.5-0.5B-Instruct",
    "microsoft/Phi-4-mini-instruct",
    "deepseek-ai/DeepSeek-R1",
    "meta-llama/Llama-2-7b-chat-hf",
    "unsloth/Llama-3.2-1B-Instruct",
    "facebook/opt-125m",
    "openai/gpt-oss-20b",
    "google/gemma-3-270m",
    "RedHatAI/Llama-4-Scout-17B-16E-Instruct-FP8-dynamic",
    "NousResearch/Hermes-3-Llama-3.1-405B",
]
results = {}
for model_id in model_ids:
    results[model_id] = {}
    print("\n-----", model_id)
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    prompt = "Today I am in Paris"
    inp = tokenizer(prompt, return_tensors="pt")
    results[model_id]["tokenizer.add_bos_token"] = getattr(tokenizer, "add_bos_token", "no attribute")
    print(" tokenizer.add_bos_token:", results[model_id]["tokenizer.add_bos_token"])
    results[model_id]["tokenizer.bos_token_id"] = getattr(tokenizer, "bos_token_id", "no attribute")
    print(" tokenizer.bos_token_id:", results[model_id]["tokenizer.bos_token_id"])
    filepath = huggingface_hub.hf_hub_download(model_id, "tokenizer_config.json")
    with open(filepath) as f:
        tokenizer_config = json.load(f)
    is_in_tokenizer_config = "add_bos_token" in tokenizer_config
    val = tokenizer_config.get("add_bos_token", "N/A")
    results[model_id]["'add_bos_token' key is in tokenizer_config.json"] = is_in_tokenizer_config
    results[model_id]["tokenizer_config.json's add_bos_token value"] = val
    print(f" `'add_bos_token'` key is in tokenizer_config.json: {is_in_tokenizer_config} (value: {val})")
    if hasattr(tokenizer, "bos_token_id") and tokenizer.bos_token_id is not None:
        is_added = (torch.sum(inp["input_ids"] == tokenizer.bos_token_id) > 0).item()
        print(" bos_token_id added by default:", is_added)
    else:
        is_added = False
        print(" bos_token_id added by default: False")
    results[model_id]["bos_token_id added by default in sequence"] = is_added
```
### Expected behavior
Consistent way to detect whether bos tokens are used by default during tokenization
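Until such a consistent mechanism exists, one heuristic is to tokenize a plain string with special tokens enabled and check whether the first id equals `bos_token_id`. A minimal sketch (not an official transformers API; the helper name is made up):

```python
def adds_bos_by_default(tokenizer) -> bool:
    """Heuristic: tokenize a plain string and check whether the first id
    produced is the tokenizer's bos_token_id."""
    bos_id = getattr(tokenizer, "bos_token_id", None)
    if bos_id is None:
        return False
    ids = tokenizer("Today I am in Paris", add_special_tokens=True)["input_ids"]
    return len(ids) > 0 and ids[0] == bos_id
```

This works with any callable tokenizer that exposes `bos_token_id`, but it only observes the default behavior; it still cannot tell whether the model was *trained* with bos tokens.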
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40932/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40932/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40931
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40931/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40931/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40931/events
|
https://github.com/huggingface/transformers/pull/40931
| 3,425,639,382
|
PR_kwDOCUB6oc6pCneC
| 40,931
|
🚨 [unbloating] unify `TypedDict` usage in processing
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T10:16:12
| 2025-10-03T12:17:59
| 2025-10-03T12:17:59
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40931",
"html_url": "https://github.com/huggingface/transformers/pull/40931",
"diff_url": "https://github.com/huggingface/transformers/pull/40931.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40931.patch",
"merged_at": "2025-10-03T12:17:59"
}
|
# What does this PR do?
This PR refactors how `TypedDicts` are used across processing classes to cut down duplication and avoid mismatches. Key updates:
* We previously had two separate “base TypedDicts” for images (one in `processing`, one in `fast image processing`). They were identical, both defining the same kwargs. Now we keep a single copy and just import it where needed.
* For models with non-standard kwargs, we often forget to define new `ModelVideosKwargs` / `ModelImagesKwargs`, which must exist for kwargs to merge properly with `ModelProcessingKwargs`. This PR removes that manual step: we now obtain them dynamically at runtime from the preprocessor class attributes.
* The base slow image processor now also exposes a `valid_kwargs` attribute as a typed dict. With this, both slow and fast image processors share a consistent view of the available kwargs.
* Redundant code where the overridden part matched the Mixin defaults has been deleted.
* The call method on slow image processors now also shows hints for kwargs:
<img width="606" height="378" alt="image" src="https://github.com/user-attachments/assets/195f8122-1eba-4e08-b6c3-346acbbfe062" />
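The runtime derivation of a kwargs `TypedDict` from a preprocessor's annotated attributes could look roughly like this (an illustrative sketch with hypothetical names, not the actual implementation in this PR):

```python
from typing import TypedDict, get_type_hints


def make_kwargs_typeddict(name, preprocessor_cls):
    """Build a non-total TypedDict from a preprocessor's class annotations,
    so the valid kwargs are declared once on the class and derived from there."""
    hints = get_type_hints(preprocessor_cls)
    return TypedDict(name, hints, total=False)


class DummyImageProcessor:
    # Annotated attributes stand in for a real image processor's valid kwargs.
    do_resize: bool
    size: int


ImagesKwargs = make_kwargs_typeddict("ImagesKwargs", DummyImageProcessor)
```

A kwargs dict typed as `ImagesKwargs` then gets IDE/type-checker hints for `do_resize` and `size` without a hand-written TypedDict per model.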
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40931/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40931/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40930
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40930/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40930/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40930/events
|
https://github.com/huggingface/transformers/pull/40930
| 3,425,418,045
|
PR_kwDOCUB6oc6pB3JA
| 40,930
|
Fix `Glm4vMoeIntegrationTest`
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T09:16:26
| 2025-09-17T16:21:20
| 2025-09-17T16:21:19
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40930",
"html_url": "https://github.com/huggingface/transformers/pull/40930",
"diff_url": "https://github.com/huggingface/transformers/pull/40930.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40930.patch",
"merged_at": "2025-09-17T16:21:18"
}
|
# What does this PR do?
This integration test class takes > 3 hours to finish.
https://github.com/huggingface/transformers/actions/runs/17784986682/job/50551078690
The model is very large (despite being MoE), and the tests load it by offloading to cpu/disk.
Even with `max_new_tokens=10`, one test already takes 16 minutes.
This PR combines several tests into one and reduces the total number of tests to only 3.
The whole integration test class now runs in 30 minutes (still slow, however).
The disadvantage is that we don't have more complete outputs to compare with.
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40930/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40930/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40929
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40929/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40929/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40929/events
|
https://github.com/huggingface/transformers/pull/40929
| 3,425,367,129
|
PR_kwDOCUB6oc6pBsI5
| 40,929
|
Minor fix for #40727
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T09:02:19
| 2025-09-17T09:42:16
| 2025-09-17T09:42:14
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40929",
"html_url": "https://github.com/huggingface/transformers/pull/40929",
"diff_url": "https://github.com/huggingface/transformers/pull/40929.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40929.patch",
"merged_at": "2025-09-17T09:42:14"
}
|
# What does this PR do?
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40929/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40929/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40928
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40928/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40928/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40928/events
|
https://github.com/huggingface/transformers/pull/40928
| 3,424,844,576
|
PR_kwDOCUB6oc6o_7kT
| 40,928
|
🚨Refactor: Update text2text generation pipelines to use max_new_tokens…
|
{
"login": "lilin-1",
"id": 177207022,
"node_id": "U_kgDOCo_27g",
"avatar_url": "https://avatars.githubusercontent.com/u/177207022?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lilin-1",
"html_url": "https://github.com/lilin-1",
"followers_url": "https://api.github.com/users/lilin-1/followers",
"following_url": "https://api.github.com/users/lilin-1/following{/other_user}",
"gists_url": "https://api.github.com/users/lilin-1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lilin-1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lilin-1/subscriptions",
"organizations_url": "https://api.github.com/users/lilin-1/orgs",
"repos_url": "https://api.github.com/users/lilin-1/repos",
"events_url": "https://api.github.com/users/lilin-1/events{/privacy}",
"received_events_url": "https://api.github.com/users/lilin-1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T06:18:18
| 2025-09-24T11:55:33
| 2025-09-24T11:54:55
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40928",
"html_url": "https://github.com/huggingface/transformers/pull/40928",
"diff_url": "https://github.com/huggingface/transformers/pull/40928.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40928.patch",
"merged_at": "2025-09-24T11:54:55"
}
|
---
name: Pull Request
about: Create a pull request to contribute to 🤗 Transformers
title: "Refactor: Update text2text generation pipelines to use max_new_tokens and resolve max_length warning"
labels: bug, summarization
assignees: ''
---
## Related Issue
Closes #40768
## Summary
This PR fixes an incorrect warning in `SummarizationPipeline` where a `max_length`-based message is emitted even when `max_new_tokens` is explicitly provided and used. The change standardizes the logic and messaging to consistently respect and reference `max_new_tokens` so users receive accurate guidance.
## Reproduction
1. Case A — Provide `max_new_tokens` (recommended):
```python
from transformers import pipeline, AutoTokenizer
summarizer = pipeline("summarization", model="Falconsai/text_summarization", device_map="auto")
text = (
"This is a long article about the history of artificial intelligence, covering its origins, "
"key milestones, and future prospects. It delves into various subfields like machine learning, "
"natural language processing, and computer vision, highlighting their impact on modern technology "
"and society. The article also discusses the ethical considerations and challenges associated with "
"the rapid advancement of AI."
)
# Optional: show tokenized input length
tokenizer = AutoTokenizer.from_pretrained("Falconsai/text_summarization")
input_length = len(tokenizer.encode(text, add_special_tokens=False))
print(f"Input text length (after tokenization): {input_length}")
outputs = summarizer([text], max_new_tokens=64, min_length=10, do_sample=False)
print(outputs)
```
You may observe a warning that references `max_length` even though `max_new_tokens` is set.
2. Case B — Provide only `max_length` when a default `max_new_tokens` exists:
```python
outputs = summarizer([text], max_length=200, do_sample=False)
```
In pipelines/configurations where a default `max_new_tokens` is present (e.g., via `generation_config` or the pipeline default), generation is still constrained by that `max_new_tokens` value, effectively overriding `max_length`.
## Observed Behavior
- A warning suggests tuning `max_length` with a value derived from the input length (for example, showing `max_length=200`) although generation is in fact constrained by `max_new_tokens` (e.g., 64). The generated summary respects `max_new_tokens`, but the warning is confusing and misleading.
- When only `max_length` is provided but a default `max_new_tokens` exists, generation remains bounded by that default `max_new_tokens`.
## Background and Rationale
- The input-checking code dates back to an older generation logic where `max_length` was commonly the controlling parameter. Over time, generation APIs encouraged users to prefer `max_new_tokens`. However, the checker can still read model/generation defaults (for example, a default `max_length` from `generation_config`) even when runtime priority is already overridden by a user-specified `max_new_tokens`. This causes the checker to infer an outdated `max_length` and emit a misleading warning.
- In current generation behavior, when both `max_length` and `max_new_tokens` are provided, `max_new_tokens` takes precedence and effectively constrains the output length. Setting `max_length` alongside `max_new_tokens` is therefore not meaningful and may trigger additional prompts/warnings elsewhere. This is why the check is updated to reflect that precedence and avoid encouraging users to tune `max_length` when `max_new_tokens` is present.
- Additionally, in pipelines/configs where a default `max_new_tokens` exists, providing only `max_length` still leaves the effective generation limit governed by `max_new_tokens` (default).
## Implementation Details
- **Update `check_inputs` methods in `SummarizationPipeline`, `TranslationPipeline`, and other relevant text2text generation pipelines**:
- These methods will be modified to accurately determine the effective `max_new_tokens` value, taking into account user-provided arguments, model defaults, and generation configuration.
- The warning logic will be adjusted to:
- Only reference `max_new_tokens` in warnings related to output length constraints.
- Avoid emitting misleading warnings based on outdated `max_length` values from model defaults when `max_new_tokens` is the actual controlling parameter.
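The precedence the updated check follows could be sketched as below (function and parameter names are hypothetical, not the pipeline's actual code):

```python
def resolve_effective_max_new_tokens(max_new_tokens, max_length, generation_config, input_length):
    """Mirror generate()'s precedence when determining the new-token budget:
    an explicit max_new_tokens wins, then a config-level default, and only
    then is max_length converted to a budget by subtracting the prompt length."""
    if max_new_tokens is not None:
        return max_new_tokens
    config_default = getattr(generation_config, "max_new_tokens", None)
    if config_default is not None:
        return config_default
    if max_length is not None:
        return max_length - input_length
    return None
```

With this resolution in hand, the warning can reference whichever parameter actually constrains the output instead of defaulting to `max_length`.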
## Impact and Compatibility
- No breaking changes. Behavior of generation is unchanged; only the warning logic and message are corrected to match `max_new_tokens`.
- Backwards-compatible with existing scripts; users will see clearer, more relevant warnings.
## Tests and Quality
- Added/updated tests to cover:
- Precedence and correctness when `max_new_tokens` is provided.
- Warning text alignment with `max_new_tokens`.
- The case where only `max_length` is provided but a default `max_new_tokens` exists.
- Local quality checks executed per contributor guide:
- Style/formatting: `make fixup` (or `make style`) ran cleanly.
- Lint/quality: `make quality` ran without issues on the modified files.
## Notes for Reviewers
This is submitted by a beginner contributor. I would greatly appreciate feedback and a response, whether the PR is accepted or not. If you decide not to merge, please share the reasons so I can learn and improve future contributions. Thank you for your time and guidance!
## PR Checklist
Please verify the following before submitting the PR:
- [x] Pull Request title summarizes the contribution.
- [x] Pull Request description links the associated Issue (e.g., `Closes #40768`).
- [x] Existing tests pass locally.
- [x] Added or updated tests where appropriate.
- [x] All public methods have informative docstrings.
- [x] No large non-text assets (images/videos) are added to the repository.
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40928/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40928/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40927
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40927/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40927/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40927/events
|
https://github.com/huggingface/transformers/issues/40927
| 3,424,652,563
|
I_kwDOCUB6oc7MIA0T
| 40,927
|
PreTrainedTokenizer requires self.get_vocab() which is no longer implemented
|
{
"login": "eugenekwaNeuromics",
"id": 163505331,
"node_id": "U_kgDOCb7ksw",
"avatar_url": "https://avatars.githubusercontent.com/u/163505331?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eugenekwaNeuromics",
"html_url": "https://github.com/eugenekwaNeuromics",
"followers_url": "https://api.github.com/users/eugenekwaNeuromics/followers",
"following_url": "https://api.github.com/users/eugenekwaNeuromics/following{/other_user}",
"gists_url": "https://api.github.com/users/eugenekwaNeuromics/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eugenekwaNeuromics/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eugenekwaNeuromics/subscriptions",
"organizations_url": "https://api.github.com/users/eugenekwaNeuromics/orgs",
"repos_url": "https://api.github.com/users/eugenekwaNeuromics/repos",
"events_url": "https://api.github.com/users/eugenekwaNeuromics/events{/privacy}",
"received_events_url": "https://api.github.com/users/eugenekwaNeuromics/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T04:54:53
| 2025-10-01T13:08:57
| 2025-10-01T13:08:57
|
NONE
| null | null | null | null |
### System Info
Hi, there is an issue with the `PreTrainedTokenizer` class in the current transformers v4.56.1. Line 1516 of `tokenization_utils_base.py` currently raises a `NotImplementedError` in `PreTrainedTokenizerBase.get_vocab(self)`. However, `PreTrainedTokenizer` still requires the `.get_vocab` function in two places in `tokenization_utils.py`:
- Line 546 for `PreTrainedTokenizer._add_tokens(self, new_tokens, special_tokens)`
`current_vocab = self.get_vocab().copy()`
- Line 510 for `PreTrainedTokenizer._update_total_vocab_size(self)`
`self.total_vocab_size = len(self.get_vocab())`
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Bug encountered when attempting to use the `CharacterTokenizer` class for [standalone_hyenadna.py](https://github.com/HazyResearch/hyena-dna/blob/main/standalone_hyenadna.py). The code requires the `PreTrainedTokenizer` class from `transformers.tokenization_utils`. The problem is encountered upon `super().__init__(...)`; traceback below:
.../lib/python3.12/site-packages/transformers/tokenization_utils.py:438 in PreTrainedTokenizer.__init__(self, **kwargs)
.../lib/python3.12/site-packages/transformers/tokenization_utils.py:546 in PreTrainedTokenizer._add_tokens(self, new_tokens, special_tokens)
.../lib/python3.12/site-packages/transformers/tokenization_utils_base.py:1516 in PreTrainedTokenizerBase.get_vocab(self)
### Expected behavior
`PreTrainedTokenizerBase` no longer implements `.get_vocab`, but `PreTrainedTokenizer` still requires it.
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40927/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40927/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40926
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40926/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40926/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40926/events
|
https://github.com/huggingface/transformers/issues/40926
| 3,424,562,789
|
I_kwDOCUB6oc7MHq5l
| 40,926
|
Contrastive search doesn't work on Gemma3
|
{
"login": "jood-canva",
"id": 206628664,
"node_id": "U_kgDODFDnOA",
"avatar_url": "https://avatars.githubusercontent.com/u/206628664?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jood-canva",
"html_url": "https://github.com/jood-canva",
"followers_url": "https://api.github.com/users/jood-canva/followers",
"following_url": "https://api.github.com/users/jood-canva/following{/other_user}",
"gists_url": "https://api.github.com/users/jood-canva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jood-canva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jood-canva/subscriptions",
"organizations_url": "https://api.github.com/users/jood-canva/orgs",
"repos_url": "https://api.github.com/users/jood-canva/repos",
"events_url": "https://api.github.com/users/jood-canva/events{/privacy}",
"received_events_url": "https://api.github.com/users/jood-canva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T03:59:59
| 2025-09-17T12:12:46
| 2025-09-17T12:12:11
|
NONE
| null | null | null | null |
### System Info
- `transformers` version: 4.53.3
- Platform: Linux-6.8.0-1036-aws-x86_64-with-glibc2.35
- Python version: 3.11.10
- Huggingface_hub version: 0.34.3
- Safetensors version: 0.4.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1+cu126 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: No
- Using GPU in script?: Yes
- GPU type: NVIDIA L40S
### Who can help?
Hi @zucchini-nlp and @gante
I think there is a bug in the new way contrastive search has been implemented. Following https://github.com/huggingface/transformers/pull/40428 we now have to use the community package https://huggingface.co/transformers-community/contrastive-search . However, I am getting the error
```
File "/home/coder/work/canva/tools/build/python/third_party/.venv/lib/python3.11/site-packages/transformers/generation/utils.py", line 2363, in generate
custom_generate_function = self.load_custom_generate(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/coder/work/canva/tools/build/python/third_party/.venv/lib/python3.11/site-packages/transformers/generation/utils.py", line 417, in load_custom_generate
is_local_code = os.path.exists(pretrained_model_name_or_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen genericpath>", line 19, in exists
TypeError: stat: path should be string, bytes, os.PathLike or integer, not function
```
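The final frame boils down to `os.path.exists` being handed a function object where a path string was expected; `os.path.exists` swallows `OSError`/`ValueError` but lets `TypeError` propagate. Stripped of everything transformers-specific, the same failure reproduces with (the function name below is just a placeholder):

```python
import os


def custom_generate_fn():
    """Placeholder for whatever callable ends up where a path was expected."""


try:
    # os.stat cannot handle a function object, so a TypeError escapes,
    # matching the last frame of the traceback above
    os.path.exists(custom_generate_fn)
except TypeError as err:
    print(type(err).__name__)
```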
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Run the following snippet:
```python
import torch
from transformers import AutoProcessor, Gemma3ForConditionalGeneration
ckpt = "google/gemma-3-4b-it"
model = Gemma3ForConditionalGeneration.from_pretrained(
ckpt, device_map="auto", torch_dtype=torch.bfloat16,
)
processor = AutoProcessor.from_pretrained(ckpt)
messages = [
{
"role": "user",
"content": [
{"type": "image", "url": "https://huggingface.co/spaces/big-vision/paligemma-hf/resolve/main/examples/password.jpg"},
{"type": "text", "text": "What is the password?"}
]
}
]
inputs = processor.apply_chat_template(
messages, add_generation_prompt=True, tokenize=True,
return_dict=True, return_tensors="pt"
).to(model.device)
gen_out = model.generate(
**inputs,
custom_generate="transformers-community/contrastive-search",
penalty_alpha=0.6,
top_k=4,
max_new_tokens=128,
trust_remote_code=True,
)
```
### Expected behavior
It should be able to generate? I don't understand why the custom generate function throws an error here.
Sorry if I'm missing something obvious!
|
{
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40926/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40926/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40925
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40925/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40925/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40925/events
|
https://github.com/huggingface/transformers/pull/40925
| 3,424,538,434
|
PR_kwDOCUB6oc6o-6b8
| 40,925
|
Fix outdated torch version check
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T03:47:43
| 2025-09-22T12:53:03
| 2025-09-22T12:38:08
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40925",
"html_url": "https://github.com/huggingface/transformers/pull/40925",
"diff_url": "https://github.com/huggingface/transformers/pull/40925.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40925.patch",
"merged_at": "2025-09-22T12:38:08"
}
|
# What does this PR do?
Fix outdated torch version check.
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40925/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40925/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40924
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40924/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40924/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40924/events
|
https://github.com/huggingface/transformers/pull/40924
| 3,424,276,079
|
PR_kwDOCUB6oc6o-BKZ
| 40,924
|
Don't list dropout in eager_paged_attention_forward
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T02:01:41
| 2025-09-18T10:22:21
| 2025-09-18T09:05:50
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40924",
"html_url": "https://github.com/huggingface/transformers/pull/40924",
"diff_url": "https://github.com/huggingface/transformers/pull/40924.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40924.patch",
"merged_at": "2025-09-18T09:05:50"
}
|
# What does this PR do?
The `dropout` argument is not used in `eager_paged_attention_forward`.
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40924/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40924/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40923
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40923/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40923/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40923/events
|
https://github.com/huggingface/transformers/pull/40923
| 3,424,176,382
|
PR_kwDOCUB6oc6o9sru
| 40,923
|
Wait for main process in _save_checkpoint to ensure best checkpoint exists
|
{
"login": "ssharpe42",
"id": 8136905,
"node_id": "MDQ6VXNlcjgxMzY5MDU=",
"avatar_url": "https://avatars.githubusercontent.com/u/8136905?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ssharpe42",
"html_url": "https://github.com/ssharpe42",
"followers_url": "https://api.github.com/users/ssharpe42/followers",
"following_url": "https://api.github.com/users/ssharpe42/following{/other_user}",
"gists_url": "https://api.github.com/users/ssharpe42/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ssharpe42/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ssharpe42/subscriptions",
"organizations_url": "https://api.github.com/users/ssharpe42/orgs",
"repos_url": "https://api.github.com/users/ssharpe42/repos",
"events_url": "https://api.github.com/users/ssharpe42/events{/privacy}",
"received_events_url": "https://api.github.com/users/ssharpe42/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-17T00:56:37
| 2025-09-30T09:41:04
| 2025-09-30T09:41:03
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40923",
"html_url": "https://github.com/huggingface/transformers/pull/40923",
"diff_url": "https://github.com/huggingface/transformers/pull/40923.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40923.patch",
"merged_at": "2025-09-30T09:41:03"
}
|
# What does this PR do?
In 4.50.0, a refactor of the best-checkpoint process introduced a bug that appears when we try to load the best model at the end of training and a non-main process has a null `self.state.best_model_checkpoint`.
Before we run `_load_best_model()`, there is a barrier to make sure the main process saves the model; however, inside `_save_checkpoint`, this barrier is missing. The model is saved [here](https://github.com/huggingface/transformers/blob/0b057e66b52556da3a1cbc29e2a98c0784ea9c33/src/transformers/trainer.py#L3197), and a couple of lines later `self.state.best_model_checkpoint` is set only if the `best_checkpoint_dir` exists. When another process gets to this line first, its `self.state.best_model_checkpoint` will be null.
Finally, when training reaches the end and it is time to load the best model [here](https://github.com/huggingface/transformers/blob/0b057e66b52556da3a1cbc29e2a98c0784ea9c33/src/transformers/trainer.py#L2666), only the main process enters the if statement, and the `dist.barrier` causes it to wait forever for process 1, which doesn't have a `best_model_checkpoint`.
In 4.48.3 the code didn't check for existence before assigning the best model checkpoint; the refactor changed how this works. I can't get it to fail on a non-cube example yet. https://github.com/huggingface/transformers/blob/298b3f19303294293f7af075609481d64cb13de3/src/transformers/trainer.py#L3190
Note: I didn't use `self.accelerator.wait_for_everyone()` since it doesn't seem to cover the sagemaker case, but I could make a wait function in the trainer since it is used twice now.
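The ordering this enforces can be illustrated with a `threading.Barrier` standing in for the distributed barrier (in the Trainer the real mechanism is `torch.distributed`; this is only an analogy, and all names below are illustrative):

```python
import os
import tempfile
import threading

NUM_RANKS = 2
barrier = threading.Barrier(NUM_RANKS)
seen = {}


def worker(rank, ckpt_path):
    if rank == 0:
        # the "main process" saves the checkpoint
        with open(ckpt_path, "w") as f:
            f.write("weights")
    # every rank waits here until the save above has completed,
    # mirroring the barrier added inside _save_checkpoint
    barrier.wait()
    # only now is it safe for non-main ranks to check for the checkpoint
    seen[rank] = os.path.exists(ckpt_path)


with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "best-checkpoint")
    threads = [threading.Thread(target=worker, args=(r, path)) for r in range(NUM_RANKS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

print(seen)  # every rank observes the checkpoint: {0: True, 1: True}
```

Without the `barrier.wait()`, rank 1 can race ahead of rank 0's write and see `False`, which is the analogue of the null `best_model_checkpoint` described above.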
I believe this fixes #38008
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
@zach-huggingface, @SunMarc and @qgallouedec
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40923/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40923/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40922
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40922/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40922/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40922/events
|
https://github.com/huggingface/transformers/pull/40922
| 3,423,853,331
|
PR_kwDOCUB6oc6o8ocn
| 40,922
|
[DOC] Add missing dates in model cards
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T21:50:07
| 2025-09-17T15:17:06
| 2025-09-17T15:17:06
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40922",
"html_url": "https://github.com/huggingface/transformers/pull/40922",
"diff_url": "https://github.com/huggingface/transformers/pull/40922.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40922.patch",
"merged_at": "2025-09-17T15:17:06"
}
|
Cc @stevhliu ;)
I'm building a space to display a visual timeline of model releases in Transformers; happy to discuss this more, and how we could integrate it into the docs, once I have something working!
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40922/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40922/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40921
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40921/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40921/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40921/events
|
https://github.com/huggingface/transformers/pull/40921
| 3,423,827,302
|
PR_kwDOCUB6oc6o8iz0
| 40,921
|
Add FlexOlmo model
|
{
"login": "2015aroras",
"id": 19700980,
"node_id": "MDQ6VXNlcjE5NzAwOTgw",
"avatar_url": "https://avatars.githubusercontent.com/u/19700980?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/2015aroras",
"html_url": "https://github.com/2015aroras",
"followers_url": "https://api.github.com/users/2015aroras/followers",
"following_url": "https://api.github.com/users/2015aroras/following{/other_user}",
"gists_url": "https://api.github.com/users/2015aroras/gists{/gist_id}",
"starred_url": "https://api.github.com/users/2015aroras/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/2015aroras/subscriptions",
"organizations_url": "https://api.github.com/users/2015aroras/orgs",
"repos_url": "https://api.github.com/users/2015aroras/repos",
"events_url": "https://api.github.com/users/2015aroras/events{/privacy}",
"received_events_url": "https://api.github.com/users/2015aroras/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T21:38:13
| 2025-09-18T17:16:41
| 2025-09-18T09:04:06
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40921",
"html_url": "https://github.com/huggingface/transformers/pull/40921",
"diff_url": "https://github.com/huggingface/transformers/pull/40921.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40921.patch",
"merged_at": "2025-09-18T09:04:06"
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
This PR adds the implementation for the FlexOlmo models. The main architectural differences from OlmoE are:
- Norm is applied after attention/feedforward rather than before.
There are some extra changes in the modular file like using Olmo2's RMSNorm instead of OlmoE's so that lower precision is handled more faithfully. The modular file contains comments about such changes.
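As a rough sketch, the two residual orderings differ only in where the norm sits relative to the sublayer (`sublayer` and `norm` below are arbitrary placeholders, not the real modules; the exact placement in FlexOlmo should be checked against the modular file):

```python
def pre_norm_block(x, sublayer, norm):
    # OlmoE-style: normalize the input, then apply the sublayer
    return x + sublayer(norm(x))


def post_norm_block(x, sublayer, norm):
    # FlexOlmo-style: apply the sublayer, then normalize its output
    return x + norm(sublayer(x))
```

With any non-commuting `sublayer` and `norm`, the two blocks produce different outputs for the same input, which is the architectural distinction the bullet above describes.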
<!-- Remove if not applicable -->
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
@ArthurZucker
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40921/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40921/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40920
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40920/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40920/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40920/events
|
https://github.com/huggingface/transformers/pull/40920
| 3,423,773,212
|
PR_kwDOCUB6oc6o8XWM
| 40,920
|
Fix AttributeError: add num_hidden_layers property to T5GemmaConfig
|
{
"login": "avchauzov",
"id": 21357563,
"node_id": "MDQ6VXNlcjIxMzU3NTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/21357563?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/avchauzov",
"html_url": "https://github.com/avchauzov",
"followers_url": "https://api.github.com/users/avchauzov/followers",
"following_url": "https://api.github.com/users/avchauzov/following{/other_user}",
"gists_url": "https://api.github.com/users/avchauzov/gists{/gist_id}",
"starred_url": "https://api.github.com/users/avchauzov/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/avchauzov/subscriptions",
"organizations_url": "https://api.github.com/users/avchauzov/orgs",
"repos_url": "https://api.github.com/users/avchauzov/repos",
"events_url": "https://api.github.com/users/avchauzov/events{/privacy}",
"received_events_url": "https://api.github.com/users/avchauzov/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T21:14:28
| 2025-09-21T00:46:19
| 2025-09-21T00:46:00
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40920",
"html_url": "https://github.com/huggingface/transformers/pull/40920",
"diff_url": "https://github.com/huggingface/transformers/pull/40920.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40920.patch",
"merged_at": null
}
|
# What does this PR do?
This PR fixes an `AttributeError` that occurs when using T5Gemma models with `Seq2SeqTrainer`. The issue arises because `DynamicCache` in `cache_utils.py` expects `config.num_hidden_layers` to exist, but `T5GemmaConfig` doesn't have this attribute.
**Changes:**
- Added `num_hidden_layers` property to `T5GemmaConfig` that returns `encoder.num_hidden_layers`
- Added test to verify the property works correctly and is read-only
**Problem:** When training T5Gemma models with `Seq2SeqTrainer`, the code fails with:
`AttributeError: 'T5GemmaConfig' object has no attribute 'num_hidden_layers'`
**Solution:** Follow the pattern used by other encoder-decoder models (ProphetNet, Funnel) by adding a computed property that returns the encoder's layer count.
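The delegation pattern above can be sketched in isolation. This is a minimal illustration with hypothetical class names (not the actual transformers source): a composed encoder-decoder config exposing the encoder's layer count through a read-only property, so downstream code that expects `config.num_hidden_layers` keeps working.

```python
from dataclasses import dataclass


@dataclass
class EncoderConfig:
    # Stand-in for the nested encoder sub-config.
    num_hidden_layers: int = 12


class ComposedSeq2SeqConfig:
    def __init__(self, encoder: EncoderConfig):
        self.encoder = encoder

    @property
    def num_hidden_layers(self) -> int:
        # Delegate to the encoder, mirroring ProphetNet/Funnel-style configs.
        # A property without a setter is read-only: assignment raises
        # AttributeError, so the two values cannot drift apart.
        return self.encoder.num_hidden_layers


config = ComposedSeq2SeqConfig(EncoderConfig(num_hidden_layers=8))
print(config.num_hidden_layers)  # 8
```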
Fixes #40901
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
@ArthurZucker @SunMarc
(T5Gemma model configuration and Seq2SeqTrainer integration)
|
{
"login": "avchauzov",
"id": 21357563,
"node_id": "MDQ6VXNlcjIxMzU3NTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/21357563?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/avchauzov",
"html_url": "https://github.com/avchauzov",
"followers_url": "https://api.github.com/users/avchauzov/followers",
"following_url": "https://api.github.com/users/avchauzov/following{/other_user}",
"gists_url": "https://api.github.com/users/avchauzov/gists{/gist_id}",
"starred_url": "https://api.github.com/users/avchauzov/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/avchauzov/subscriptions",
"organizations_url": "https://api.github.com/users/avchauzov/orgs",
"repos_url": "https://api.github.com/users/avchauzov/repos",
"events_url": "https://api.github.com/users/avchauzov/events{/privacy}",
"received_events_url": "https://api.github.com/users/avchauzov/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40920/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40920/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40919
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40919/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40919/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40919/events
|
https://github.com/huggingface/transformers/pull/40919
| 3,423,749,968
|
PR_kwDOCUB6oc6o8SOA
| 40,919
|
Standardize audio/vision embedding function name for multimodal models
|
{
"login": "jackzhxng",
"id": 32371937,
"node_id": "MDQ6VXNlcjMyMzcxOTM3",
"avatar_url": "https://avatars.githubusercontent.com/u/32371937?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jackzhxng",
"html_url": "https://github.com/jackzhxng",
"followers_url": "https://api.github.com/users/jackzhxng/followers",
"following_url": "https://api.github.com/users/jackzhxng/following{/other_user}",
"gists_url": "https://api.github.com/users/jackzhxng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jackzhxng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jackzhxng/subscriptions",
"organizations_url": "https://api.github.com/users/jackzhxng/orgs",
"repos_url": "https://api.github.com/users/jackzhxng/repos",
"events_url": "https://api.github.com/users/jackzhxng/events{/privacy}",
"received_events_url": "https://api.github.com/users/jackzhxng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T21:06:20
| 2025-10-08T20:24:39
| 2025-09-18T08:45:04
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40919",
"html_url": "https://github.com/huggingface/transformers/pull/40919",
"diff_url": "https://github.com/huggingface/transformers/pull/40919.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40919.patch",
"merged_at": "2025-09-18T08:45:04"
}
|
# What does this PR do?
Make all multimodal models share the same function name for audio and vision. This function should encapsulate all of the encoder module code up to the fusion with the prompt embeddings when doing early fusion.
This makes it so that we can rely on this function name downstream:
- Audio: https://github.com/huggingface/optimum-executorch/blob/main/optimum/exporters/executorch/integrations.py#L132 (currently it is `get_audio_embeds`, but we will change it to `get_audio_features` after this PR has landed).
- Vision: https://github.com/huggingface/optimum-executorch/blob/main/optimum/exporters/executorch/integrations.py#L78
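The contract this standardization gives downstream exporters can be illustrated with a toy example (toy types and bodies, not real transformers signatures): every early-fusion model exposes `get_image_features` / `get_audio_features` covering the encoder path up to where modality embeddings are fused with the prompt embeddings, so generic code can target the shared names.

```python
from typing import List, Protocol


class EarlyFusionModel(Protocol):
    # The shared surface downstream code relies on.
    def get_image_features(self, pixel_values: List[float]) -> List[float]: ...
    def get_audio_features(self, input_features: List[float]) -> List[float]: ...


class ToyModel:
    def get_image_features(self, pixel_values: List[float]) -> List[float]:
        # Stand-in for the vision tower + multimodal projector.
        return [v * 2.0 for v in pixel_values]

    def get_audio_features(self, input_features: List[float]) -> List[float]:
        # Stand-in for the audio encoder + multimodal projector.
        return [v + 1.0 for v in input_features]


def export_embeddings(model: EarlyFusionModel, pixels, audio):
    # A generic exporter only needs the protocol, not the concrete model.
    return model.get_image_features(pixels), model.get_audio_features(audio)


print(export_embeddings(ToyModel(), [1.0], [2.0]))  # ([2.0], [3.0])
```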
All audio models now follow the `get_audio_features` API and early fusion pattern:
- [Granite Speech](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/granite_speech/modeling_granite_speech.py#L349)
- Voxtral (changed in this PR)
- [Qwen2.5 Omni](https://github.com/huggingface/transformers/blob/main/src/transformers/models/qwen2_5_omni/modeling_qwen2_5_omni.py#L1720)
- [Gemma 3n](https://github.com/huggingface/transformers/blob/main/src/transformers/models/gemma3n/modular_gemma3n.py#L2457)
Most vision models already follow the `get_image_features` API and early fusion pattern (vision models taken from [this](https://github.com/huggingface/transformers/blob/main/src/transformers/models/auto/modeling_auto.py#L997) list). The few that don't are either unrelated / cross attention-based:
- [Aria](https://github.com/huggingface/transformers/blob/main/src/transformers/models/aria/modular_aria.py#L1356)
- [Aya](https://github.com/huggingface/transformers/blob/main/src/transformers/models/aya_vision/modular_aya_vision.py#L110)
- Blip - N/A, not image-text-to-text
- [Blip2](https://github.com/huggingface/transformers/blob/main/src/transformers/models/blip_2/modeling_blip_2.py#L1739)
- [Chameleon](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/chameleon/modeling_chameleon.py#L883)
- [Cohere2 Vision](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/cohere2_vision/modular_cohere2_vision.py#L164)
- [Deepseek VL](https://github.com/huggingface/transformers/blob/main/src/transformers/models/janus/modular_janus.py#L904C9-L904C27) (derived from Janus)
- Deepseek VL Hybrid - same as Deepseek VL
- [Emu3](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/emu3/modular_emu3.py#L933)
- Evolla - N/A, for proteins (?)
- [Florence2](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/florence2/modular_florence2.py#L1536) (this does image features -> seq2seq instead of decoder)
- [Fuyu](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/fuyu/modeling_fuyu.py#L136)
- [Gemma3](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/gemma3/modular_gemma3.py#L763) - already enabled on Optimum ET
- [Gemma3n](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/gemma3n/modular_gemma3n.py#L2250)
- Git - not image-text-to-text
- [GLM 4.1V](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/glm4v/modeling_glm4v.py#L1121)
- GLM 4.1V MOE - same as GLM 4.1V
- [GOT-OCR2](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/got_ocr2/modular_got_ocr2.py#L308)
- ❌ IDEFICS - cross attention
- [IDEFICS 2](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/idefics2/modeling_idefics2.py#L860)
- [IDEFICS 3](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/idefics3/modeling_idefics3.py#L610)
- [InstructBLIP](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/instructblip/modeling_instructblip.py#L1263)
- [InternVL](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/internvl/modular_internvl.py#L493)
- ❌ KOSMOS-2 - this one has [`get_image_features`](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/kosmos2/modeling_kosmos2.py#L1518) but after that it's structured a bit differently, needing to pass in a `image_embeds_position_mask`, will need some work to export
- ❌ KOSMOS-2.5 - same as above
- [Llama4](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/llama4/modeling_llama4.py#L1173)
- [LLaVA](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/llava/modeling_llava.py#L154)
- [LLaVA-NeXT](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/llava_next/modeling_llava_next.py#L349)
- [LLaVA-NeXT-Video](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/llava_next_video/modeling_llava_next_video.py#L401)
- [LLaVA-OneVision](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/llava_next_video/modeling_llava_next_video.py#L401)
- [Mistral 3](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/mistral3/modular_mistral3.py#L121)
- ❌ Mllama (Llama 3.2-Vision) - cross attention
- [Ovis2](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/ovis2/modular_ovis2.py#L228)
- [PaliGemma](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/paligemma/modeling_paligemma.py#L232)
- [PerceptionLM](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/perception_lm/modular_perception_lm.py#L149)
- Pix2Struct - not image-text-to-text
- Pixtral - same as LLaVA
- [Qwen2.5-VL](https://github.com/huggingface/transformers/blob/main/src/transformers/models/qwen2_vl/modeling_qwen2_vl.py#L1093)
- [Qwen2-VL](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/qwen2_vl/modeling_qwen2_vl.py#L1093)
- [Qwen3-VL](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/qwen3_vl/modeling_qwen3_vl.py#L1050)
- Qwen3-VL MOE - same as Qwen3-VL
- [SmolVLM](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/smolvlm/modular_smolvlm.py#L198)
- ❌ UDOP
- [ViP-LLaVA](https://github.com/huggingface/transformers/blob/6e50a8afb2540ac1acaa4b62cf1dd5f1170f6511/src/transformers/models/vipllava/modular_vipllava.py#L75)
- ❌ VisionEncoderDecoderModel - not sure what this model is, but it's cross attention-based
Following this PR, we would like to move [`MultimodalTextToTextExportableModule`](https://github.com/huggingface/optimum-executorch/blob/main/optimum/exporters/executorch/integrations.py#L136) from Optimum ET to [`transformers/integrations/executorch.py`](https://github.com/huggingface/transformers/blob/main/src/transformers/integrations/executorch.py) and add multimodal export tests marked with `@pytest.mark.torch_export_test` for each multimodal model.
cc @ArthurZucker @zucchini-nlp
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40919/reactions",
"total_count": 3,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 3,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40919/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40918
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40918/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40918/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40918/events
|
https://github.com/huggingface/transformers/pull/40918
| 3,423,196,190
|
PR_kwDOCUB6oc6o6aKP
| 40,918
|
Add Model Card for `GptOss`
|
{
"login": "ParagEkbote",
"id": 69567729,
"node_id": "MDQ6VXNlcjY5NTY3NzI5",
"avatar_url": "https://avatars.githubusercontent.com/u/69567729?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParagEkbote",
"html_url": "https://github.com/ParagEkbote",
"followers_url": "https://api.github.com/users/ParagEkbote/followers",
"following_url": "https://api.github.com/users/ParagEkbote/following{/other_user}",
"gists_url": "https://api.github.com/users/ParagEkbote/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParagEkbote/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParagEkbote/subscriptions",
"organizations_url": "https://api.github.com/users/ParagEkbote/orgs",
"repos_url": "https://api.github.com/users/ParagEkbote/repos",
"events_url": "https://api.github.com/users/ParagEkbote/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParagEkbote/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T17:46:51
| 2025-09-18T17:24:37
| 2025-09-18T17:24:37
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40918",
"html_url": "https://github.com/huggingface/transformers/pull/40918",
"diff_url": "https://github.com/huggingface/transformers/pull/40918.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40918.patch",
"merged_at": null
}
|
# What does this PR do?
Following the structure described in #36979, I have updated the model card for `GptOss`. I've not added a quantization example due to its complexity; feel free to suggest one. Could you please review?
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
## Who can review?
@stevhliu
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40918/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40918/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40917
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40917/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40917/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40917/events
|
https://github.com/huggingface/transformers/pull/40917
| 3,423,134,075
|
PR_kwDOCUB6oc6o6M6L
| 40,917
|
🚨 [generate] update paligemma mask updates (and other assisted generation-related fixes)
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T17:22:43
| 2025-10-03T09:33:59
| 2025-09-23T16:20:00
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40917",
"html_url": "https://github.com/huggingface/transformers/pull/40917",
"diff_url": "https://github.com/huggingface/transformers/pull/40917.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40917.patch",
"merged_at": "2025-09-23T16:20:00"
}
|
# What does this PR do?
🚨 BC-breaking: `paligemma` processor now returns `token_type_ids` by default. This is required to disambiguate forward passes, due to the bidirectional attention mask in the prompt. Advanced generation methods may run forward passes with prompt + generated tokens, so they will fail without `token_type_ids`.
__________________
This PR is originally aimed at fixing [two flaky tests](https://app.circleci.com/insights/github/huggingface/transformers/workflows/run_tests/tests?branch=pull/40887):
- `imageGPT` + test_prompt_lookup_decoding_matches_greedy_search -> skip the test, imageGPT has dodgy layer initialization. This is better documented in the skip;
- ⚠️ `paligemma2` + test_prompt_lookup_decoding_matches_greedy_search -> upstreams attention mask creation from gemma3 to paligemma, since their masking strategy is the same. This also improves standardization, as we got rid of some legacy code 💛 Fixing this actually required a cascade of changes (changes in `gemma` for `paligemma` -> `gemma`-dependent models also needed updates)
_________________
✅ slow paligemma tests passing
✅ slow paligemma2 tests passing (but there are no integration tests ⚠️ )
✅ no regressions on slow gemma tests (i.e. some failures, same as in `main`)
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40917/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40917/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40916
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40916/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40916/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40916/events
|
https://github.com/huggingface/transformers/pull/40916
| 3,422,997,726
|
PR_kwDOCUB6oc6o5vUQ
| 40,916
|
Remove unused arguments
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T16:34:21
| 2025-09-23T11:41:44
| 2025-09-23T11:40:51
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40916",
"html_url": "https://github.com/huggingface/transformers/pull/40916",
"diff_url": "https://github.com/huggingface/transformers/pull/40916.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40916.patch",
"merged_at": "2025-09-23T11:40:51"
}
|
# What does this PR do?
Remove unused arguments.
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40916/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40916/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40915
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40915/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40915/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40915/events
|
https://github.com/huggingface/transformers/issues/40915
| 3,422,963,061
|
I_kwDOCUB6oc7MBkV1
| 40,915
|
HfArgumentParser does not support peft.LoraConfig
|
{
"login": "romitjain",
"id": 11757603,
"node_id": "MDQ6VXNlcjExNzU3NjAz",
"avatar_url": "https://avatars.githubusercontent.com/u/11757603?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/romitjain",
"html_url": "https://github.com/romitjain",
"followers_url": "https://api.github.com/users/romitjain/followers",
"following_url": "https://api.github.com/users/romitjain/following{/other_user}",
"gists_url": "https://api.github.com/users/romitjain/gists{/gist_id}",
"starred_url": "https://api.github.com/users/romitjain/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/romitjain/subscriptions",
"organizations_url": "https://api.github.com/users/romitjain/orgs",
"repos_url": "https://api.github.com/users/romitjain/repos",
"events_url": "https://api.github.com/users/romitjain/events{/privacy}",
"received_events_url": "https://api.github.com/users/romitjain/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T16:23:56
| 2025-09-23T05:16:14
| 2025-09-23T05:16:14
|
CONTRIBUTOR
| null | null | null | null |
### System Info
- `transformers` version: 4.57.0.dev0
- Platform: Linux-5.14.0-284.73.1.el9_2.x86_64-x86_64-with-glibc2.39
- Python version: 3.12.3
- Huggingface_hub version: 0.34.4
- Safetensors version: 0.5.2
- Accelerate version: 1.10.1
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.8.0+cu128 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: No
- Using GPU in script?: No
- GPU type: NVIDIA A100-SXM4-80GB
### Who can help?
@ydshieh (I am not really sure who to tag here)
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```python
from peft import LoraConfig # v0.17.1
from transformers import HfArgumentParser # Built from source
p = HfArgumentParser(dataclass_types=LoraConfig) # fails
```
### Expected behavior
I would expect LoraConfig to be supported by HfArgumentParser.
As I understand it, this fails because HfArgumentParser does not support fields of type `Optional[Union[List[str], str]]`.
Is there a plan to support such fields?
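Until such fields are supported, one hypothetical workaround (not an HfArgumentParser feature) is to accept the ambiguous field as a plain comma-separated string and coerce it into the union type afterwards; the sketch below uses stdlib `argparse` and a dataclass standing in for the relevant slice of `LoraConfig`.

```python
import argparse
from dataclasses import dataclass
from typing import List, Optional, Union


@dataclass
class LoraArgsSubset:
    # Stand-in for the problematic LoraConfig field.
    target_modules: Optional[Union[List[str], str]] = None


def coerce_target_modules(raw: Optional[str]) -> Optional[Union[List[str], str]]:
    # Comma-separated values become a list; a single value stays a string.
    if raw is None:
        return None
    parts = [p.strip() for p in raw.split(",") if p.strip()]
    return parts if len(parts) > 1 else parts[0]


parser = argparse.ArgumentParser()
parser.add_argument("--target_modules", type=str, default=None)
ns = parser.parse_args(["--target_modules", "q_proj,v_proj"])
args = LoraArgsSubset(target_modules=coerce_target_modules(ns.target_modules))
print(args.target_modules)  # ['q_proj', 'v_proj']
```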
|
{
"login": "romitjain",
"id": 11757603,
"node_id": "MDQ6VXNlcjExNzU3NjAz",
"avatar_url": "https://avatars.githubusercontent.com/u/11757603?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/romitjain",
"html_url": "https://github.com/romitjain",
"followers_url": "https://api.github.com/users/romitjain/followers",
"following_url": "https://api.github.com/users/romitjain/following{/other_user}",
"gists_url": "https://api.github.com/users/romitjain/gists{/gist_id}",
"starred_url": "https://api.github.com/users/romitjain/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/romitjain/subscriptions",
"organizations_url": "https://api.github.com/users/romitjain/orgs",
"repos_url": "https://api.github.com/users/romitjain/repos",
"events_url": "https://api.github.com/users/romitjain/events{/privacy}",
"received_events_url": "https://api.github.com/users/romitjain/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40915/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40915/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40914
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40914/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40914/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40914/events
|
https://github.com/huggingface/transformers/pull/40914
| 3,422,870,136
|
PR_kwDOCUB6oc6o5TQ_
| 40,914
|
Add support for Florence-2 training
|
{
"login": "ducviet00",
"id": 24910916,
"node_id": "MDQ6VXNlcjI0OTEwOTE2",
"avatar_url": "https://avatars.githubusercontent.com/u/24910916?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ducviet00",
"html_url": "https://github.com/ducviet00",
"followers_url": "https://api.github.com/users/ducviet00/followers",
"following_url": "https://api.github.com/users/ducviet00/following{/other_user}",
"gists_url": "https://api.github.com/users/ducviet00/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ducviet00/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ducviet00/subscriptions",
"organizations_url": "https://api.github.com/users/ducviet00/orgs",
"repos_url": "https://api.github.com/users/ducviet00/repos",
"events_url": "https://api.github.com/users/ducviet00/events{/privacy}",
"received_events_url": "https://api.github.com/users/ducviet00/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T15:57:23
| 2025-09-17T11:49:57
| 2025-09-17T11:49:57
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40914",
"html_url": "https://github.com/huggingface/transformers/pull/40914",
"diff_url": "https://github.com/huggingface/transformers/pull/40914.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40914.patch",
"merged_at": "2025-09-17T11:49:57"
}
|
# What does this PR do?
This PR adds support for Florence-2 training.
Without shifting tokens to the right, the model cannot compute the forward loss correctly because the decoder input IDs are not generated from the labels.
In PR https://github.com/huggingface/transformers/pull/38188, I assumed this was handled by the language model (BART), but it was not.
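The fix follows the usual seq2seq pattern of deriving decoder input IDs from the labels. A minimal list-based sketch of that pattern (the real implementation operates on tensors; the token IDs below are illustrative):

```python
def shift_tokens_right(labels, pad_token_id, decoder_start_token_id):
    """Prepend the decoder start token and drop the last label token."""
    shifted = [[decoder_start_token_id] + row[:-1] for row in labels]
    # -100 marks ignored label positions; replace it with the pad token
    # so the decoder never sees an invalid ID.
    return [
        [pad_token_id if tok == -100 else tok for tok in row] for row in shifted
    ]

labels = [[5, 6, 7, 2], [8, 9, 2, -100]]
decoder_input_ids = shift_tokens_right(labels, pad_token_id=1, decoder_start_token_id=0)
print(decoder_input_ids)  # [[0, 5, 6, 7], [0, 8, 9, 2]]
```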
## Who can review?
@zucchini-nlp @Cyrilvallez @SunMarc
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40914/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40914/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40913
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40913/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40913/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40913/events
|
https://github.com/huggingface/transformers/issues/40913
| 3,422,410,750
|
I_kwDOCUB6oc7L_df-
| 40,913
|
Setting chat_template when creating a processor does not change the chat template
|
{
"login": "NohTow",
"id": 38869395,
"node_id": "MDQ6VXNlcjM4ODY5Mzk1",
"avatar_url": "https://avatars.githubusercontent.com/u/38869395?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NohTow",
"html_url": "https://github.com/NohTow",
"followers_url": "https://api.github.com/users/NohTow/followers",
"following_url": "https://api.github.com/users/NohTow/following{/other_user}",
"gists_url": "https://api.github.com/users/NohTow/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NohTow/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NohTow/subscriptions",
"organizations_url": "https://api.github.com/users/NohTow/orgs",
"repos_url": "https://api.github.com/users/NohTow/repos",
"events_url": "https://api.github.com/users/NohTow/events{/privacy}",
"received_events_url": "https://api.github.com/users/NohTow/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T13:53:39
| 2025-10-26T08:02:22
| 2025-10-26T08:02:22
|
CONTRIBUTOR
| null | null | null | null |
### System Info
Hello,
With `transformers == 4.55.3` (and below, it seems), passing `chat_template` as an argument when creating the processor does not seem to change the chat template accordingly.
A few lines to reproduce:
```python
from transformers import AutoProcessor
processor = AutoProcessor.from_pretrained("Qwen/Qwen2.5-VL-3B-Instruct", chat_template="test")
print(processor.chat_template)
# Print the default chat template
```
When tracking the root cause, it seems that this line is [overriding the user args by what's resolved](https://github.com/huggingface/transformers/blob/88ba0f107eb8c96b34bbe664f17f07ce0c8c57b5/src/transformers/processing_utils.py#L1086) (kwargs["chat_template"] is correct until this line).
It is a bit odd, because I tried to dig into the past and it seems to have been like that for a while.
Am I missing something? I dug as much as I could on my own, but I do not really understand why the user-supplied template ends up being overridden.
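For illustration, a stripped-down sketch of the clobbering pattern described above. The function and variable names here are made up for the example, not the actual `processing_utils` code:

```python
def build_processor_kwargs(resolved_processor_dict, **user_kwargs):
    """Sketch: the resolved dict unconditionally wins over the user kwarg."""
    kwargs = dict(user_kwargs)
    # The problematic pattern: taking the resolved value without checking
    # whether the user already supplied one discards the user's template.
    if "chat_template" in resolved_processor_dict:
        kwargs["chat_template"] = resolved_processor_dict["chat_template"]
    return kwargs

result = build_processor_kwargs({"chat_template": "default"}, chat_template="test")
print(result["chat_template"])  # "default" -- the user value "test" is lost
```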
### Who can help?
@Rocketknight1 @zucchini-nlp
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Install transformers
Run
```python
from transformers import AutoProcessor
processor = AutoProcessor.from_pretrained("Qwen/Qwen2.5-VL-3B-Instruct", chat_template="test")
print(processor.chat_template)
# Print the default chat template
```
### Expected behavior
The processor.chat_template should be set to the one passed by the user when creating the processor
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40913/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/huggingface/transformers/issues/40913/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40912
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40912/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40912/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40912/events
|
https://github.com/huggingface/transformers/pull/40912
| 3,422,322,633
|
PR_kwDOCUB6oc6o3cfB
| 40,912
|
Fix dtype in Paligemma
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T13:31:08
| 2025-09-16T20:24:03
| 2025-09-16T16:07:56
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40912",
"html_url": "https://github.com/huggingface/transformers/pull/40912",
"diff_url": "https://github.com/huggingface/transformers/pull/40912.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40912.patch",
"merged_at": "2025-09-16T16:07:56"
}
|
# What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/40875 and makes sure we have the same dtype before operations. The attention mask is used by the LM and has to have the same dtype. We also need to cast the VLM outputs, because the VLM and the projection can have different dtypes in their configs.
I still wonder whether we're supposed to load the models with auto dtype when doing `from_pretrained`; I found that this behavior changed between 4.53 and 4.54.
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40912/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40912/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40911
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40911/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40911/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40911/events
|
https://github.com/huggingface/transformers/pull/40911
| 3,422,052,093
|
PR_kwDOCUB6oc6o2ht_
| 40,911
|
ENH: Enable readline support for transformers chat
|
{
"login": "BenjaminBossan",
"id": 6229650,
"node_id": "MDQ6VXNlcjYyMjk2NTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/6229650?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BenjaminBossan",
"html_url": "https://github.com/BenjaminBossan",
"followers_url": "https://api.github.com/users/BenjaminBossan/followers",
"following_url": "https://api.github.com/users/BenjaminBossan/following{/other_user}",
"gists_url": "https://api.github.com/users/BenjaminBossan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BenjaminBossan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BenjaminBossan/subscriptions",
"organizations_url": "https://api.github.com/users/BenjaminBossan/orgs",
"repos_url": "https://api.github.com/users/BenjaminBossan/repos",
"events_url": "https://api.github.com/users/BenjaminBossan/events{/privacy}",
"received_events_url": "https://api.github.com/users/BenjaminBossan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T12:20:41
| 2025-09-19T09:39:22
| 2025-09-19T09:39:22
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40911",
"html_url": "https://github.com/huggingface/transformers/pull/40911",
"diff_url": "https://github.com/huggingface/transformers/pull/40911.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40911.patch",
"merged_at": "2025-09-19T09:39:22"
}
|
# What does this PR do?
This small change enables GNU readline support for the `transformers chat` command. This includes:
- advanced navigation and editing: `ctrl + a` `ctrl + e` `alt + b` `alt + f` `ctrl + k` `alt + d` etc.
- navigate and search history: `↑` `↓` `ctrl + p` `ctrl + n` `ctrl + r`
- undo: `ctrl + _`
- clear screen: `ctrl + l`
## Implementation
Although it may look strange, just [importing readline is enough to enable it in Python](https://docs.python.org/3/library/functions.html#input).
As readline is [not available on some platforms](https://docs.python.org/3/library/readline.html), the import is guarded.
Readline should work on Linux, macOS, and WSL; I'm not sure about Windows, though. Ideally, someone can give it a try. It's possible that Windows users would have to install [pyreadline3](https://pypi.org/project/pyreadline3/).
I checked the existing tests, but I think there is no easy way to actually test this functionality. As it relies on basic Python behavior, I think we can live without further tests.
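The guard itself is essentially a one-liner; roughly (simplified from the actual change):

```python
try:
    # Merely importing readline enables line editing and history for input().
    import readline  # noqa: F401
    READLINE_AVAILABLE = True
except ImportError:
    # readline is unavailable on some platforms (e.g. vanilla Windows);
    # input() then falls back to its plain behavior.
    READLINE_AVAILABLE = False
```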
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? => discussed internally
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40911/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40911/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40910
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40910/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40910/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40910/events
|
https://github.com/huggingface/transformers/issues/40910
| 3,421,942,661
|
I_kwDOCUB6oc7L9rOF
| 40,910
|
Gemma-3: prepare_inputs_for_generation should forward pixel_values based on image token presence, not cache_position==0
|
{
"login": "Simone999",
"id": 29517129,
"node_id": "MDQ6VXNlcjI5NTE3MTI5",
"avatar_url": "https://avatars.githubusercontent.com/u/29517129?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Simone999",
"html_url": "https://github.com/Simone999",
"followers_url": "https://api.github.com/users/Simone999/followers",
"following_url": "https://api.github.com/users/Simone999/following{/other_user}",
"gists_url": "https://api.github.com/users/Simone999/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Simone999/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Simone999/subscriptions",
"organizations_url": "https://api.github.com/users/Simone999/orgs",
"repos_url": "https://api.github.com/users/Simone999/repos",
"events_url": "https://api.github.com/users/Simone999/events{/privacy}",
"received_events_url": "https://api.github.com/users/Simone999/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2796628563,
"node_id": "MDU6TGFiZWwyNzk2NjI4NTYz",
"url": "https://api.github.com/repos/huggingface/transformers/labels/WIP",
"name": "WIP",
"color": "234C99",
"default": false,
"description": "Label your PR/Issue with WIP for some long outstanding Issues/PRs that are work in progress"
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
open
| false
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null |
[] | 2025-09-16T11:52:40
| 2025-10-17T08:03:41
| null |
NONE
| null | null | null | null |
### System Info
- `transformers` version: 4.56.1
- Platform: Windows-11-10.0.26100-SP0
- Python version: 3.12.11
- Huggingface_hub version: 0.34.4
- Safetensors version: 0.6.2
- Accelerate version: 1.10.1
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.8.0+cu129 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
- Using GPU in script?: yes
- GPU type: NVIDIA GeForce RTX 3060 Laptop GPU
### Who can help?
@zucchini-nlp
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
[Prefix caching](https://huggingface.co/docs/transformers/kv_cache#prefill-a-cache-prefix-caching) with models that use `Gemma3ForConditionalGeneration` is impossible when the non-cached prompt has an image due to code in `src/transformers/models/gemma3/modeling_gemma3.py:1175`
```python
# If we're in cached decoding stage, pixel values should be None because input ids do not contain special image token anymore
# Otherwise we need pixel values to be passed to model. NOTE: use_cache=False needs pixel_values always
if cache_position[0] == 0:
model_inputs["pixel_values"] = pixel_values
```
Step to reproduce:
```python
import copy
from typing import Any, Dict, List
import torch
from transformers import AutoModelForImageTextToText, AutoProcessor
model_id = "google/medgemma-4b-it"
model = AutoModelForImageTextToText.from_pretrained(
model_id, dtype=torch.bfloat16, device_map="auto"
)
processor = AutoProcessor.from_pretrained(model_id)
# Generate prefix cache
system_instruction = "You are a helpful assistant.\n\nDescribe the following image."
conversation = [
{"role": "system", "content": [{"type": "text", "text": system_instruction}]},
{"role": "user", "content": ""},
]
tokens = processor.apply_chat_template(
conversation, add_generation_prompt=False, tokenize=True, return_tensors="pt"
)
eot_id = model.config.eos_token_id[-1]
eot_pos = torch.nonzero(tokens == eot_id).max()
initial_prompt = tokens[:, :eot_pos]
with torch.no_grad():
prompt_cache = model(input_ids=initial_prompt.to(model.device), use_cache=True).past_key_values
# Inference with PREFIX + USER PROMPT with image
def build_conversation(system_instruction: str, url: str) -> List[Dict[str, Any]]:
"""Builds chat messages with a system instruction and a single image."""
return [
{"role": "system", "content": [{"type": "text", "text": system_instruction}]},
{"role": "user", "content": [{"type": "image", "url": url}]},
]
image_url = "http://images.cocodataset.org/val2017/000000039769.jpg"
conversation = build_conversation(system_instruction, image_url)
inputs = processor.apply_chat_template(
conversation,
add_generation_prompt=True,
tokenize=True,
return_dict=True,
return_tensors="pt",
)
with torch.inference_mode():
batch_inputs = inputs.to(model.device, dtype=model.dtype)
past_key_values = copy.deepcopy(prompt_cache)
generation = model.generate(
**batch_inputs,
past_key_values=past_key_values,
max_new_tokens=100,
do_sample=False,
)
```
After caching a prefix, when we pass a user turn that includes image tokens at a non-zero `cache_position`, `prepare_inputs_for_generation` drops `pixel_values` (it only forwards them when `cache_position[0] == 0`).
### Expected behavior
* `prepare_inputs_for_generation` should **forward `pixel_values` whenever the current `input_ids` slice contains the model’s image special token(s)**, regardless of whether `cache_position[0]` is 0 or >0. In other words, `cache_position[0] == 0` is not a reliable way to tell whether we are in the decoding stage.
* This enables a common workflow: cache a **text system prompt** once, then later process a **user turn with an image** without re-encoding the system text, while still exercising the vision path.
* A minimal change would be to replace the strict `cache_position[0] == 0` gate with a check on image token presence
* If `(input_ids == config.image_token_index).any():` → pass `pixel_values`.
* Else, omit `pixel_values` during decode.
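A list-based sketch of the proposed check (the real code operates on tensors and reads the token ID from `config.image_token_index`; the ID below is an illustrative placeholder):

```python
IMAGE_TOKEN_INDEX = 262144  # illustrative placeholder, not the real config value

def should_forward_pixel_values(input_ids, image_token_index):
    """Forward pixel_values iff the current input slice contains image tokens."""
    return any(tok == image_token_index for tok in input_ids)

# Prefill of a user turn with an image at a non-zero cache_position:
with_image = should_forward_pixel_values([10, IMAGE_TOKEN_INDEX, 11], IMAGE_TOKEN_INDEX)
# Plain decode step with a single text token:
decode_step = should_forward_pixel_values([42], IMAGE_TOKEN_INDEX)
print(with_image, decode_step)  # True False
```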
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40910/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40910/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/40909
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40909/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40909/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40909/events
|
https://github.com/huggingface/transformers/pull/40909
| 3,421,794,031
|
PR_kwDOCUB6oc6o1rnB
| 40,909
|
disable `test_fast_is_faster_than_slow`
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T11:16:23
| 2025-09-16T13:34:06
| 2025-09-16T13:34:05
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40909",
"html_url": "https://github.com/huggingface/transformers/pull/40909",
"diff_url": "https://github.com/huggingface/transformers/pull/40909.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40909.patch",
"merged_at": "2025-09-16T13:34:05"
}
|
# What does this PR do?
This test causes too much trouble and consumes too much energy. As discussed offline, skip it until someone can check whether this test makes sense on CPU and with a small batch size.
Last time seeing such failure is 1 day ago:
https://app.circleci.com/pipelines/github/huggingface/transformers/146100/workflows/67352be5-2494-42cb-8613-c221afe0b6f5/jobs/1930135
> FAILED tests/models/efficientloftr/test_image_processing_efficientloftr.py::EfficientLoFTRImageProcessingTest::test_fast_is_faster_than_slow - AssertionError: 0.9246523380279541 not less than or equal to 0.8492918014526367 : Fast processor should not be significantly slower than slow processor
cc @zucchini-nlp
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40909/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40909/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40908
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40908/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40908/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40908/events
|
https://github.com/huggingface/transformers/pull/40908
| 3,421,627,219
|
PR_kwDOCUB6oc6o1LVe
| 40,908
|
Fix `load_balancing_loss_func` incompatible with `past_key_values`
|
{
"login": "tkj666",
"id": 28040169,
"node_id": "MDQ6VXNlcjI4MDQwMTY5",
"avatar_url": "https://avatars.githubusercontent.com/u/28040169?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tkj666",
"html_url": "https://github.com/tkj666",
"followers_url": "https://api.github.com/users/tkj666/followers",
"following_url": "https://api.github.com/users/tkj666/following{/other_user}",
"gists_url": "https://api.github.com/users/tkj666/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tkj666/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tkj666/subscriptions",
"organizations_url": "https://api.github.com/users/tkj666/orgs",
"repos_url": "https://api.github.com/users/tkj666/repos",
"events_url": "https://api.github.com/users/tkj666/events{/privacy}",
"received_events_url": "https://api.github.com/users/tkj666/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] |
open
| false
| null |
[] | null |
[] | 2025-09-16T10:38:52
| 2025-10-16T13:58:33
| null |
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40908",
"html_url": "https://github.com/huggingface/transformers/pull/40908",
"diff_url": "https://github.com/huggingface/transformers/pull/40908.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40908.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
Changes the way `num_hidden_layers`, `batch_size` and `sequence_length` are calculated, and slices `attention_mask`, so that the shapes of `expert_attention_mask` and `expert_mask` match, making the function compatible with inference using `past_key_values`.
<!-- Remove if not applicable -->
Fixes #30731
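As a rough illustration of the shape fix (the helper name below is illustrative, not the library's actual implementation): during cached decoding the attention mask covers past plus current tokens, while the router logits only cover the current tokens, so slicing the mask to the trailing `sequence_length` positions keeps the two aligned:

```python
import torch

def sliced_expert_mask(attention_mask: torch.Tensor, sequence_length: int) -> torch.Tensor:
    # attention_mask: (batch, past_len + sequence_length) during cached decoding.
    # Router logits only exist for the current `sequence_length` tokens, so we
    # keep only the trailing positions to match their shape.
    return attention_mask[:, -sequence_length:]

# Prefill: mask length already equals logits length (slicing is a no-op).
mask = torch.ones(2, 10, dtype=torch.long)
assert sliced_expert_mask(mask, 10).shape == (2, 10)

# Decode step with past_key_values: 10 cached tokens + 1 new token.
mask = torch.ones(2, 11, dtype=torch.long)
assert sliced_expert_mask(mask, 1).shape == (2, 1)
```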
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@ArthurZucker
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40908/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40908/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/40907
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40907/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40907/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40907/events
|
https://github.com/huggingface/transformers/pull/40907
| 3,421,590,114
|
PR_kwDOCUB6oc6o1ELt
| 40,907
|
[cache] Only use scalars in `get_mask_sizes`
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T10:30:21
| 2025-09-16T10:49:01
| 2025-09-16T10:48:59
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40907",
"html_url": "https://github.com/huggingface/transformers/pull/40907",
"diff_url": "https://github.com/huggingface/transformers/pull/40907.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40907.patch",
"merged_at": "2025-09-16T10:48:59"
}
|
# What does this PR do?
As per the title. We can rely on the scalar `self.cumulative_length` instead of the tensor `cache_position[0]`, which was introduced in https://github.com/huggingface/transformers/pull/40893. It's much better for downstream masking and compilation support.
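A minimal sketch of the idea, using an illustrative cache class rather than the real transformers API: tracking seen tokens as a Python int lets `get_mask_sizes` return plain scalars, so downstream mask construction never has to specialize on a 0-d tensor:

```python
import torch

class TinyLayerCache:
    """Illustrative cache layer tracking seen tokens as a Python int."""

    def __init__(self) -> None:
        self.cumulative_length = 0  # scalar counter, friendly to torch.compile

    def update(self, key_states: torch.Tensor) -> None:
        # key_states: (batch, heads, seq_len, head_dim)
        self.cumulative_length += key_states.shape[-2]

    def get_mask_sizes(self, query_length: int) -> tuple[int, int]:
        # Both return values are plain ints, unlike indexing cache_position[0],
        # which would yield a 0-d tensor.
        kv_length = self.cumulative_length + query_length
        return kv_length, self.cumulative_length

cache = TinyLayerCache()
cache.update(torch.zeros(1, 1, 5, 8))  # 5 prefill tokens
kv_len, past_len = cache.get_mask_sizes(query_length=1)
assert (kv_len, past_len) == (6, 5)
assert isinstance(kv_len, int)
```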
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40907/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40907/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40906
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40906/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40906/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40906/events
|
https://github.com/huggingface/transformers/pull/40906
| 3,421,545,685
|
PR_kwDOCUB6oc6o07lb
| 40,906
|
[generate] misc fixes
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T10:20:36
| 2025-09-16T14:18:09
| 2025-09-16T14:18:06
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40906",
"html_url": "https://github.com/huggingface/transformers/pull/40906",
"diff_url": "https://github.com/huggingface/transformers/pull/40906.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40906.patch",
"merged_at": "2025-09-16T14:18:06"
}
|
# What does this PR do?
Fixes/todos for minor issues I encountered while working on #40833
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40906/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40906/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40905
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40905/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40905/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40905/events
|
https://github.com/huggingface/transformers/pull/40905
| 3,421,232,753
|
PR_kwDOCUB6oc6oz392
| 40,905
|
Set seed for `Glm4vIntegrationTest`
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T09:04:38
| 2025-09-16T11:01:53
| 2025-09-16T11:01:51
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40905",
"html_url": "https://github.com/huggingface/transformers/pull/40905",
"diff_url": "https://github.com/huggingface/transformers/pull/40905.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40905.patch",
"merged_at": "2025-09-16T11:01:51"
}
|
# What does this PR do?
This model has `"do_sample": true,` in its `generation_config.json`, see
https://huggingface.co/zai-org/GLM-4.1V-9B-Thinking/blob/main/generation_config.json
We need to set a seed, otherwise I will have nightmares of getting different outputs each day ... 😭
All tests in `Glm4vIntegrationTest` now pass.
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40905/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40905/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40904
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40904/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40904/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40904/events
|
https://github.com/huggingface/transformers/issues/40904
| 3,421,162,476
|
I_kwDOCUB6oc7L6svs
| 40,904
|
MXFP4 Tensor Core GEMM support in GPT-OSS for Blackwell GPUs
|
{
"login": "TheTinyTeddy",
"id": 171109504,
"node_id": "U_kgDOCjLsgA",
"avatar_url": "https://avatars.githubusercontent.com/u/171109504?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TheTinyTeddy",
"html_url": "https://github.com/TheTinyTeddy",
"followers_url": "https://api.github.com/users/TheTinyTeddy/followers",
"following_url": "https://api.github.com/users/TheTinyTeddy/following{/other_user}",
"gists_url": "https://api.github.com/users/TheTinyTeddy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TheTinyTeddy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TheTinyTeddy/subscriptions",
"organizations_url": "https://api.github.com/users/TheTinyTeddy/orgs",
"repos_url": "https://api.github.com/users/TheTinyTeddy/repos",
"events_url": "https://api.github.com/users/TheTinyTeddy/events{/privacy}",
"received_events_url": "https://api.github.com/users/TheTinyTeddy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T08:46:30
| 2025-10-26T08:02:24
| 2025-10-26T08:02:24
|
NONE
| null | null | null | null |
Hi there,
I was looking at the code (in `triton_kernels/matmul_ogs_details/_matmul_ogs.py`) and found that even when using a Blackwell GPU, the Triton kernel that implements
`acc = tl.dot_scaled(x, x_scales, x_format, w, w_scales, w_format, acc=acc, fast_math=True)`
is actually doing a BF16 GEMM rather than an MXFP4 GEMM, and `x_scales` is actually `None`, i.e. there is no quantization of the activation.
Therefore, I was wondering whether there will be support for native MXFP4 GEMM in Transformers?
Many thanks
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40904/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40904/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40903
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40903/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40903/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40903/events
|
https://github.com/huggingface/transformers/pull/40903
| 3,421,039,480
|
PR_kwDOCUB6oc6ozNNG
| 40,903
|
Fix missing num_hidden_layers attribute in T5GemmaConfig
|
{
"login": "0xjeffro",
"id": 105006121,
"node_id": "U_kgDOBkJEKQ",
"avatar_url": "https://avatars.githubusercontent.com/u/105006121?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/0xjeffro",
"html_url": "https://github.com/0xjeffro",
"followers_url": "https://api.github.com/users/0xjeffro/followers",
"following_url": "https://api.github.com/users/0xjeffro/following{/other_user}",
"gists_url": "https://api.github.com/users/0xjeffro/gists{/gist_id}",
"starred_url": "https://api.github.com/users/0xjeffro/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/0xjeffro/subscriptions",
"organizations_url": "https://api.github.com/users/0xjeffro/orgs",
"repos_url": "https://api.github.com/users/0xjeffro/repos",
"events_url": "https://api.github.com/users/0xjeffro/events{/privacy}",
"received_events_url": "https://api.github.com/users/0xjeffro/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T08:17:57
| 2025-09-17T13:17:56
| 2025-09-17T13:17:55
|
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40903",
"html_url": "https://github.com/huggingface/transformers/pull/40903",
"diff_url": "https://github.com/huggingface/transformers/pull/40903.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40903.patch",
"merged_at": null
}
|
This is a quick fix for issue #40874. The `T5GemmaConfig` class was missing the `num_hidden_layers` attribute that cache initialization expects. Added a `num_hidden_layers` property that exposes the decoder's `num_hidden_layers` value.
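A sketch of the shape of the fix, using stand-in dataclasses instead of the real `T5GemmaConfig`: a read-only property forwards the decoder sub-config's value so that code reading `config.num_hidden_layers` keeps working:

```python
from dataclasses import dataclass

@dataclass
class SubConfig:
    num_hidden_layers: int = 12

@dataclass
class EncoderDecoderConfig:
    """Illustrative stand-in for a nested encoder-decoder config."""
    encoder: SubConfig
    decoder: SubConfig

    @property
    def num_hidden_layers(self) -> int:
        # Cache initialization reads config.num_hidden_layers; forwarding the
        # decoder's value satisfies that contract without duplicating state.
        return self.decoder.num_hidden_layers

cfg = EncoderDecoderConfig(SubConfig(6), SubConfig(12))
assert cfg.num_hidden_layers == 12
```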
# What does this PR do?
Fixes #40874
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40903/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40903/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40902
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40902/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40902/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40902/events
|
https://github.com/huggingface/transformers/pull/40902
| 3,420,865,719
|
PR_kwDOCUB6oc6oyneX
| 40,902
|
Fix flaky `Gemma3nAudioFeatureExtractionTest::test_dither`
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T07:31:22
| 2025-09-16T09:00:09
| 2025-09-16T09:00:07
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40902",
"html_url": "https://github.com/huggingface/transformers/pull/40902",
"diff_url": "https://github.com/huggingface/transformers/pull/40902.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40902.patch",
"merged_at": "2025-09-16T09:00:07"
}
|
# What does this PR do?
> tests/models/gemma3n/test_feature_extraction_gemma3n.py::Gemma3nAudioFeatureExtractionTest::test_dither
has been flaky since it was added in #39059; the failure ratio is 0.72% (over 10K runs).
I ran it 50K times to get the maximal value of the difference, which could go up to `0.5`.
This PR sets the tolerance to `0.8` to avoid flakiness.
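The tolerance-picking approach can be sketched with synthetic noise (the distribution and magnitudes here are illustrative, not the actual dither statistics): measure the empirical maximum difference over many runs, then choose a tolerance with headroom above it:

```python
import random

# Simulate many runs of a dithered computation and record the per-run
# deviation, then bound it empirically.
rng = random.Random(0)
diffs = [abs(rng.gauss(0.0, 0.1)) for _ in range(50_000)]
observed_max = max(diffs)

# Pick a tolerance with headroom above the worst observed deviation
# (the PR observed ~0.5 and chose 0.8).
tolerance = 0.8
assert 0 < observed_max < tolerance
```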
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40902/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40902/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40901
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40901/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40901/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40901/events
|
https://github.com/huggingface/transformers/issues/40901
| 3,420,767,024
|
I_kwDOCUB6oc7L5MMw
| 40,901
|
Cannot fine-tune T5Gemma with Seq2SeqTrainer
|
{
"login": "Crissium",
"id": 91039086,
"node_id": "MDQ6VXNlcjkxMDM5MDg2",
"avatar_url": "https://avatars.githubusercontent.com/u/91039086?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Crissium",
"html_url": "https://github.com/Crissium",
"followers_url": "https://api.github.com/users/Crissium/followers",
"following_url": "https://api.github.com/users/Crissium/following{/other_user}",
"gists_url": "https://api.github.com/users/Crissium/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Crissium/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Crissium/subscriptions",
"organizations_url": "https://api.github.com/users/Crissium/orgs",
"repos_url": "https://api.github.com/users/Crissium/repos",
"events_url": "https://api.github.com/users/Crissium/events{/privacy}",
"received_events_url": "https://api.github.com/users/Crissium/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T07:04:53
| 2025-10-20T12:39:10
| 2025-10-20T12:39:10
|
NONE
| null | null | null | null |
### System Info
- `transformers` version: 4.56.1
- Platform: Linux-6.8.0-60-generic-x86_64-with-glibc2.39
- Python version: 3.12.11
- Huggingface_hub version: 0.34.4
- Safetensors version: 0.5.3
- Accelerate version: 1.8.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: Parallel
- Using GPU in script?: Yes
- GPU type: NVIDIA H100 NVL
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Sample code:
```python
import datasets
import datetime
import jiwer
import os
import torch
import tracto
import transformers
data_date = '0901'
training_date = datetime.datetime.now().strftime('%m%d')
os.environ['WANDB_PROJECT'] = 'ASR Error Simulation'
os.environ['WANDB_NOTES'] = f'Model: T5Gemma Small UL2, Data: Splits{data_date}'
dataset_root = f'/mnt/users_home/cpii.local/yxing/Workspace/ASR/Data/Generated/Transcripts/Splits{data_date}'
ds_train = datasets.load_dataset(os.path.join(dataset_root, 'Train'), split='train')
ds_val = datasets.load_dataset(os.path.join(dataset_root, 'Val'), split='validation')
model = transformers.T5GemmaForConditionalGeneration.from_pretrained('google/t5gemma-s-s-ul2', attn_implementation='eager')
model.generation_config.max_new_tokens = 90
tokeniser = transformers.GemmaTokenizerFast.from_pretrained('google/t5gemma-s-s-ul2')
tokeniser.model_max_length = 90
def preprocess_data(examples):
return tokeniser(
examples['gt'],
text_target=examples['predicted'],
padding='max_length',
truncation=True
)
def normalise_text(text: str) -> str:
text = text.replace('-', ' ')
return ''.join(c for c in text.lower() if not tracto.is_punct(c) or c == "'")
def compute_wer_difference(eval_pred: transformers.EvalPrediction) -> dict[str, float]:
gt_text = tokeniser.batch_decode(eval_pred.inputs, skip_special_tokens=True)
original_pred = tokeniser.batch_decode(eval_pred.label_ids, skip_special_tokens=True)
model_pred = tokeniser.batch_decode(eval_pred.predictions, skip_special_tokens=True)
original_wer = jiwer.wer(list(map(normalise_text, gt_text)), list(map(normalise_text, original_pred)))
model_wer = jiwer.wer(list(map(normalise_text, gt_text)), list(map(normalise_text, model_pred)))
return {'original_wer': original_wer, 'model_wer': model_wer, 'wer_difference': original_wer - model_wer}
data_collator = transformers.DataCollatorForSeq2Seq(tokeniser, model=model)
training_args = transformers.Seq2SeqTrainingArguments(
run_name=f'{training_date} Small T5Gemma UL2',
report_to='wandb',
output_dir=f'../Checkpoints/{training_date} Small T5Gemma UL2',
eval_strategy='epoch',
save_strategy='epoch',
logging_strategy='epoch',
include_for_metrics=['inputs'],
num_train_epochs=500,
learning_rate=1e-5,
weight_decay=0.01,
predict_with_generate=True,
per_device_train_batch_size=150,
per_device_eval_batch_size=150,
ddp_find_unused_parameters=False
)
with training_args.main_process_first(desc='dataset map pre-processing'):
ds_train = ds_train.map(preprocess_data, batched=True)
ds_val = ds_val.map(preprocess_data, batched=True)
ds_train = ds_train.remove_columns([col for col in ds_train.column_names if col not in ['input_ids', 'attention_mask', 'labels']])
ds_val = ds_val.remove_columns([col for col in ds_val.column_names if col not in ['input_ids', 'attention_mask', 'labels']])
trainer = transformers.Seq2SeqTrainer(
model=model,
args=training_args,
data_collator=data_collator,
train_dataset=ds_train,
eval_dataset=ds_val,
compute_metrics=compute_wer_difference
)
trainer.train()
torch.distributed.destroy_process_group()
```
Sample data:
```json
{"source":"Common Voice","filename":"common_voice_en_41645384.mp3","gt":"Can all this be happening?","predicted":"Can all this be happening?"}
{"source":"Common Voice","filename":"common_voice_en_39702948.mp3","gt":"Whether he represented winter or summer is not quite clear.","predicted":"Whether he represented winter or summer is not quite clear."}
{"source":"Common Voice","filename":"common_voice_en_38381818.mp3","gt":"They are believed to have spoken an Iberian language.","predicted":"They are believed to have spoken an Iberian language."}
{"source":"Common Voice","filename":"common_voice_en_32260000.mp3","gt":"Glas Bheinn lies between Ullapool and Durness.","predicted":"The last binin lies between Olapu and Durnes."}
{"source":"Common Voice","filename":"common_voice_en_27647177.mp3","gt":"Cuba is in the fourth stage of demographic transition.","predicted":"Cuba is in the fourth stage of demographic transition."}
{"source":"Common Voice","filename":"common_voice_en_22373557.mp3","gt":"The terminology is made confusing by the etymology of these words.","predicted":"The terminology is made confusing by the etymology of these words."}
{"source":"Common Voice","filename":"common_voice_en_37213325.mp3","gt":"There is also the City Island Seaside Trolley run by the Bronx Tourism Council.","predicted":"There is also the City Island Seaside Trolley run by the Bronx Tourism Council."}
{"source":"Common Voice","filename":"common_voice_en_19687576.mp3","gt":"These went from Mauchline to the Isle of Wight.","predicted":"These went from McElwain to the Italy of Wight."}
{"source":"Common Voice","filename":"common_voice_en_30574664.mp3","gt":"The crater is very old and is crisscrossed by chains of secondary impact craters.","predicted":"The crater is very old and is crisscrossed by chains of secondary impact craters."}
{"source":"Common Voice","filename":"common_voice_en_19794969.mp3","gt":"Its motto, on the nameplate below the title, is One of America's Great Newspapers.","predicted":"Its motto on the nameplate below the title is one of America's great newspapers."}
```
Error:
```
[rank1]: Traceback (most recent call last):
[rank1]: File "/mnt/users_home/cpii.local/yxing/Workspace/ASR/Noise/src/train.py", line 86, in <module>
[rank1]: trainer.train()
[rank1]: File "/mnt/users_home/cpii.local/yxing/miniconda3/envs/g/lib/python3.12/site-packages/transformers/trainer.py", line 2328, in train
[rank1]: return inner_training_loop(
[rank1]: ^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/mnt/users_home/cpii.local/yxing/miniconda3/envs/g/lib/python3.12/site-packages/transformers/trainer.py", line 2788, in _inner_training_loop
[rank1]: self._maybe_log_save_evaluate(
[rank1]: File "/mnt/users_home/cpii.local/yxing/miniconda3/envs/g/lib/python3.12/site-packages/transformers/trainer.py", line 3227, in _maybe_log_save_evaluate
[rank1]: metrics = self._evaluate(trial, ignore_keys_for_eval)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/mnt/users_home/cpii.local/yxing/miniconda3/envs/g/lib/python3.12/site-packages/transformers/trainer.py", line 3176, in _evaluate
[rank1]: metrics = self.evaluate(ignore_keys=ignore_keys_for_eval)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/mnt/users_home/cpii.local/yxing/miniconda3/envs/g/lib/python3.12/site-packages/transformers/trainer_seq2seq.py", line 191, in evaluate
[rank1]: return super().evaluate(eval_dataset, ignore_keys=ignore_keys, metric_key_prefix=metric_key_prefix)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/mnt/users_home/cpii.local/yxing/miniconda3/envs/g/lib/python3.12/site-packages/transformers/trainer.py", line 4469, in evaluate
[rank1]: output = eval_loop(
[rank1]: ^^^^^^^^^^
[rank1]: File "/mnt/users_home/cpii.local/yxing/miniconda3/envs/g/lib/python3.12/site-packages/transformers/trainer.py", line 4665, in evaluation_loop
[rank1]: losses, logits, labels = self.prediction_step(model, inputs, prediction_loss_only, ignore_keys=ignore_keys)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/mnt/users_home/cpii.local/yxing/miniconda3/envs/g/lib/python3.12/site-packages/transformers/trainer_seq2seq.py", line 327, in prediction_step
[rank1]: generated_tokens = self.model.generate(**generation_inputs, **gen_kwargs)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/mnt/users_home/cpii.local/yxing/miniconda3/envs/g/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
[rank1]: return func(*args, **kwargs)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/mnt/users_home/cpii.local/yxing/miniconda3/envs/g/lib/python3.12/site-packages/transformers/generation/utils.py", line 2399, in generate
[rank1]: self._prepare_cache_for_generation(
[rank1]: File "/mnt/users_home/cpii.local/yxing/miniconda3/envs/g/lib/python3.12/site-packages/transformers/generation/utils.py", line 2007, in _prepare_cache_for_generation
[rank1]: else EncoderDecoderCache(DynamicCache(**dynamic_cache_kwargs), DynamicCache(**dynamic_cache_kwargs))
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/mnt/users_home/cpii.local/yxing/miniconda3/envs/g/lib/python3.12/site-packages/transformers/cache_utils.py", line 1018, in __init__
[rank1]: for _ in range(config.num_hidden_layers)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: File "/mnt/users_home/cpii.local/yxing/miniconda3/envs/g/lib/python3.12/site-packages/transformers/configuration_utils.py", line 207, in __getattribute__
[rank1]: return super().__getattribute__(key)
[rank1]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]: AttributeError: 'T5GemmaConfig' object has no attribute 'num_hidden_layers'
```
### Expected behavior
`Seq2SeqTrainer` works with `T5GemmaForConditionalGeneration`.
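Until a fix lands, the failing lookup can be worked around by falling back to the nested sub-config. The sketch below is a self-contained toy: `SimpleNamespace` stands in for `T5GemmaConfig`, and the nested `encoder`/`decoder` attribute layout is an assumption, not verified against the actual class:

```python
from types import SimpleNamespace

# Minimal stand-in for the failing lookup: a config object that, like
# T5GemmaConfig (assumed), keeps num_hidden_layers only in nested sub-configs.
config = SimpleNamespace(
    encoder=SimpleNamespace(num_hidden_layers=8),
    decoder=SimpleNamespace(num_hidden_layers=8),
)

def layer_count(config):
    """Read the layer count, falling back to the decoder sub-config
    when the top-level attribute is missing (attribute names assumed)."""
    n = getattr(config, "num_hidden_layers", None)
    if n is None:
        n = config.decoder.num_hidden_layers
    return n

# The cache setup in cache_utils.py effectively does
# range(config.num_hidden_layers), which raises AttributeError on this config;
# the fallback above resolves it.
print(layer_count(config))  # 8
```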
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40901/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40901/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40900
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40900/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40900/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40900/events
|
https://github.com/huggingface/transformers/pull/40900
| 3,420,090,946
|
PR_kwDOCUB6oc6ov9EB
| 40,900
|
Generation: meta-safe _prepare_special_tokens + regression tests
|
{
"login": "moonrunnerkc",
"id": 125813226,
"node_id": "U_kgDOB3_B6g",
"avatar_url": "https://avatars.githubusercontent.com/u/125813226?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/moonrunnerkc",
"html_url": "https://github.com/moonrunnerkc",
"followers_url": "https://api.github.com/users/moonrunnerkc/followers",
"following_url": "https://api.github.com/users/moonrunnerkc/following{/other_user}",
"gists_url": "https://api.github.com/users/moonrunnerkc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/moonrunnerkc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/moonrunnerkc/subscriptions",
"organizations_url": "https://api.github.com/users/moonrunnerkc/orgs",
"repos_url": "https://api.github.com/users/moonrunnerkc/repos",
"events_url": "https://api.github.com/users/moonrunnerkc/events{/privacy}",
"received_events_url": "https://api.github.com/users/moonrunnerkc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 9258341780,
"node_id": "LA_kwDOCUB6oc8AAAACJ9cVlA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Code%20agent%20slop",
"name": "Code agent slop",
"color": "C59579",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T02:30:26
| 2025-09-16T12:26:48
| 2025-09-16T12:26:48
|
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40900",
"html_url": "https://github.com/huggingface/transformers/pull/40900",
"diff_url": "https://github.com/huggingface/transformers/pull/40900.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40900.patch",
"merged_at": null
}
|
# Summary
## What / Why
This PR makes `generation/utils.py::_prepare_special_tokens` **meta-safe**.
In assisted decoding, special-token tensors could be created on the `meta` device and then accessed via `.item()` or `.cpu().numpy()`, which triggers:
`RuntimeError: Tensor.item() cannot be called on meta tensors`
This patch avoids unsafe operations by rebuilding safe tensors on the requested device using Python IDs from `GenerationConfig`, and introduces a clear error type for unsupported cases.
- ✅ Adds **`MetaSafeTensorError`** for explicit failures instead of opaque framework errors.
- ✅ Hardens special-token setup so assisted decoding succeeds under concurrency and in meta-aware pipelines.
---
# Scope
### `src/transformers/generation/utils.py`
- Patch `_prepare_special_tokens` to be meta-safe.
- Fix internal helper for ID → tensor conversion (no `.item()` or `.cpu().numpy()` on meta).
- Add `MetaSafeTensorError` (subclass of `RuntimeError`) for unsupported meta ops.
### `tests/test_generation_meta.py`
- Add regression tests covering CPU path, meta path, output consistency, and no config drift.
> **Note:** No changes to public APIs. No behavioral change for non-meta paths.
---
# Details of the Fix
- Special token IDs provided as tensors on `meta` are **not moved or read directly**.
- Instead, fresh scalar tensors are **reconstructed on the requested device** using the underlying Python IDs from config.
- `.item()` / `.cpu().numpy()` are never called on meta tensors.
- If a non-scalar meta tensor is encountered without a safe conversion path, we raise `MetaSafeTensorError` with a descriptive message.
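The conversion rule above can be illustrated with a torch-free toy. This is not the PR's actual code: `FakeTensor` and its attributes are invented stand-ins for `torch.Tensor` behavior, used only to show the control flow.

```python
class FakeTensor:
    """Toy stand-in for a 0-dim torch tensor that knows its device."""
    def __init__(self, value, device="cpu"):
        self.value, self.device = value, device

    def item(self):
        # Mirrors torch's behavior: reading data from a meta tensor fails.
        if self.device == "meta":
            raise RuntimeError("Tensor.item() cannot be called on meta tensors")
        return self.value


class MetaSafeTensorError(RuntimeError):
    """Explicit error for meta tensors with no safe conversion path."""


def to_scalar_on_device(token_id, device):
    """Normalize a special-token id (int or tensor) to a tensor on `device`.

    Meta tensors are never read; the caller must supply the Python id
    (e.g. from the generation config) so a fresh tensor can be built.
    """
    if isinstance(token_id, FakeTensor):
        if token_id.device == "meta":
            raise MetaSafeTensorError(
                "special-token tensor lives on meta; pass the Python id instead"
            )
        token_id = token_id.item()
    return FakeTensor(int(token_id), device=device)


print(to_scalar_on_device(2, "cuda:0").value)  # 2
```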
---
# Regression Tests
New tests in `tests/test_generation_meta.py`:
- **`test_prepare_special_tokens_cpu`** – CPU tensors work as before.
- **`test_prepare_special_tokens_meta`** – Meta tensors no longer raise; function completes.
- **`test_prepare_special_tokens_consistency`** – Outputs match between CPU and meta paths.
- **`test_no_drift_after_prepare`** – Confirms `GenerationConfig` is not mutated.
✅ All tests pass locally and in CI (`ubuntu-latest`, Python 3.10 & 3.12).
---
# Related
- Refs: #40739, #40740 (context: assistant-side parameters like `num_assistant_tokens` and assisted decoding flows).
- CC: @geoffrey-young
---
# Backward Compatibility
- No user-visible change for non-meta execution.
- Meta-aware execution paths are now robust: assisted decoding no longer crashes on `.item()` from meta tensors.
---
# Performance
- Negligible overhead — only touches scalar special-token handling during generation setup.
- No extra allocations beyond tiny scalar tensors when needed.
---
# Validation
- Local: `pytest -q tests/test_generation_meta.py` (PASS)
- CI (GitHub Actions, ubuntu-latest, Py3.10/3.12): full test suite including the new meta safety tests (PASS)
- Concurrency probes: assisted decoding succeeds with no config drift.
# Checklist
- Existing tests pass
- New tests added
- Ran `make fixup` (format/quality) locally
- No API changes / docs not required
- Minimal, well-scoped patch with regression coverage
# Notes for Reviewers
- Change is intentionally minimal and defensive only where necessary.
- `MetaSafeTensorError` makes failures explicit; happy to relocate to a shared errors module if preferred.
- Can also add a doc comment in `GenerationConfig` noting that special token IDs may be passed as ints or tensors (including meta), and are normalized during generation.
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40900/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40900/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40899
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40899/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40899/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40899/events
|
https://github.com/huggingface/transformers/pull/40899
| 3,419,941,186
|
PR_kwDOCUB6oc6oveup
| 40,899
|
Don't report `num_input_tokens_seen` when disabled
|
{
"login": "qgallouedec",
"id": 45557362,
"node_id": "MDQ6VXNlcjQ1NTU3MzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/45557362?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qgallouedec",
"html_url": "https://github.com/qgallouedec",
"followers_url": "https://api.github.com/users/qgallouedec/followers",
"following_url": "https://api.github.com/users/qgallouedec/following{/other_user}",
"gists_url": "https://api.github.com/users/qgallouedec/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qgallouedec/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qgallouedec/subscriptions",
"organizations_url": "https://api.github.com/users/qgallouedec/orgs",
"repos_url": "https://api.github.com/users/qgallouedec/repos",
"events_url": "https://api.github.com/users/qgallouedec/events{/privacy}",
"received_events_url": "https://api.github.com/users/qgallouedec/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-16T00:48:27
| 2025-09-18T05:04:20
| 2025-09-18T05:04:20
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40899",
"html_url": "https://github.com/huggingface/transformers/pull/40899",
"diff_url": "https://github.com/huggingface/transformers/pull/40899.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40899.patch",
"merged_at": null
}
|
```python
>>> bool("no")
True
```
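The snippet above is the whole point: in Python, every non-empty string is truthy, so an option "disabled" with a string such as `"no"` still passes a plain `if option:` check. A generic defensive parse (an illustrative sketch, not this PR's code) looks like:

```python
def str_to_bool(value):
    """Interpret common string spellings of a boolean; raise on anything else."""
    if isinstance(value, bool):
        return value
    truthy = {"1", "true", "yes", "y", "on"}
    falsy = {"0", "false", "no", "n", "off"}
    v = str(value).strip().lower()
    if v in truthy:
        return True
    if v in falsy:
        return False
    raise ValueError(f"not a boolean string: {value!r}")

print(str_to_bool("no"))  # False
print(bool("no"))         # True -- the pitfall shown above
```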
|
{
"login": "qgallouedec",
"id": 45557362,
"node_id": "MDQ6VXNlcjQ1NTU3MzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/45557362?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qgallouedec",
"html_url": "https://github.com/qgallouedec",
"followers_url": "https://api.github.com/users/qgallouedec/followers",
"following_url": "https://api.github.com/users/qgallouedec/following{/other_user}",
"gists_url": "https://api.github.com/users/qgallouedec/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qgallouedec/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qgallouedec/subscriptions",
"organizations_url": "https://api.github.com/users/qgallouedec/orgs",
"repos_url": "https://api.github.com/users/qgallouedec/repos",
"events_url": "https://api.github.com/users/qgallouedec/events{/privacy}",
"received_events_url": "https://api.github.com/users/qgallouedec/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40899/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40899/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40898
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40898/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40898/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40898/events
|
https://github.com/huggingface/transformers/pull/40898
| 3,419,701,618
|
PR_kwDOCUB6oc6ourMO
| 40,898
|
Adding [T5/MT5/UMT5]EncoderForSequenceClassification
|
{
"login": "cbhyphen",
"id": 12734117,
"node_id": "MDQ6VXNlcjEyNzM0MTE3",
"avatar_url": "https://avatars.githubusercontent.com/u/12734117?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cbhyphen",
"html_url": "https://github.com/cbhyphen",
"followers_url": "https://api.github.com/users/cbhyphen/followers",
"following_url": "https://api.github.com/users/cbhyphen/following{/other_user}",
"gists_url": "https://api.github.com/users/cbhyphen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cbhyphen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cbhyphen/subscriptions",
"organizations_url": "https://api.github.com/users/cbhyphen/orgs",
"repos_url": "https://api.github.com/users/cbhyphen/repos",
"events_url": "https://api.github.com/users/cbhyphen/events{/privacy}",
"received_events_url": "https://api.github.com/users/cbhyphen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-09-15T22:21:09
| 2025-10-15T04:25:17
| null |
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40898",
"html_url": "https://github.com/huggingface/transformers/pull/40898",
"diff_url": "https://github.com/huggingface/transformers/pull/40898.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40898.patch",
"merged_at": null
}
|
# What does this PR do?
This PR adds an encoder-only sequence classifier for T5. Inspiration for this comes from the following paper: ["Sentence-T5: Scalable Sentence Encoders from Pre-trained Text-to-Text Models"](https://arxiv.org/abs/2108.08877). The mean of final hidden states is used as the sentence representation (best results from paper). For `t5-small`, the encoder-only classifier is nearly half the size and takes nearly a third of the time for a forward pass compared to the [encoder-decoder classifier ](https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/modeling_t5.py#L1951).
Note that I tried to include this new class in `MODEL_FOR_SEQUENCE_CLASSIFICATION_MAPPING_NAMES` in [modeling_auto.py](https://github.com/huggingface/transformers/blob/main/src/transformers/models/auto/modeling_auto.py#L1188) but I could not get around one failing test `test_load_with_mismatched_shapes` in [test_modeling_common.py](https://github.com/huggingface/transformers/blob/main/tests/test_modeling_common.py#L3161). That test seems to invoke the model as a decoder and fails [here in the T5Stack class](https://github.com/huggingface/transformers/tree/main/src/transformers/models/t5#L991) with the following error: `ValueError: You have to specify either decoder_input_ids or decoder_inputs_embeds`. Because of this, I did not add to `modeling_auto.py` but if there is a need to do so, please let me know (any advice on how-to would be appreciated). Having noted that, this PR does include a small test in `test_modeling_t5.py`.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@ArthurZucker
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40898/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40898/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/40897
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40897/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40897/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40897/events
|
https://github.com/huggingface/transformers/pull/40897
| 3,418,910,345
|
PR_kwDOCUB6oc6osBOM
| 40,897
|
docs: standardized GIT model card according to the issue #36979
|
{
"login": "Big-Marvel",
"id": 145830550,
"node_id": "U_kgDOCLEylg",
"avatar_url": "https://avatars.githubusercontent.com/u/145830550?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Big-Marvel",
"html_url": "https://github.com/Big-Marvel",
"followers_url": "https://api.github.com/users/Big-Marvel/followers",
"following_url": "https://api.github.com/users/Big-Marvel/following{/other_user}",
"gists_url": "https://api.github.com/users/Big-Marvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Big-Marvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Big-Marvel/subscriptions",
"organizations_url": "https://api.github.com/users/Big-Marvel/orgs",
"repos_url": "https://api.github.com/users/Big-Marvel/repos",
"events_url": "https://api.github.com/users/Big-Marvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/Big-Marvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T17:45:34
| 2025-09-18T17:25:07
| 2025-09-18T17:25:06
|
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40897",
"html_url": "https://github.com/huggingface/transformers/pull/40897",
"diff_url": "https://github.com/huggingface/transformers/pull/40897.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40897.patch",
"merged_at": null
}
|
# What does this PR do?
This PR adds a standardized model card for the **Generative Image-to-Text Transformer (GIT)** following the ongoing documentation cleanup and model card standardization effort.
Specifically, it:
* Creates `git.md` with standardized structure (badges placeholder, model overview, usage examples, quantization, attention visualization, notes, and autodoc sections).
* Adds runnable code snippets for pipelines (`image-to-text`, `visual-question-answering`), `AutoModel`, and CLI usage.
* Includes an **INT8 quantization example** with 🤗 Optimum.
* Adds an **attention visualization example** (using raw attention maps, since `AttentionMaskVisualizer` doesn’t support multimodal GIT).
* Fills in the **Notes** section with details on multimodal input, preprocessing, biases, efficiency, and license considerations, similar to other model doc updates.
* Standardizes autodoc placeholders for `GitVisionConfig`, `GitModel`, `GitForCausalLM`, etc.
This improves discoverability and usability of GIT models in the Transformers docs and aligns its card with recent model doc PRs.
Related to #36979
---
## Before submitting
* [x] This PR improves the docs.
* [x] I have read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request).
* [x] This was discussed in the Git model card standardization issue (link above).
* [x] Documentation updated with runnable code snippets.
* [ ] No tests required (doc-only change).
---
## Who can review?
* Documentation: @stevhliu
* Vision models: @amyeroberts, @qubvel
* Generate (vision-language models): @zucchini-nlp
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40897/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40897/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40896
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40896/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40896/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40896/events
|
https://github.com/huggingface/transformers/pull/40896
| 3,418,867,515
|
PR_kwDOCUB6oc6or37h
| 40,896
|
Remove reference to subclasses in modernbert
|
{
"login": "lematt1991",
"id": 13142923,
"node_id": "MDQ6VXNlcjEzMTQyOTIz",
"avatar_url": "https://avatars.githubusercontent.com/u/13142923?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lematt1991",
"html_url": "https://github.com/lematt1991",
"followers_url": "https://api.github.com/users/lematt1991/followers",
"following_url": "https://api.github.com/users/lematt1991/following{/other_user}",
"gists_url": "https://api.github.com/users/lematt1991/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lematt1991/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lematt1991/subscriptions",
"organizations_url": "https://api.github.com/users/lematt1991/orgs",
"repos_url": "https://api.github.com/users/lematt1991/repos",
"events_url": "https://api.github.com/users/lematt1991/events{/privacy}",
"received_events_url": "https://api.github.com/users/lematt1991/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T17:31:30
| 2025-09-17T15:36:55
| 2025-09-17T15:36:55
|
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40896",
"html_url": "https://github.com/huggingface/transformers/pull/40896",
"diff_url": "https://github.com/huggingface/transformers/pull/40896.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40896.patch",
"merged_at": null
}
|
# What does this PR do?
`ModernBertPreTrainedModel` currently [references](https://github.com/huggingface/transformers/blob/main/src/transformers/models/modernbert/modular_modernbert.py#L805-L815) its subclasses when initializing weights. This breaks things when you try to create a new model that inherits from this class and uses `utils/modular_model_converter.py`, since the converter will pull in the subclasses as dependencies, for example:
```python
# This causes an error because `ModernBertPreTrainedModel` hasn't been defined yet!
class ModernBertForSequenceClassification(ModernBertPreTrainedModel):
    ...

class ModernBertPreTrainedModel(PreTrainedModel):
    ...
```
This PR removes all references to subclasses inside `ModernBertPreTrainedModel`, breaking the circular reference.
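The ordering problem can be reproduced without transformers at all. Below is a minimal, transformers-free sketch (all class names illustrative, not the real API) of why a base class should not reference its subclasses, and one way to avoid it: dispatch on module structure rather than on subclass identity.

```python
# Minimal sketch of the circular-reference problem described above.
# All names here are illustrative, not the real transformers API.

class Base:
    def init_weights(self, module):
        # BAD (what the PR removes): isinstance checks against subclasses
        # force those subclasses to exist before Base is fully defined.
        # GOOD: dispatch on structure instead of subclass identity.
        if hasattr(module, "classifier_weight"):
            module.classifier_weight = 0.0  # pretend-initialize

class Child(Base):
    def __init__(self):
        self.classifier_weight = None

child = Child()
child.init_weights(child)
print(child.classifier_weight)  # 0.0
```

With this pattern the base class compiles on its own, so a modular converter can emit it before any subclass without a `NameError`.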
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
CC @ArthurZucker and @tomaarsen who reviewed #35158
|
{
"login": "lematt1991",
"id": 13142923,
"node_id": "MDQ6VXNlcjEzMTQyOTIz",
"avatar_url": "https://avatars.githubusercontent.com/u/13142923?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lematt1991",
"html_url": "https://github.com/lematt1991",
"followers_url": "https://api.github.com/users/lematt1991/followers",
"following_url": "https://api.github.com/users/lematt1991/following{/other_user}",
"gists_url": "https://api.github.com/users/lematt1991/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lematt1991/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lematt1991/subscriptions",
"organizations_url": "https://api.github.com/users/lematt1991/orgs",
"repos_url": "https://api.github.com/users/lematt1991/repos",
"events_url": "https://api.github.com/users/lematt1991/events{/privacy}",
"received_events_url": "https://api.github.com/users/lematt1991/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40896/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40896/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40895
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40895/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40895/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40895/events
|
https://github.com/huggingface/transformers/pull/40895
| 3,418,864,216
|
PR_kwDOCUB6oc6or3Nd
| 40,895
|
[generate] remove docs of a feature that no longer exists
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T17:30:16
| 2025-09-15T18:22:41
| 2025-09-15T18:22:32
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40895",
"html_url": "https://github.com/huggingface/transformers/pull/40895",
"diff_url": "https://github.com/huggingface/transformers/pull/40895.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40895.patch",
"merged_at": "2025-09-15T18:22:32"
}
|
# What does this PR do?
Addresses [this comment](https://github.com/huggingface/transformers/pull/36685#issuecomment-3289824309): end-to-end generation is no longer supported, so let's remove its docs.
(Thank you @vfdev-5)
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40895/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40895/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40894
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40894/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40894/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40894/events
|
https://github.com/huggingface/transformers/pull/40894
| 3,418,840,200
|
PR_kwDOCUB6oc6orx93
| 40,894
|
Chat response parsing
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T17:21:48
| 2025-10-28T13:57:00
| 2025-10-21T16:26:18
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40894",
"html_url": "https://github.com/huggingface/transformers/pull/40894",
"diff_url": "https://github.com/huggingface/transformers/pull/40894.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40894.patch",
"merged_at": "2025-10-21T16:26:18"
}
|
This PR is a replacement for #39609. The idea is that models can include a message schema, allowing model output to be parsed into a structured form. The original plan was to allow parsing of the entire chat history, essentially the inverse operation of `apply_chat_template`, but the schemas involved were too complex and there was no realistic hope that users would be able to write them!
This PR simplifies things - we focus only on parsing the output generated by the model. This is mainly relevant for **tool calling** and **chain of thought** models, both of which emit structured output that often needs manual handling before it can be appended to the chat.
The output schema is stored as a key on the tokenizer. It consists of a JSON schema, representing the structure of messages emitted by the model, with additional `x-` keys that indicate how parsing should be performed. Parsing is mostly done through regexes, but there is also support for common tool call formats like `json` to be directly parsed without you having to write an entire JSON regex parser :sweat_smile:
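To make the idea concrete, here is a toy, self-contained sketch of a regex-driven response parser. The field names (`x-regex`, `x-parser`) and the standalone `parse_response` function are assumptions for illustration only, not necessarily the final spec shipped in this PR.

```python
import json
import re

# Hypothetical schema: an "x-regex" extracts the tool-call payload, and
# "x-parser" says how to decode it. Plain JSON-schema keys describe the
# structure of the parsed result.
schema = {
    "type": "object",
    "properties": {"tool_call": {"type": "object"}},
    "x-regex": r"<tool_call>(?P<tool_call>.*?)</tool_call>",
    "x-parser": "json",
}

def parse_response(text, schema):
    match = re.search(schema["x-regex"], text, flags=re.DOTALL)
    if match is None:
        return {}
    payload = match.group("tool_call")
    if schema.get("x-parser") == "json":
        # Built-in JSON decoding, so schema authors never hand-write
        # a JSON grammar as a regex.
        payload = json.loads(payload)
    return {"tool_call": payload}

out = parse_response(
    '<tool_call>{"name": "get_weather", "arguments": {"city": "Paris"}}</tool_call>',
    schema,
)
# out["tool_call"]["name"] == "get_weather"
```

The real implementation attaches the schema to the tokenizer and supports more parsing directives, but the regex-plus-builtin-decoder split above is the core shape.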
Work to do:
- [x] Actually move parsing onto the tokenizer
- [x] Add support to TextGenerationPipeline
- [x] Add load/saving of output schemas
- [x] Document, document, document (50% done)
- [x] Figure out what we're calling it ("chat parsing"?)
- [x] Make sure extra fields don't break older versions
- [x] ~Support parsing in `Processor` classes too~ (Will move to separate PR)
- [x] ~Support parsing in `ImageTextToText` pipeline~ (Will move to separate PR)
- [x] Write schemas for some popular models
- [x] GPT-OSS
- [x] Cohere
- [x] ERNIE
- [x] ~Deepseek~ (tool calling isn't working in their template, will fix after)
- [x] SmolLM3
- [x] Qwen3
- [x] Qwen3-coder (requires `xml` support)
Documentation to do:
- [x] Expand `parse_response` explanation and show how it works with `TextGenerationPipeline`
- [x] Move the writing guide to a separate doc, and expand it?
- [x] Write the reference for the allowed `x-` fields
- [ ] Add method docstring and make sure it shows up in the docs
Open questions:
- [ ] Fold `chat_parsing_utils.py` into `chat_template_utils.py`?
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40894/reactions",
"total_count": 3,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 3
}
|
https://api.github.com/repos/huggingface/transformers/issues/40894/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40893
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40893/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40893/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40893/events
|
https://github.com/huggingface/transformers/pull/40893
| 3,418,830,856
|
PR_kwDOCUB6oc6orv5o
| 40,893
|
[cache] Merge static sliding and static chunked layer
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T17:18:24
| 2025-09-16T10:11:28
| 2025-09-16T09:41:20
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40893",
"html_url": "https://github.com/huggingface/transformers/pull/40893",
"diff_url": "https://github.com/huggingface/transformers/pull/40893.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40893.patch",
"merged_at": "2025-09-16T09:41:20"
}
|
# What does this PR do?
As per the title. As discussed quite a few times, the two layers are exactly the same, except that the chunked version is more general: it can handle an arbitrary number of new tokens even after prefill (e.g. prefill caching, chat continuation, etc.).
This PR merges them, keeping only the more general version, which broadens SlidingWindowLayer support to the aforementioned use cases!
Thus static and dynamic caches can now be used in exactly the same way, in full generality.
cc @gante @manueldeprada as well for viz! Finally merging them!
I made sure slow tests on fully sliding (Mistral) and hybrid (Gemma2) models are still fine!
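The distinction can be sketched without torch. The toy, list-based cache below (not the real implementation) uses a chunked-style update: it accepts any number of new tokens per call and then crops to the window, which is exactly why the more general version subsumes the one-token-per-step variant.

```python
class SlidingWindowLayer:
    """Toy sliding-window KV cache: keeps only the last `window` entries.

    Unlike an update that assumes exactly one new token per call, this
    handles arbitrary chunks, so prefill caching and chat continuation
    work with the same code path.
    """

    def __init__(self, window):
        self.window = window
        self.keys = []

    def update(self, new_keys):
        # Accept any number of new tokens, then crop to the window.
        self.keys.extend(new_keys)
        self.keys = self.keys[-self.window:]
        return self.keys

layer = SlidingWindowLayer(window=4)
layer.update([1, 2, 3])   # prefill
layer.update([4, 5])      # multi-token continuation (the hard case)
print(layer.keys)         # [2, 3, 4, 5]
```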
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40893/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40893/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40892
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40892/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40892/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40892/events
|
https://github.com/huggingface/transformers/pull/40892
| 3,418,663,099
|
PR_kwDOCUB6oc6orK2U
| 40,892
|
Harmonize CacheLayer names
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T16:28:05
| 2025-09-16T10:14:39
| 2025-09-16T10:14:12
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40892",
"html_url": "https://github.com/huggingface/transformers/pull/40892",
"diff_url": "https://github.com/huggingface/transformers/pull/40892.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40892.patch",
"merged_at": "2025-09-16T10:14:12"
}
|
# What does this PR do?
As per the title. Since it's only used internally in the caches, it does not necessarily need a deprecation cycle where we keep the old names, IMO. We can do one, however; I'll let you judge @ArthurZucker
cc @gante and @manueldeprada as well, we talked about it before!
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40892/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 2,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40892/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40891
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40891/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40891/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40891/events
|
https://github.com/huggingface/transformers/issues/40891
| 3,418,263,341
|
I_kwDOCUB6oc7Lvo8t
| 40,891
|
Need help for Applying Visual Prompt Tuning with Qwen2.5-VL vision
|
{
"login": "davidan208",
"id": 37769067,
"node_id": "MDQ6VXNlcjM3NzY5MDY3",
"avatar_url": "https://avatars.githubusercontent.com/u/37769067?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/davidan208",
"html_url": "https://github.com/davidan208",
"followers_url": "https://api.github.com/users/davidan208/followers",
"following_url": "https://api.github.com/users/davidan208/following{/other_user}",
"gists_url": "https://api.github.com/users/davidan208/gists{/gist_id}",
"starred_url": "https://api.github.com/users/davidan208/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/davidan208/subscriptions",
"organizations_url": "https://api.github.com/users/davidan208/orgs",
"repos_url": "https://api.github.com/users/davidan208/repos",
"events_url": "https://api.github.com/users/davidan208/events{/privacy}",
"received_events_url": "https://api.github.com/users/davidan208/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5769473378,
"node_id": "LA_kwDOCUB6oc8AAAABV-MtYg",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Vision",
"name": "Vision",
"color": "C079EF",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T14:37:15
| 2025-10-25T08:02:15
| 2025-10-25T08:02:15
|
NONE
| null | null | null | null |
Hi everyone,
I am trying to add a soft prompt to the Qwen2.5-VL vision tower.
Here is my code:
```python
from transformers import Qwen2_5_VLModel, Qwen2_5_VLForConditionalGeneration, AutoProcessor
from transformers.models.qwen2_5_vl.modeling_qwen2_5_vl import (
Qwen2_5_VisionTransformerPretrainedModel,
Qwen2_5_VLTextModel,
Qwen2_5_VLPreTrainedModel
)
from transformers.models.qwen2_5_vl.configuration_qwen2_5_vl import (
Qwen2_5_VLVisionConfig,
Qwen2_5_VLConfig,
Qwen2_5_VLTextConfig
)
import torch
import torch.nn as nn
import torch.nn.functional as F
from qwen_vl_utils import process_vision_info
import os
class custom_Qwen2_5_VLVisionConfig(Qwen2_5_VLVisionConfig):
def __init__(self, soft_prompt = None, **kwargs):
super().__init__(**kwargs)
self.soft_prompt = soft_prompt
class custom_Qwen2_5_VLConfig(Qwen2_5_VLConfig):
sub_configs = {"vision_config": custom_Qwen2_5_VLVisionConfig, "text_config": Qwen2_5_VLTextConfig}
class custom_Qwen2_5_VLPreTrainedModel(Qwen2_5_VLPreTrainedModel):
config: custom_Qwen2_5_VLConfig
class custom_Qwen2_5_VisionTransformerPretrainedModel( custom_Qwen2_5_VLPreTrainedModel, Qwen2_5_VisionTransformerPretrainedModel):
config: custom_Qwen2_5_VLVisionConfig
def __init__(self, config, *inputs, **kwargs) -> None:
super().__init__(config, *inputs, **kwargs)
visual_soft_prompt = config.soft_prompt
base_dir = config.recall_path
if isinstance(visual_soft_prompt, list):
visual_soft_prompt = [os.path.normpath(os.path.join(base_dir, path)) for path in visual_soft_prompt]
if len(visual_soft_prompt) != 4:
raise ValueError("This is custom made for Qwen2.5-VL only")
sp7 = torch.load(visual_soft_prompt[0], map_location = "cpu", weights_only= True)
sp15 = torch.load(visual_soft_prompt[1], map_location = "cpu", weights_only= True)
sp23 = torch.load(visual_soft_prompt[2], map_location = "cpu", weights_only= True)
sp31 = torch.load(visual_soft_prompt[3], map_location = "cpu", weights_only= True)
print("=" * 50)
print(any(torch.isnan(tensor).any() for tensor in [sp7, sp15, sp23, sp31])) # False
print("=" * 50)
temp_shape = sp7.shape
if any(tensor_.shape != temp_shape for tensor_ in [sp7, sp15, sp23, sp31]):
raise ValueError("Soft prompts got different shape")
self.register_parameter('soft_prompt_visual_layer_7', nn.Parameter(sp7))
self.register_parameter('soft_prompt_visual_layer_15', nn.Parameter(sp15))
self.register_parameter('soft_prompt_visual_layer_23', nn.Parameter(sp23))
self.register_parameter('soft_prompt_visual_layer_31', nn.Parameter(sp31))
for name, param in self.named_parameters():
if 'soft_prompt' in name:
print(f" {name}: NaN={torch.isnan(param).any()}")
self.soft_prompts = {
"layer_7": self.soft_prompt_visual_layer_7,
"layer_15": self.soft_prompt_visual_layer_15,
"layer_23": self.soft_prompt_visual_layer_23,
"layer_31": self.soft_prompt_visual_layer_31,
}
self.num_prompts = temp_shape[0]
def forward(self, hidden_states: torch.Tensor, grid_thw: torch.Tensor, **kwargs) -> torch.Tensor:
hidden_states = self.patch_embed(hidden_states)
rotary_pos_emb = self.rot_pos_emb(grid_thw)
window_index, cu_window_seqlens = self.get_window_index(grid_thw)
cu_window_seqlens = torch.tensor(
cu_window_seqlens, device=hidden_states.device, dtype=torch.int32
)
cu_window_seqlens = torch.unique_consecutive(cu_window_seqlens)
seq_len, _ = hidden_states.size()
hidden_states = hidden_states.reshape(seq_len // self.spatial_merge_unit, self.spatial_merge_unit, -1)
hidden_states = hidden_states[window_index, :, :]
hidden_states = hidden_states.reshape(seq_len, -1)
rotary_pos_emb = rotary_pos_emb.reshape(seq_len // self.spatial_merge_unit, self.spatial_merge_unit, -1)
rotary_pos_emb = rotary_pos_emb[window_index, :, :]
rotary_pos_emb = rotary_pos_emb.reshape(seq_len, -1)
emb = torch.cat((rotary_pos_emb, rotary_pos_emb), dim=-1)
original_cos, original_sin = emb.cos(), emb.sin()
cu_seqlens = torch.repeat_interleave(grid_thw[:, 1] * grid_thw[:, 2], grid_thw[:, 0]).cumsum(dim=0, dtype=torch.int32)
cu_seqlens = F.pad(cu_seqlens, (1, 0), value=0)
head_dim = original_cos.shape[-1]
prompt_cos = torch.ones(self.num_prompts, head_dim, device=hidden_states.device, dtype=hidden_states.dtype)
prompt_sin = torch.zeros(self.num_prompts, head_dim, device=hidden_states.device, dtype=hidden_states.dtype)
for layer_num, blk in enumerate(self.blocks):
if layer_num in self.fullatt_block_indexes:
match layer_num:
case 7:
prompts_to_add = self.soft_prompts["layer_7"].squeeze(0).to(hidden_states.device, hidden_states.dtype)
case 15:
prompts_to_add = self.soft_prompts["layer_15"].squeeze(0).to(hidden_states.device, hidden_states.dtype)
case 23:
prompts_to_add = self.soft_prompts["layer_23"].squeeze(0).to(hidden_states.device, hidden_states.dtype)
case 31:
prompts_to_add = self.soft_prompts["layer_31"].squeeze(0).to(hidden_states.device, hidden_states.dtype)
current_hidden_states = torch.cat([prompts_to_add, hidden_states], dim=0)
final_cos = torch.cat([prompt_cos, original_cos], dim=0)
final_sin = torch.cat([prompt_sin, original_sin], dim=0)
current_pos_embeds = (final_cos, final_sin)
cu_seqlens_now = cu_seqlens.clone()
if cu_seqlens_now.numel() > 1:
cu_seqlens_now[1:] += self.num_prompts
output_hidden_states = blk(
current_hidden_states,
cu_seqlens=cu_seqlens_now,
position_embeddings=current_pos_embeds,
len_soft_prompts = self.num_prompts,
**kwargs,
)
hidden_states = output_hidden_states[self.num_prompts:, :]
else:
hidden_states = blk(
hidden_states,
cu_seqlens=cu_window_seqlens,
position_embeddings=(original_cos, original_sin),
len_soft_prompts = 0,
**kwargs,
)
hidden_states = self.merger(hidden_states)
reverse_indices = torch.argsort(window_index)
hidden_states = hidden_states[reverse_indices, :]
return hidden_states
class custom_Qwen2_5_VLModel(Qwen2_5_VLModel):
def __init__(self, config):
super().__init__(config)
self.config = config
self.config.vision_config.recall_path = getattr(self.config, "_name_or_path", None)
self.visual = custom_Qwen2_5_VisionTransformerPretrainedModel._from_config(config = config.vision_config)
self.language_model = Qwen2_5_VLTextModel._from_config(config.text_config)
class custom_Qwen2_5_VLForConditionalGeneration(Qwen2_5_VLForConditionalGeneration):
def __init__(self, config):
super().__init__(config)
self.model = custom_Qwen2_5_VLModel(config)
model = custom_Qwen2_5_VLForConditionalGeneration.from_pretrained(
"../2.5/",
torch_dtype = torch.float32,
low_cpu_mem_usage = True,
trust_remote_code = True,
device_map = "auto"
)
for name, param in model.named_parameters():
if 'soft_prompt' in name:
print(torch.isnan(param).any())
# tensor(False, device='mps:0')
# tensor(True, device='mps:0')
# tensor(True, device='mps:0')
# tensor(True, device='mps:0')
path = model.config.vision_config.recall_path + model.config.vision_config.soft_prompt[0][2:]
torch.load(path, map_location="cpu", weights_only=True)
# Parameter containing:
# tensor([[ 0.0072, -0.0287, -0.0344, ..., 0.0534, 0.0571, -0.0062],
# [-0.0627, 0.0050, -0.0253, ..., -0.0560, 0.0535, 0.0007],
# [ 0.0635, -0.0559, 0.0682, ..., -0.0279, -0.0299, -0.0121],
# ...,
# [ 0.0361, 0.0380, 0.0426, ..., -0.0013, -0.0570, -0.0163],
# [-0.0547, -0.0663, -0.0223, ..., -0.0156, 0.0142, 0.0348],
# [ 0.0061, -0.0403, -0.0181, ..., 0.0198, -0.0238, 0.0105]],
# requires_grad=True)
for name, param in model.named_parameters():
if 'soft_prompt_visual_layer_7' in name:
print(param)
# Parameter containing:
# tensor([[0., 0., 0., ..., 0., 0., 0.],
# [0., 0., 0., ..., 0., 0., 0.],
# [0., 0., 0., ..., 0., 0., 0.],
# ...,
# [0., 0., 0., ..., 0., 0., 0.],
# [0., 0., 0., ..., 0., 0., 0.],
# [0., 0., 0., ..., 0., 0., 0.]], device='mps:0', requires_grad=True)
```
I found that the registered parameters end up as newly created all-zero tensors, or sometimes NaN, after `from_pretrained`, which leads to errors downstream.
I really need this working for a final-year project, please help.
Thank you all
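A plausible cause (hedged, torch-free sketch, not the transformers implementation): when loading a checkpoint, any parameter registered in `__init__` that has no matching key in the checkpoint's state dict gets re-initialized (often to zeros, or left uninitialized with `low_cpu_mem_usage=True`), silently discarding the values you loaded inside `__init__`. A toy state-dict loader shows the effect:

```python
# Toy illustration of why a parameter filled inside __init__ can come
# out all-zero after checkpoint loading: missing keys are re-initialized.

class ToyModel:
    def __init__(self):
        # Parameter populated from a file at construction time...
        self.params = {"soft_prompt": [0.1, -0.2], "weight": [9.9]}

    def load_state_dict(self, checkpoint):
        for name in self.params:
            if name in checkpoint:
                self.params[name] = checkpoint[name]
            else:
                # ...but re-initialized when the checkpoint lacks the key.
                self.params[name] = [0.0] * len(self.params[name])

model = ToyModel()
model.load_state_dict({"weight": [1.0]})  # checkpoint has no "soft_prompt"
print(model.params["soft_prompt"])  # [0.0, 0.0] — the loaded values are gone
```

If this is what is happening, loading the soft-prompt tensors *after* `from_pretrained` returns (e.g. via `load_state_dict(..., strict=False)` or by assigning to the parameters' `.data`) should sidestep the re-initialization.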
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40891/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40891/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40890
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40890/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40890/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40890/events
|
https://github.com/huggingface/transformers/pull/40890
| 3,418,154,358
|
PR_kwDOCUB6oc6opdry
| 40,890
|
Adding activation kernels
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T14:11:50
| 2025-09-17T13:48:17
| 2025-09-17T09:36:09
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40890",
"html_url": "https://github.com/huggingface/transformers/pull/40890",
"diff_url": "https://github.com/huggingface/transformers/pull/40890.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40890.patch",
"merged_at": "2025-09-17T09:36:09"
}
|
# What does this PR do?
Adds GeLU activation kernels from https://huggingface.co/kernels-community/activation. To use them, simply pass `use_kernels=True`.
Here are some benchmarks comparing the activation kernels' performance with a `torch.compile` implementation:
<img width="959" height="378" alt="Screenshot 2025-09-16 at 10 30 52" src="https://github.com/user-attachments/assets/2e0d4aac-4972-4a3b-88f9-33278d45ec3c" />
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40890/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40890/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40889
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40889/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40889/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40889/events
|
https://github.com/huggingface/transformers/pull/40889
| 3,418,135,609
|
PR_kwDOCUB6oc6opZn4
| 40,889
|
Adapt and test huggingface_hub v1.0.0
|
{
"login": "Wauplin",
"id": 11801849,
"node_id": "MDQ6VXNlcjExODAxODQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/11801849?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Wauplin",
"html_url": "https://github.com/Wauplin",
"followers_url": "https://api.github.com/users/Wauplin/followers",
"following_url": "https://api.github.com/users/Wauplin/following{/other_user}",
"gists_url": "https://api.github.com/users/Wauplin/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Wauplin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Wauplin/subscriptions",
"organizations_url": "https://api.github.com/users/Wauplin/orgs",
"repos_url": "https://api.github.com/users/Wauplin/repos",
"events_url": "https://api.github.com/users/Wauplin/events{/privacy}",
"received_events_url": "https://api.github.com/users/Wauplin/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T14:07:11
| 2025-09-25T11:13:51
| 2025-09-25T11:13:50
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40889",
"html_url": "https://github.com/huggingface/transformers/pull/40889",
"diff_url": "https://github.com/huggingface/transformers/pull/40889.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40889.patch",
"merged_at": "2025-09-25T11:13:50"
}
|
Test as part of https://github.com/huggingface/huggingface_hub/issues/3340
**The main changes in `huggingface_hub` impacting transformers are:**
- `Repository` removed
- `HfFolder` removed
- migrated to `httpx` instead of `requests`
---
**List of changes made in this PR:**
- removed all imports that were commented with `# for backward compatibility`
- replaced `requests` exceptions by `httpx` ones in testing utils
- `Trainer` returns a `CommitInfo` (which is a subclass of `str`) => was already the case before but not type annotated
- `allow_redirects` => renamed `follow_redirects` in httpx
- `proxies` => argument is now ignored. Proxies must be set globally via an env variable, not in the script. If a user passes proxies, a warning is emitted and the process continues (the proxy is ignored)
- all `requests.HTTPError` are replaced by some `huggingface_hub.HfHubHTTPError` (which inherits from `httpx.HTTPError`)
- removed `test_dynamic_saving_from_local_repo` as it was commented as "# to be removed when huggingface_hub v1 comes out"
- removed all `HfFolder` occurrences from the tests => they were not relevant anyway
- migrated most "Repository clone from" usages to `snapshot_download` and "Repository push_to_hub" usages to `upload_folder`
- removed `ModelFilter` from `utils/update_tiny_models.py` (to be fair, it has been removed for quite some time, so this script is currently broken on `main`)
- switched a lot of `requests.get` into `httpx.get` in the code. However:
- it'd be very beneficial to define a single client for them to benefit from connection pooling => that's out of scope for this PR
- a LOT of `requests` calls are happening in `./models/` and `test_models/`, so I won't change them => it'd be good in `transformers` v5 to get rid of the `requests` dependency entirely => also out of scope for this PR
---
**TODO: (later?)**
- [ ] adapt `utils/create_dummy_models.py` => currently using `Repository`
- [x] ~remove back manual install of huggingface_hub 1.0.0.rc0 once `tokenizers` is out~
- [x] ~what to do with KerasCallback? => got deleted~
- [ ] in v5, all `use_auth_token` logic must be removed (currently throws warnings)
|
{
"login": "Wauplin",
"id": 11801849,
"node_id": "MDQ6VXNlcjExODAxODQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/11801849?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Wauplin",
"html_url": "https://github.com/Wauplin",
"followers_url": "https://api.github.com/users/Wauplin/followers",
"following_url": "https://api.github.com/users/Wauplin/following{/other_user}",
"gists_url": "https://api.github.com/users/Wauplin/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Wauplin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Wauplin/subscriptions",
"organizations_url": "https://api.github.com/users/Wauplin/orgs",
"repos_url": "https://api.github.com/users/Wauplin/repos",
"events_url": "https://api.github.com/users/Wauplin/events{/privacy}",
"received_events_url": "https://api.github.com/users/Wauplin/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40889/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40889/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40888
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40888/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40888/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40888/events
|
https://github.com/huggingface/transformers/pull/40888
| 3,418,022,604
|
PR_kwDOCUB6oc6opArC
| 40,888
|
DOC Fix help for chat and serve commands
|
{
"login": "BenjaminBossan",
"id": 6229650,
"node_id": "MDQ6VXNlcjYyMjk2NTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/6229650?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BenjaminBossan",
"html_url": "https://github.com/BenjaminBossan",
"followers_url": "https://api.github.com/users/BenjaminBossan/followers",
"following_url": "https://api.github.com/users/BenjaminBossan/following{/other_user}",
"gists_url": "https://api.github.com/users/BenjaminBossan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BenjaminBossan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BenjaminBossan/subscriptions",
"organizations_url": "https://api.github.com/users/BenjaminBossan/orgs",
"repos_url": "https://api.github.com/users/BenjaminBossan/repos",
"events_url": "https://api.github.com/users/BenjaminBossan/events{/privacy}",
"received_events_url": "https://api.github.com/users/BenjaminBossan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-09-15T13:39:15
| 2025-10-01T10:54:18
| null |
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40888",
"html_url": "https://github.com/huggingface/transformers/pull/40888",
"diff_url": "https://github.com/huggingface/transformers/pull/40888.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40888.patch",
"merged_at": null
}
|
# What does this PR do?
For `transformers chat` and `transformers serve`, the `load_in_8bit` and `load_in_4bit` arguments wrongly state that they require LoRA. This is only true for training; for inference, LoRA is not required.
Note: The failing test seems to be unrelated.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40888/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40888/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/40887
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40887/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40887/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40887/events
|
https://github.com/huggingface/transformers/pull/40887
| 3,417,784,938
|
PR_kwDOCUB6oc6ooMbo
| 40,887
|
Refactor output handling in generate for cleaner decoding methods
|
{
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-09-15T12:36:13
| 2025-10-30T06:04:51
| null |
CONTRIBUTOR
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40887",
"html_url": "https://github.com/huggingface/transformers/pull/40887",
"diff_url": "https://github.com/huggingface/transformers/pull/40887.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40887.patch",
"merged_at": null
}
|
Each decoding method has a common block of output handling boilerplate that worsens readability:
```python
output_attentions = generation_config.output_attentions
output_hidden_states = generation_config.output_hidden_states
output_scores = generation_config.output_scores
output_logits = generation_config.output_logits
return_dict_in_generate = generation_config.return_dict_in_generate
# init attention / hidden states / scores tuples
scores = () if (return_dict_in_generate and output_scores) else None
raw_logits = () if (return_dict_in_generate and output_logits) else None
decoder_attentions = () if (return_dict_in_generate and output_attentions) else None
cross_attentions = () if (return_dict_in_generate and output_attentions) else None
decoder_hidden_states = () if (return_dict_in_generate and output_hidden_states) else None
# if model is an encoder-decoder, retrieve encoder attention weights and hidden states
if return_dict_in_generate and self.config.is_encoder_decoder:
encoder_attentions = model_kwargs["encoder_outputs"].get("attentions") if output_attentions else None
encoder_hidden_states = (
model_kwargs["encoder_outputs"].get("hidden_states") if output_hidden_states else None
)
...
while not finished:
# Store scores, attentions and hidden_states when required
if return_dict_in_generate:
if output_scores:
scores += (next_token_scores,)
if output_logits:
raw_logits += (next_token_logits,)
if output_attentions:
decoder_attentions += (
(outputs.decoder_attentions,) if self.config.is_encoder_decoder else (outputs.attentions,)
)
if self.config.is_encoder_decoder:
cross_attentions += (outputs.cross_attentions,)
if output_hidden_states:
decoder_hidden_states += (
(outputs.decoder_hidden_states,)
if self.config.is_encoder_decoder
else (outputs.hidden_states,)
)
...
if return_dict_in_generate:
if self.config.is_encoder_decoder:
return XXXEncoderDecoderOutput(
sequences=input_ids,
scores=scores,
logits=raw_logits,
encoder_attentions=encoder_attentions,
encoder_hidden_states=encoder_hidden_states,
decoder_attentions=decoder_attentions,
cross_attentions=cross_attentions,
decoder_hidden_states=decoder_hidden_states,
past_key_values=model_kwargs.get("past_key_values"),
)
else:
return XXXDecoderOnlyOutput(
sequences=input_ids,
scores=scores,
logits=raw_logits,
attentions=decoder_attentions,
hidden_states=decoder_hidden_states,
past_key_values=model_kwargs.get("past_key_values"),
)
else:
return input_ids
```
This PR moves that boilerplate into reusable generate helpers.
TODO: generalize so that a user can request `output_x` and have `x` from the forward pass forwarded.
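The helper idea can be sketched in plain Python: a small recorder object owns the config flags and the accumulated tuples, so each decoding loop only calls `record(...)` per step instead of repeating the `if return_dict_in_generate: ...` ladder. This is an illustrative sketch only — the class and method names here are hypothetical, not the actual helpers introduced by this PR.

```python
from dataclasses import dataclass


@dataclass
class OutputRecorder:
    """Illustrative sketch: accumulates per-step generate outputs based on config flags."""

    output_scores: bool = False
    output_logits: bool = False
    return_dict_in_generate: bool = False
    scores: tuple = ()
    raw_logits: tuple = ()

    def record(self, next_token_scores=None, next_token_logits=None):
        # No-op unless the user asked for dict-style outputs, mirroring the
        # `if return_dict_in_generate:` guard in each decoding method.
        if not self.return_dict_in_generate:
            return
        if self.output_scores and next_token_scores is not None:
            self.scores += (next_token_scores,)
        if self.output_logits and next_token_logits is not None:
            self.raw_logits += (next_token_logits,)


# Usage: the decoding loop shrinks to one call per step.
recorder = OutputRecorder(output_scores=True, return_dict_in_generate=True)
for step_scores in ([0.1], [0.2]):
    recorder.record(next_token_scores=step_scores)
print(len(recorder.scores))
```

A final `to_output(...)` method on such a recorder could similarly replace the encoder-decoder vs. decoder-only return branching.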
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40887/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40887/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/40886
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40886/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40886/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40886/events
|
https://github.com/huggingface/transformers/issues/40886
| 3,417,714,623
|
I_kwDOCUB6oc7Lti-_
| 40,886
|
not able to import Gemma3TextForSequenceClassification on transformers == '4.56.1'
|
{
"login": "rishavranaut",
"id": 141845222,
"node_id": "U_kgDOCHRi5g",
"avatar_url": "https://avatars.githubusercontent.com/u/141845222?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rishavranaut",
"html_url": "https://github.com/rishavranaut",
"followers_url": "https://api.github.com/users/rishavranaut/followers",
"following_url": "https://api.github.com/users/rishavranaut/following{/other_user}",
"gists_url": "https://api.github.com/users/rishavranaut/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rishavranaut/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rishavranaut/subscriptions",
"organizations_url": "https://api.github.com/users/rishavranaut/orgs",
"repos_url": "https://api.github.com/users/rishavranaut/repos",
"events_url": "https://api.github.com/users/rishavranaut/events{/privacy}",
"received_events_url": "https://api.github.com/users/rishavranaut/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T12:18:14
| 2025-09-16T08:40:30
| 2025-09-16T08:40:30
|
NONE
| null | null | null | null |
### System Info
I can see `Gemma3TextForSequenceClassification` implemented in modeling_gemma3.py, but I get this error when importing it with `from transformers import Gemma3TextForSequenceClassification`:
ImportError: cannot import name 'Gemma3TextForSequenceClassification' from 'transformers' __init__.py
### Who can help?
@ArthurZucker
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
help with error above
### Expected behavior
help with the error
|
{
"login": "rishavranaut",
"id": 141845222,
"node_id": "U_kgDOCHRi5g",
"avatar_url": "https://avatars.githubusercontent.com/u/141845222?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rishavranaut",
"html_url": "https://github.com/rishavranaut",
"followers_url": "https://api.github.com/users/rishavranaut/followers",
"following_url": "https://api.github.com/users/rishavranaut/following{/other_user}",
"gists_url": "https://api.github.com/users/rishavranaut/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rishavranaut/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rishavranaut/subscriptions",
"organizations_url": "https://api.github.com/users/rishavranaut/orgs",
"repos_url": "https://api.github.com/users/rishavranaut/repos",
"events_url": "https://api.github.com/users/rishavranaut/events{/privacy}",
"received_events_url": "https://api.github.com/users/rishavranaut/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40886/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40886/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40885
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40885/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40885/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40885/events
|
https://github.com/huggingface/transformers/pull/40885
| 3,417,591,821
|
PR_kwDOCUB6oc6onhzK
| 40,885
|
[Docs] Adding documentation of MXFP4 Quantization
|
{
"login": "ariG23498",
"id": 36856589,
"node_id": "MDQ6VXNlcjM2ODU2NTg5",
"avatar_url": "https://avatars.githubusercontent.com/u/36856589?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ariG23498",
"html_url": "https://github.com/ariG23498",
"followers_url": "https://api.github.com/users/ariG23498/followers",
"following_url": "https://api.github.com/users/ariG23498/following{/other_user}",
"gists_url": "https://api.github.com/users/ariG23498/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ariG23498/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ariG23498/subscriptions",
"organizations_url": "https://api.github.com/users/ariG23498/orgs",
"repos_url": "https://api.github.com/users/ariG23498/repos",
"events_url": "https://api.github.com/users/ariG23498/events{/privacy}",
"received_events_url": "https://api.github.com/users/ariG23498/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T11:44:40
| 2025-09-16T18:31:28
| 2025-09-16T18:31:28
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40885",
"html_url": "https://github.com/huggingface/transformers/pull/40885",
"diff_url": "https://github.com/huggingface/transformers/pull/40885.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40885.patch",
"merged_at": "2025-09-16T18:31:28"
}
|
The documentation is taken from hf.co/blog/faster-transformers.
@stevhliu it would be great if you could give me an initial review. I would love to make it more aligned with what we usually do for documentation like this.
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40885/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40885/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40884
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40884/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40884/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40884/events
|
https://github.com/huggingface/transformers/pull/40884
| 3,417,535,615
|
PR_kwDOCUB6oc6onVac
| 40,884
|
Any to any pipeline and auto-mapping
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-09-15T11:27:54
| 2025-10-16T17:38:12
| null |
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40884",
"html_url": "https://github.com/huggingface/transformers/pull/40884",
"diff_url": "https://github.com/huggingface/transformers/pull/40884.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40884.patch",
"merged_at": null
}
|
# What does this PR do?
Adds any-to-any as a pipeline and in auto classes so that we can have a single mapping for all multimodal models. The model mapping is almost the same as image-text-to-text, with the inclusion of audio-LLMs and omni-LLMs. I hope I added all audio models, but let me know if anything is missing from recent ones.
Fixes https://github.com/huggingface/transformers/issues/40302 and fixes https://github.com/huggingface/transformers/issues/37794
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40884/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 2,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40884/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/40883
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40883/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40883/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40883/events
|
https://github.com/huggingface/transformers/pull/40883
| 3,417,435,598
|
PR_kwDOCUB6oc6om_LM
| 40,883
|
Fix modular consistency
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T10:58:39
| 2025-09-15T11:11:11
| 2025-09-15T11:07:08
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40883",
"html_url": "https://github.com/huggingface/transformers/pull/40883",
"diff_url": "https://github.com/huggingface/transformers/pull/40883.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40883.patch",
"merged_at": "2025-09-15T11:07:08"
}
|
# What does this PR do?
Reapplies modular based on the latest changes in main (race condition when merging the qwen3-vl PR)
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40883/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40883/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40882
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40882/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40882/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40882/events
|
https://github.com/huggingface/transformers/pull/40882
| 3,417,408,162
|
PR_kwDOCUB6oc6om5J9
| 40,882
|
Remove dict branch of attention_mask in sdpa_attention_paged_forward
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T10:51:57
| 2025-09-16T16:24:30
| 2025-09-15T15:38:13
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40882",
"html_url": "https://github.com/huggingface/transformers/pull/40882",
"diff_url": "https://github.com/huggingface/transformers/pull/40882.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40882.patch",
"merged_at": "2025-09-15T15:38:13"
}
|
# What does this PR do?
attention_mask should be an optional tensor.
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40882/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40882/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40881
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40881/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40881/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40881/events
|
https://github.com/huggingface/transformers/pull/40881
| 3,417,234,200
|
PR_kwDOCUB6oc6omTN1
| 40,881
|
Update model tags and integration references in bug report
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T10:04:08
| 2025-09-15T10:13:21
| 2025-09-15T10:08:29
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40881",
"html_url": "https://github.com/huggingface/transformers/pull/40881",
"diff_url": "https://github.com/huggingface/transformers/pull/40881.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40881.patch",
"merged_at": "2025-09-15T10:08:29"
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40881/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40881/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40880
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40880/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40880/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40880/events
|
https://github.com/huggingface/transformers/pull/40880
| 3,417,172,630
|
PR_kwDOCUB6oc6omF1e
| 40,880
|
Remove `runner_map`
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T09:49:28
| 2025-09-16T13:18:10
| 2025-09-16T13:18:07
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40880",
"html_url": "https://github.com/huggingface/transformers/pull/40880",
"diff_url": "https://github.com/huggingface/transformers/pull/40880.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40880.patch",
"merged_at": "2025-09-16T13:18:07"
}
|
# What does this PR do?
`runner_map` was added when we switched from T4 to A10. We tried to migrate progressively but ended up doing it in one go (after a few days) because the results were confusing, in particular during the debug-and-fix phase.
That change also caused the `trainer`/`fsdp` job not to run, due to a mistake where
> echo "runner_map=$(python3 ../utils/get_runner_map.py)" >> $GITHUB_OUTPUT
was not added to the `elif` branch below:
```yaml
run: |
if [ "${{ inputs.job }}" = "run_models_gpu" ]; then
echo "folder_slices=$(python3 ../utils/split_model_tests.py --models '${{ inputs.models }}' --num_splits ${{ env.NUM_SLICES }})" >> $GITHUB_OUTPUT
echo "slice_ids=$(python3 -c 'd = list(range(${{ env.NUM_SLICES }})); print(d)')" >> $GITHUB_OUTPUT
echo "runner_map=$(python3 ../utils/get_runner_map.py)" >> $GITHUB_OUTPUT
elif [ "${{ inputs.job }}" = "run_trainer_and_fsdp_gpu" ]; then
echo "folder_slices=[['trainer'], ['fsdp']]" >> $GITHUB_OUTPUT
echo "slice_ids=[0, 1]" >> $GITHUB_OUTPUT
fi
```
Since we don't need this anymore, let's just remove it.
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40880/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40880/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40879
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40879/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40879/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40879/events
|
https://github.com/huggingface/transformers/pull/40879
| 3,417,167,384
|
PR_kwDOCUB6oc6omEq9
| 40,879
|
[TimesFM] add TimesFM 2.5
|
{
"login": "kashif",
"id": 8100,
"node_id": "MDQ6VXNlcjgxMDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8100?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kashif",
"html_url": "https://github.com/kashif",
"followers_url": "https://api.github.com/users/kashif/followers",
"following_url": "https://api.github.com/users/kashif/following{/other_user}",
"gists_url": "https://api.github.com/users/kashif/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kashif/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kashif/subscriptions",
"organizations_url": "https://api.github.com/users/kashif/orgs",
"repos_url": "https://api.github.com/users/kashif/repos",
"events_url": "https://api.github.com/users/kashif/events{/privacy}",
"received_events_url": "https://api.github.com/users/kashif/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T09:48:14
| 2025-09-24T10:24:25
| 2025-09-24T10:24:25
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40879",
"html_url": "https://github.com/huggingface/transformers/pull/40879",
"diff_url": "https://github.com/huggingface/transformers/pull/40879.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40879.patch",
"merged_at": null
}
|
# What does this PR do?
Add TimesFM 2.5 model
|
{
"login": "kashif",
"id": 8100,
"node_id": "MDQ6VXNlcjgxMDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8100?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kashif",
"html_url": "https://github.com/kashif",
"followers_url": "https://api.github.com/users/kashif/followers",
"following_url": "https://api.github.com/users/kashif/following{/other_user}",
"gists_url": "https://api.github.com/users/kashif/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kashif/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kashif/subscriptions",
"organizations_url": "https://api.github.com/users/kashif/orgs",
"repos_url": "https://api.github.com/users/kashif/repos",
"events_url": "https://api.github.com/users/kashif/events{/privacy}",
"received_events_url": "https://api.github.com/users/kashif/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40879/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40879/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40878
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40878/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40878/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40878/events
|
https://github.com/huggingface/transformers/pull/40878
| 3,417,102,746
|
PR_kwDOCUB6oc6ol2no
| 40,878
|
Fix deta loading & dataclass
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-15T09:31:47
| 2025-09-15T15:23:14
| 2025-09-15T15:23:13
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40878",
"html_url": "https://github.com/huggingface/transformers/pull/40878",
"diff_url": "https://github.com/huggingface/transformers/pull/40878.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40878.patch",
"merged_at": "2025-09-15T15:23:13"
}
|
# What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/40853. It is always a bad idea to reassign the module's data; it should simply be manipulated in place. We should not even have initialization schemes in `__init__`, but as the model is marked as deprecated, I only fixed it quickly instead of refactoring.
Also fixes the dataclass: without an annotation, the field is not added to the `__init__`, only as a class attribute, which later crashes...
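The dataclass pitfall can be reproduced in plain Python, independent of transformers:

```python
from dataclasses import dataclass, fields


@dataclass
class WithAnnotation:
    loss: float = 0.0  # annotated: picked up as a real dataclass field


@dataclass
class WithoutAnnotation:
    loss = 0.0  # no annotation: stays a plain class attribute

annotated_fields = [f.name for f in fields(WithAnnotation)]
plain_fields = [f.name for f in fields(WithoutAnnotation)]
```

`WithoutAnnotation(loss=1.0)` raises a `TypeError`, because the generated `__init__` never received the field.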
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40878/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40878/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40877
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40877/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40877/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40877/events
|
https://github.com/huggingface/transformers/pull/40877
| 3,415,803,263
|
PR_kwDOCUB6oc6ohfLm
| 40,877
|
Bug #40833: Fix for kv_offset calculation for mixed padding
|
{
"login": "preethamyerramsetty",
"id": 135053952,
"node_id": "U_kgDOCAzCgA",
"avatar_url": "https://avatars.githubusercontent.com/u/135053952?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/preethamyerramsetty",
"html_url": "https://github.com/preethamyerramsetty",
"followers_url": "https://api.github.com/users/preethamyerramsetty/followers",
"following_url": "https://api.github.com/users/preethamyerramsetty/following{/other_user}",
"gists_url": "https://api.github.com/users/preethamyerramsetty/gists{/gist_id}",
"starred_url": "https://api.github.com/users/preethamyerramsetty/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/preethamyerramsetty/subscriptions",
"organizations_url": "https://api.github.com/users/preethamyerramsetty/orgs",
"repos_url": "https://api.github.com/users/preethamyerramsetty/repos",
"events_url": "https://api.github.com/users/preethamyerramsetty/events{/privacy}",
"received_events_url": "https://api.github.com/users/preethamyerramsetty/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-09-15T00:18:46
| 2025-09-15T09:26:30
| null |
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40877",
"html_url": "https://github.com/huggingface/transformers/pull/40877",
"diff_url": "https://github.com/huggingface/transformers/pull/40877.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40877.patch",
"merged_at": null
}
|
# What does this PR do?
This fixes the `kv_offset` calculation in `cache_utils.py` so that left and mixed padding are handled correctly. Previously, in the case of mixed left and right padding, the model could attend to padded tokens, which resulted in incorrect responses.
This PR ensures that the correct offset is used for left padding, whereas right and mixed padding fall back to offset 0; a unit test in `tests/test_cache_utils.py` is also added to cover all padding scenarios.
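The intended behaviour can be sketched in pure Python (a hypothetical helper, not the actual `cache_utils.py` code): only a purely left-padded mask yields a non-zero offset.

```python
def kv_offset(mask: list[int]) -> int:
    """Return the index of the first attended token when the sequence is
    purely left-padded; fall back to 0 for right or mixed padding."""
    try:
        first = mask.index(1)
    except ValueError:
        return 0  # all-padding sequence: nothing to attend to
    # Purely left-padded: every position from `first` onward is a real token.
    if all(m == 1 for m in mask[first:]):
        return first
    return 0  # mixed or right padding: offset 0 is the safe choice
```

With offset 0 the cache keeps the padded positions addressable, so the attention mask (rather than the offset) must exclude them.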
Fixes #40833
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case - Fixes #40833
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests? - (added unit test in tests/test_cache_utils.py)
## Who can review?
Anyone in the community is free to review.
Suggested reviewer: @giulio98
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40877/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40877/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/40876
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40876/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40876/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40876/events
|
https://github.com/huggingface/transformers/pull/40876
| 3,415,123,305
|
PR_kwDOCUB6oc6ofTFH
| 40,876
|
Update shieldgemma2 model card
|
{
"login": "BryanBradfo",
"id": 101939095,
"node_id": "U_kgDOBhN3lw",
"avatar_url": "https://avatars.githubusercontent.com/u/101939095?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BryanBradfo",
"html_url": "https://github.com/BryanBradfo",
"followers_url": "https://api.github.com/users/BryanBradfo/followers",
"following_url": "https://api.github.com/users/BryanBradfo/following{/other_user}",
"gists_url": "https://api.github.com/users/BryanBradfo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BryanBradfo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BryanBradfo/subscriptions",
"organizations_url": "https://api.github.com/users/BryanBradfo/orgs",
"repos_url": "https://api.github.com/users/BryanBradfo/repos",
"events_url": "https://api.github.com/users/BryanBradfo/events{/privacy}",
"received_events_url": "https://api.github.com/users/BryanBradfo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-14T14:36:17
| 2025-09-18T17:19:26
| 2025-09-18T17:19:26
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40876",
"html_url": "https://github.com/huggingface/transformers/pull/40876",
"diff_url": "https://github.com/huggingface/transformers/pull/40876.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40876.patch",
"merged_at": null
}
|
# What does this PR do?
This pull request updates the shieldgemma2.md model card to align with the new standardized format, as requested in issue https://github.com/huggingface/transformers/issues/36979.
The main changes include:
- Restructuring the document to follow the new standard layout.
- Adding a comprehensive code example for AutoModel usage.
- Introducing a detailed Quantization section with a fully runnable 4-bit bitsandbytes example.
- Improving the model description and organizing model-specific notes for better clarity.
I have personally tested and validated the Quantization examples on a GPU environment. They are confirmed to be working correctly. However, due to hardware limitations, I was unable to add and test the AutoModel example. The priority was to provide a robust and memory-efficient quantization example that a wider range of users can run.
This PR is a starting point, and I'm eager to learn from your feedback to complete it.
Fixes https://github.com/huggingface/transformers/issues/36979
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Based on the contribution guide for documentation: cc @stevhliu
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40876/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40876/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40875
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40875/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40875/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40875/events
|
https://github.com/huggingface/transformers/issues/40875
| 3,415,067,401
|
I_kwDOCUB6oc7LjcsJ
| 40,875
|
ColPaliForRetrieval errors out when loaded in half precision dtypes
|
{
"login": "merveenoyan",
"id": 53175384,
"node_id": "MDQ6VXNlcjUzMTc1Mzg0",
"avatar_url": "https://avatars.githubusercontent.com/u/53175384?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/merveenoyan",
"html_url": "https://github.com/merveenoyan",
"followers_url": "https://api.github.com/users/merveenoyan/followers",
"following_url": "https://api.github.com/users/merveenoyan/following{/other_user}",
"gists_url": "https://api.github.com/users/merveenoyan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/merveenoyan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/merveenoyan/subscriptions",
"organizations_url": "https://api.github.com/users/merveenoyan/orgs",
"repos_url": "https://api.github.com/users/merveenoyan/repos",
"events_url": "https://api.github.com/users/merveenoyan/events{/privacy}",
"received_events_url": "https://api.github.com/users/merveenoyan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-14T13:52:53
| 2025-09-16T16:07:57
| 2025-09-16T16:07:57
|
CONTRIBUTOR
| null | null | null | null |
### System Info
transformers version: transformers==4.56.1
Here's the error; it can be fixed by setting `dtype` to `float32`. `float16` and `bfloat16` won't work.
```
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
[/tmp/ipython-input-2540780684.py](https://localhost:8080/#) in <cell line: 0>()
2 image_inputs = processor(images=images)
3 image_inputs = image_inputs.to(model.device, model.dtype)
----> 4 image_outputs = model(**image_inputs)
5 image_embeddings_torch = image_outputs.embeddings
23 frames
[/usr/local/lib/python3.12/dist-packages/transformers/integrations/sdpa_attention.py](https://localhost:8080/#) in sdpa_attention_forward(module, query, key, value, attention_mask, dropout, scaling, is_causal, **kwargs)
81 is_causal = is_causal.item()
82
---> 83 attn_output = torch.nn.functional.scaled_dot_product_attention(
84 query,
85 key,
RuntimeError: Expected attn_mask dtype to be bool or float or to match query dtype, but got attn_mask.dtype: c10::Half and query.dtype: float instead.
```
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```python
from transformers import ColPaliForRetrieval, ColPaliProcessor, infer_device
import torch

device = infer_device()
model = ColPaliForRetrieval.from_pretrained(
    "vidore/colpali-v1.3-hf",
    dtype=torch.float16,  # can also be bfloat16
).to(device)
processor = ColPaliProcessor.from_pretrained("vidore/colpali-v1.3-hf")

with torch.no_grad():
    image_inputs = processor(images=images)  # `images`: PIL images defined elsewhere in the script
    image_inputs = image_inputs.to(model.device, model.dtype)
    image_outputs = model(**image_inputs)
    image_embeddings_torch = image_outputs.embeddings
```
### Expected behavior
I'd expect float16 and bfloat16 to work.
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40875/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40875/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40874
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40874/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40874/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40874/events
|
https://github.com/huggingface/transformers/issues/40874
| 3,414,901,848
|
I_kwDOCUB6oc7Li0RY
| 40,874
|
Missing num_hidden_layers in T5GemmaConfig
|
{
"login": "kuihao",
"id": 56499195,
"node_id": "MDQ6VXNlcjU2NDk5MTk1",
"avatar_url": "https://avatars.githubusercontent.com/u/56499195?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kuihao",
"html_url": "https://github.com/kuihao",
"followers_url": "https://api.github.com/users/kuihao/followers",
"following_url": "https://api.github.com/users/kuihao/following{/other_user}",
"gists_url": "https://api.github.com/users/kuihao/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kuihao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kuihao/subscriptions",
"organizations_url": "https://api.github.com/users/kuihao/orgs",
"repos_url": "https://api.github.com/users/kuihao/repos",
"events_url": "https://api.github.com/users/kuihao/events{/privacy}",
"received_events_url": "https://api.github.com/users/kuihao/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-14T11:06:21
| 2025-10-01T14:55:56
| 2025-10-01T14:55:56
|
NONE
| null | null | null | null |
### System Info
- `transformers` version: 4.56.0
- Platform: Linux-5.15.0-151-generic-x86_64-with-glibc2.39
- Python version: 3.12.3
- Huggingface_hub version: 0.34.4
- Safetensors version: 0.5.2
- Accelerate version: 1.10.1
- Accelerate config: not found
- DeepSpeed version: 0.17.5
- PyTorch version (accelerator?): 2.6.0a0+ecf3bae40a.nv25.01 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: **No**
- Using GPU in script?: **Yes**
- GPU type: NVIDIA RTX PRO 6000 Blackwell Max-Q Workstation Edition
### Who can help?
Models:
- text models: @ArthurZucker
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Running this code will result in an `AttributeError`.
```python
# pip install accelerate
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
import torch
tokenizer = AutoTokenizer.from_pretrained("google/t5gemma-b-b-ul2")
model = AutoModelForSeq2SeqLM.from_pretrained(
"google/t5gemma-b-b-ul2",
device_map="auto",
)
input_text = "Write me a poem about Machine Learning. Answer:"
input_ids = tokenizer(input_text, return_tensors="pt").to("cuda")
outputs = model.generate(**input_ids, max_new_tokens=32)
print(tokenizer.decode(outputs[0]))
```
```bash
Traceback (most recent call last):
File "/data/t5gemma_example.py", line 15, in <module>
outputs = model.generate(**input_ids, max_new_tokens=32)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/dist-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/dist-packages/transformers/generation/utils.py", line 2399, in generate
self._prepare_cache_for_generation(
File "/usr/local/lib/python3.12/dist-packages/transformers/generation/utils.py", line 2007, in _prepare_cache_for_generation
else EncoderDecoderCache(DynamicCache(**dynamic_cache_kwargs), DynamicCache(**dynamic_cache_kwargs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/dist-packages/transformers/cache_utils.py", line 1018, in __init__
for _ in range(config.num_hidden_layers)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/dist-packages/transformers/configuration_utils.py", line 207, in __getattribute__
return super().__getattribute__(key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'T5GemmaConfig' object has no attribute 'num_hidden_layers'
```
**Proposed Workaround**
The bug can be temporarily resolved by manually adding the `num_hidden_layers` attribute to the model's configuration object before generation. The value can be taken from either the encoder or decoder configuration, as they are the same.
```python
# pip install accelerate
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
import torch
tokenizer = AutoTokenizer.from_pretrained("google/t5gemma-b-b-ul2")
model = AutoModelForSeq2SeqLM.from_pretrained(
"google/t5gemma-b-b-ul2",
device_map="auto",
)
# Workaround to resolve the missing attribute
if not hasattr(model.config, "num_hidden_layers"):
model.config.num_hidden_layers = model.config.encoder.num_hidden_layers
input_text = "Write me a poem about Machine Learning. Answer:"
input_ids = tokenizer(input_text, return_tensors="pt").to("cuda")
outputs = model.generate(**input_ids, max_new_tokens=32)
print(tokenizer.decode(outputs[0]))
```
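The workaround above mirrors a general pattern for composite encoder-decoder configs: delegate attribute lookups that fail on the top-level config to a sub-config. A stdlib-only sketch (hypothetical class names and layer count, not the actual `T5GemmaConfig` implementation):

```python
class EncoderConfig:
    def __init__(self):
        self.num_hidden_layers = 12  # hypothetical value

class CompositeConfig:
    """Hypothetical encoder-decoder config, not the real T5GemmaConfig."""
    def __init__(self):
        self.encoder = EncoderConfig()

    def __getattr__(self, name):
        # called only when normal attribute lookup fails:
        # fall back to the encoder sub-config
        return getattr(self.encoder, name)

cfg = CompositeConfig()
print(cfg.num_hidden_layers)  # 12, resolved via the encoder sub-config
```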
### Expected behavior
The `T5GemmaConfig` object should include the `num_hidden_layers` attribute, which is a common and necessary property for many functionalities within the `transformers` library. The model should be able to generate outputs without needing a manual workaround.
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40874/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40874/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40873
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40873/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40873/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40873/events
|
https://github.com/huggingface/transformers/pull/40873
| 3,414,710,903
|
PR_kwDOCUB6oc6od9-8
| 40,873
|
🌐 [i18n-KO] Translated gemma3n.md to Korean
|
{
"login": "HyunZ118",
"id": 156191095,
"node_id": "U_kgDOCU9Jdw",
"avatar_url": "https://avatars.githubusercontent.com/u/156191095?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HyunZ118",
"html_url": "https://github.com/HyunZ118",
"followers_url": "https://api.github.com/users/HyunZ118/followers",
"following_url": "https://api.github.com/users/HyunZ118/following{/other_user}",
"gists_url": "https://api.github.com/users/HyunZ118/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HyunZ118/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HyunZ118/subscriptions",
"organizations_url": "https://api.github.com/users/HyunZ118/orgs",
"repos_url": "https://api.github.com/users/HyunZ118/repos",
"events_url": "https://api.github.com/users/HyunZ118/events{/privacy}",
"received_events_url": "https://api.github.com/users/HyunZ118/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-14T07:47:41
| 2025-10-17T16:57:05
| 2025-10-17T16:57:05
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40873",
"html_url": "https://github.com/huggingface/transformers/pull/40873",
"diff_url": "https://github.com/huggingface/transformers/pull/40873.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40873.patch",
"merged_at": "2025-10-17T16:57:05"
}
|
# What does this PR do?
Translated the gemma3n.md file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
May you please review this PR?
@jungnerd, @yijun-lee, @Kim-Ju-won, @FacerAin, @judy-choi, @AhnJoonSung, @maximizemaxwell, @nsbg
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40873/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40873/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40872
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40872/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40872/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40872/events
|
https://github.com/huggingface/transformers/pull/40872
| 3,414,530,812
|
PR_kwDOCUB6oc6odY8S
| 40,872
|
Fixed a typo in "transformers/docs/source/en/perf_hardware.md"
|
{
"login": "j-harshana",
"id": 189495155,
"node_id": "U_kgDOC0t3cw",
"avatar_url": "https://avatars.githubusercontent.com/u/189495155?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/j-harshana",
"html_url": "https://github.com/j-harshana",
"followers_url": "https://api.github.com/users/j-harshana/followers",
"following_url": "https://api.github.com/users/j-harshana/following{/other_user}",
"gists_url": "https://api.github.com/users/j-harshana/gists{/gist_id}",
"starred_url": "https://api.github.com/users/j-harshana/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/j-harshana/subscriptions",
"organizations_url": "https://api.github.com/users/j-harshana/orgs",
"repos_url": "https://api.github.com/users/j-harshana/repos",
"events_url": "https://api.github.com/users/j-harshana/events{/privacy}",
"received_events_url": "https://api.github.com/users/j-harshana/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-14T03:59:20
| 2025-09-15T11:58:28
| 2025-09-15T11:58:27
|
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40872",
"html_url": "https://github.com/huggingface/transformers/pull/40872",
"diff_url": "https://github.com/huggingface/transformers/pull/40872.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40872.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40872/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40872/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40871
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40871/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40871/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40871/events
|
https://github.com/huggingface/transformers/pull/40871
| 3,414,317,259
|
PR_kwDOCUB6oc6ocrQ4
| 40,871
|
Refactor benchmark utils: add type hints, GPU metrics helper, and con…
|
{
"login": "ProblemShooter",
"id": 171776292,
"node_id": "U_kgDOCj0ZJA",
"avatar_url": "https://avatars.githubusercontent.com/u/171776292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ProblemShooter",
"html_url": "https://github.com/ProblemShooter",
"followers_url": "https://api.github.com/users/ProblemShooter/followers",
"following_url": "https://api.github.com/users/ProblemShooter/following{/other_user}",
"gists_url": "https://api.github.com/users/ProblemShooter/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ProblemShooter/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ProblemShooter/subscriptions",
"organizations_url": "https://api.github.com/users/ProblemShooter/orgs",
"repos_url": "https://api.github.com/users/ProblemShooter/repos",
"events_url": "https://api.github.com/users/ProblemShooter/events{/privacy}",
"received_events_url": "https://api.github.com/users/ProblemShooter/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-09-14T00:04:26
| 2025-09-23T11:48:16
| null |
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40871",
"html_url": "https://github.com/huggingface/transformers/pull/40871",
"diff_url": "https://github.com/huggingface/transformers/pull/40871.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40871.patch",
"merged_at": null
}
|
Hi team 👋,
This PR refactors the benchmarking utility code to make it cleaner, more reliable, and easier to maintain. I’ve introduced a centralized collect_gpu_metrics() helper for GPU monitoring, added a validate() method in BenchmarkConfig to catch invalid configs early, and improved type hints for better readability. Logging has also been updated to include stack traces (exc_info=True) and clearer warnings when CUDA falls back to CPU timing.
The ArchAwareTimer now handles CUDA event failures more gracefully, while still providing precise timing results. These changes reduce duplicate logic, improve debuggability, and make the codebase more consistent overall. Performance gains are minor; the main improvements are in maintainability and error handling.
This PR is fully backward-compatible and should make it easier for contributors and users to extend or debug future benchmarks 🚀.
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40871/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40871/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/40870
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40870/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40870/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40870/events
|
https://github.com/huggingface/transformers/pull/40870
| 3,414,290,692
|
PR_kwDOCUB6oc6ocliV
| 40,870
|
Reduce vRAM usage during generation by allowing to transfer logits to CPU
|
{
"login": "SamuelBarryCS",
"id": 127697809,
"node_id": "U_kgDOB5yDkQ",
"avatar_url": "https://avatars.githubusercontent.com/u/127697809?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SamuelBarryCS",
"html_url": "https://github.com/SamuelBarryCS",
"followers_url": "https://api.github.com/users/SamuelBarryCS/followers",
"following_url": "https://api.github.com/users/SamuelBarryCS/following{/other_user}",
"gists_url": "https://api.github.com/users/SamuelBarryCS/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SamuelBarryCS/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SamuelBarryCS/subscriptions",
"organizations_url": "https://api.github.com/users/SamuelBarryCS/orgs",
"repos_url": "https://api.github.com/users/SamuelBarryCS/repos",
"events_url": "https://api.github.com/users/SamuelBarryCS/events{/privacy}",
"received_events_url": "https://api.github.com/users/SamuelBarryCS/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-09-13T23:36:29
| 2025-09-19T10:51:06
| null |
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40870",
"html_url": "https://github.com/huggingface/transformers/pull/40870",
"diff_url": "https://github.com/huggingface/transformers/pull/40870.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40870.patch",
"merged_at": null
}
|
## What
- Fixes https://github.com/huggingface/transformers/issues/40794 by adding a parameter `offload_logits_to_cpu` to `GenerationConfig`, which transfers the logits and scores tensors to the CPU after generation.
- Frees up memory during large runs, trading decreased vRAM usage for CPU/GPU transfer time, enabling a potentially higher batch size or sequence length during training.
- Adds `tests.generation.test_utils.test_offload_logits_to_cpu` as a non-regression test
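The offload pattern itself is simple: once decoding finishes, move the accumulated per-step logits off the accelerator. A torch-free sketch with a hypothetical `FakeTensor` stand-in (the real implementation operates on `torch.Tensor.to("cpu")`):

```python
from dataclasses import dataclass

@dataclass
class FakeTensor:
    """Hypothetical stand-in for torch.Tensor; only tracks its device."""
    device: str = "cuda"

    def to(self, device: str) -> "FakeTensor":
        return FakeTensor(device=device)

def generate(offload_logits_to_cpu: bool = False) -> list:
    # per-step logits accumulate on the accelerator during decoding
    logits = [FakeTensor("cuda") for _ in range(4)]
    if offload_logits_to_cpu:
        # move each step's logits off the accelerator after generation,
        # freeing vRAM at the cost of one device-to-host transfer
        logits = [t.to("cpu") for t in logits]
    return logits

print({t.device for t in generate(offload_logits_to_cpu=True)})  # {'cpu'}
```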
## How to review
- Read diff
- Check that new tests test_offload_logits_to_cpu is correct and that tests are still passing
## Testing performed
- All existing tests still passing fine
- tests.generation.test_utils.test_offload_logits_to_cpu passing fine as well:
```
(hf) samuel.barry@RNO:slurm-h100-reserved-rno-199-065:~/workspace/transformers(transfer-logits-to-cpu)$ python -m unittest tests.models.gpt2.test_modeling_gpt2.GPT2ModelTest.test_offload_logits_to_cpu -v
[2025-09-14 03:50:08,430] [INFO] [logging.py:107:log_dist] [Rank -1] [TorchCheckpointEngine] Initialized with serialization = False
test_offload_logits_to_cpu (tests.models.gpt2.test_modeling_gpt2.GPT2ModelTest) ... ok
----------------------------------------------------------------------
Ran 1 test in 0.252s
OK
```
## Benchmark
- Developed `memory_test.py` **that will be deleted before merging** to showcase impact
- Results with GPT2-large and `max_new_tokens=1000`: ~50% reduction in additional peak memory usage for <2% time overhead.
```
Testing model: gpt2-large
Tokens to generate: 1000
Device: NVIDIA H100 80GB HBM3
Loading model...
Testing without CPU offloading...
Initial memory: 1512.4 MB
Peak memory: 1916.8 MB
Memory increase: 404.5 MB
Generation time: 10.08s
Tokens generated: 1000
Testing with CPU offloading...
Initial memory: 1544.4 MB
Peak memory: 1725.0 MB
Memory increase: 180.6 MB
Generation time: 9.87s
Tokens generated: 1000
Results:
Number of tokens generated: 1019
Memory saved: 223.9 MB
Memory reduction: 55.4%
Time overhead: -1.8%
Sequences match: True
```
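As a sanity check on the numbers above, the reported reduction is computed against the *additional* memory increase during generation, not total memory:

```python
baseline_mb = 404.5  # memory increase without offloading (MB, from the run above)
offload_mb = 180.6   # memory increase with offloading (MB, from the run above)

saved_mb = baseline_mb - offload_mb
reduction = 100 * saved_mb / baseline_mb

# saved 223.9 MB (55.4% reduction), matching the reported results
print(f"saved {saved_mb:.1f} MB ({reduction:.1f}% reduction)")
```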
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40870/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40870/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/40869
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40869/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40869/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40869/events
|
https://github.com/huggingface/transformers/pull/40869
| 3,414,150,461
|
PR_kwDOCUB6oc6ocHQO
| 40,869
|
Bug: Fix device/dtype mismatch in DetaForObjectDetection bias initialization .
|
{
"login": "Aniketsy",
"id": 148300120,
"node_id": "U_kgDOCNbhWA",
"avatar_url": "https://avatars.githubusercontent.com/u/148300120?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Aniketsy",
"html_url": "https://github.com/Aniketsy",
"followers_url": "https://api.github.com/users/Aniketsy/followers",
"following_url": "https://api.github.com/users/Aniketsy/following{/other_user}",
"gists_url": "https://api.github.com/users/Aniketsy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Aniketsy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Aniketsy/subscriptions",
"organizations_url": "https://api.github.com/users/Aniketsy/orgs",
"repos_url": "https://api.github.com/users/Aniketsy/repos",
"events_url": "https://api.github.com/users/Aniketsy/events{/privacy}",
"received_events_url": "https://api.github.com/users/Aniketsy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-13T21:13:14
| 2025-09-15T12:12:21
| 2025-09-15T12:12:21
|
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40869",
"html_url": "https://github.com/huggingface/transformers/pull/40869",
"diff_url": "https://github.com/huggingface/transformers/pull/40869.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40869.patch",
"merged_at": null
}
|
#40853
This PR fixes a bug that prevented DETA object detection models from loading in recent Transformers versions due to a device/dtype mismatch when initializing `self.class_embed.bias.data`. The fix ensures the tensor is created on the correct device and with the correct dtype, resolving the error.
Please let me know if my approach or fix needs any improvements. I'm open to feedback and happy to make changes based on suggestions.
Thank you!
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40869/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40869/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40868
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40868/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40868/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40868/events
|
https://github.com/huggingface/transformers/issues/40868
| 3,413,768,047
|
I_kwDOCUB6oc7Lefdv
| 40,868
|
Review and update the Code of Conduct
|
{
"login": "wiwdep-netizen",
"id": 227674328,
"node_id": "U_kgDODZII2A",
"avatar_url": "https://avatars.githubusercontent.com/u/227674328?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wiwdep-netizen",
"html_url": "https://github.com/wiwdep-netizen",
"followers_url": "https://api.github.com/users/wiwdep-netizen/followers",
"following_url": "https://api.github.com/users/wiwdep-netizen/following{/other_user}",
"gists_url": "https://api.github.com/users/wiwdep-netizen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wiwdep-netizen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wiwdep-netizen/subscriptions",
"organizations_url": "https://api.github.com/users/wiwdep-netizen/orgs",
"repos_url": "https://api.github.com/users/wiwdep-netizen/repos",
"events_url": "https://api.github.com/users/wiwdep-netizen/events{/privacy}",
"received_events_url": "https://api.github.com/users/wiwdep-netizen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 9258341780,
"node_id": "LA_kwDOCUB6oc8AAAACJ9cVlA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Code%20agent%20slop",
"name": "Code agent slop",
"color": "C59579",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-13T16:18:52
| 2025-09-15T11:48:51
| 2025-09-15T11:48:51
|
NONE
| null | null | null | null |
This issue tracks the review and potential update of the project's Code of Conduct. The goal is to ensure our code of conduct is clear, comprehensive, and reflects our community’s values.
Sub-issues will address:
- Identifying any gaps in the current Code of Conduct
- Proposing improvements or clarifications
- Implementation of changes
Please provide additional context and information to help move these sub-issues forward.
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40868/reactions",
"total_count": 2,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40868/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40867
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40867/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40867/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40867/events
|
https://github.com/huggingface/transformers/issues/40867
| 3,413,712,832
|
I_kwDOCUB6oc7LeR_A
| 40,867
|
bug with AutoVideoProcessor for VJEPA 2
|
{
"login": "FrancoisPorcher",
"id": 93766133,
"node_id": "U_kgDOBZbB9Q",
"avatar_url": "https://avatars.githubusercontent.com/u/93766133?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/FrancoisPorcher",
"html_url": "https://github.com/FrancoisPorcher",
"followers_url": "https://api.github.com/users/FrancoisPorcher/followers",
"following_url": "https://api.github.com/users/FrancoisPorcher/following{/other_user}",
"gists_url": "https://api.github.com/users/FrancoisPorcher/gists{/gist_id}",
"starred_url": "https://api.github.com/users/FrancoisPorcher/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/FrancoisPorcher/subscriptions",
"organizations_url": "https://api.github.com/users/FrancoisPorcher/orgs",
"repos_url": "https://api.github.com/users/FrancoisPorcher/repos",
"events_url": "https://api.github.com/users/FrancoisPorcher/events{/privacy}",
"received_events_url": "https://api.github.com/users/FrancoisPorcher/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-13T15:41:29
| 2025-10-01T08:55:15
| 2025-10-01T08:55:15
|
NONE
| null | null | null | null |
### System Info
### Bug Report
Hi,
The `AutoVideoProcessor` for **VJEPA 2** was working fine, but after upgrading `transformers` to **4.56.1** it stopped working.
I’m not sure if the issue comes from `accelerate` or from the `AutoVideoProcessor` itself.
---
### Code Snippet
```python
video = self.preprocessor(inputs["video"], return_tensors="pt")["pixel_values_videos"]
```
### Traceback
```
Traceback (most recent call last):
  File "/private/home/francoisporcher/.conda/envs/future_latents/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/private/home/francoisporcher/.conda/envs/future_latents/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/private/home/francoisporcher/FutureLatents/src/main.py", line 175, in <module>
    main()
  File "/private/home/francoisporcher/FutureLatents/src/main.py", line 161, in main
    trainer.fit(
  File "/private/home/francoisporcher/FutureLatents/training/trainer.py", line 337, in fit
    train_loss = self.train_epoch(train_loader)
  File "/private/home/francoisporcher/FutureLatents/training/trainer.py", line 200, in train_epoch
    total_loss += self.train_step(batch)
  File "/private/home/francoisporcher/FutureLatents/training/trainer.py", line 123, in train_step
    outputs = self.model(batch, return_norms=self.debug)
  File "/private/home/francoisporcher/.conda/envs/future_latents/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1751, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/private/home/francoisporcher/.conda/envs/future_latents/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1762, in _call_impl
    return forward_call(*args, **kwargs)
  File "/private/home/francoisporcher/.conda/envs/future_latents/lib/python3.10/site-packages/accelerate/utils/operations.py", line 818, in forward
    return model_forward(*args, **kwargs)
  File "/private/home/francoisporcher/.conda/envs/future_latents/lib/python3.10/site-packages/accelerate/utils/operations.py", line 806, in __call__
    return convert_to_fp32(self.model_forward(*args, **kwargs))
  File "/private/home/francoisporcher/.conda/envs/future_latents/lib/python3.10/site-packages/torch/amp/autocast_mode.py", line 44, in decorate_autocast
    return func(*args, **kwargs)
  File "/private/home/francoisporcher/FutureLatents/models/models.py", line 148, in forward
    latents = self.encode_inputs(batch)  # [B, D, T, H, W]
  File "/private/home/francoisporcher/FutureLatents/models/models.py", line 79, in encode_inputs
    video = self.preprocessor(inputs["video"], return_tensors="pt")["pixel_values_videos"]
  File "/private/home/francoisporcher/.conda/envs/future_latents/lib/python3.10/site-packages/transformers/video_processing_utils.py", line 212, in __call__
    return self.preprocess(videos, **kwargs)
  File "/private/home/francoisporcher/.conda/envs/future_latents/lib/python3.10/site-packages/transformers/video_processing_utils.py", line 378, in preprocess
    videos, video_metadata = self._decode_and_sample_videos(
  File "/private/home/francoisporcher/.conda/envs/future_latents/lib/python3.10/site-packages/transformers/video_processing_utils.py", line 302, in _decode_and_sample_videos
    videos = make_batched_videos(videos)
  File "/private/home/francoisporcher/.conda/envs/future_latents/lib/python3.10/site-packages/transformers/video_utils.py", line 199, in make_batched_videos
    return [np.array(videos)[None, ...]]
  File "/private/home/francoisporcher/.conda/envs/future_latents/lib/python3.10/site-packages/torch/_tensor.py", line 1225, in __array__
    return self.numpy()
TypeError: can't convert cuda:0 device type tensor to numpy. Use Tensor.cpu() to copy the tensor to host memory first
```
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```python
video = self.preprocessor(inputs["video"], return_tensors="pt")["pixel_values_videos"]
```
### Expected behavior
The preprocessor call should not crash on CUDA tensors, as it did not before 4.56.1.
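For anyone hitting this before a fix lands, a guard like the following sidesteps the failing `np.array` call by copying the tensor to host memory first. This is a sketch, not the `transformers` implementation: `to_numpy_safe` is a hypothetical helper, and the duck-typed `.cpu()` check stands in for a proper `torch.Tensor` isinstance test.

```python
import numpy as np

def to_numpy_safe(video):
    """Convert a video array-like to a numpy array, copying GPU tensors to host first.

    Hypothetical helper illustrating the guard make_batched_videos would need:
    np.array() cannot consume CUDA tensors, which is what raises the TypeError
    in the traceback above.
    """
    if hasattr(video, "cpu"):  # duck-typed check for a torch-like tensor
        video = video.cpu().numpy()  # host copy avoids the CUDA -> numpy error
    return np.asarray(video)

# Plain arrays pass through unchanged.
frames = to_numpy_safe(np.zeros((4, 3, 8, 8)))
print(frames.shape)  # (4, 3, 8, 8)
```

Equivalently, calling `.cpu()` on the video tensor before handing it to the processor avoids the crash on the caller's side.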
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40867/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40867/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40866
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40866/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40866/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40866/events
|
https://github.com/huggingface/transformers/pull/40866
| 3,413,299,622
|
PR_kwDOCUB6oc6oZQ8j
| 40,866
|
Updated the model card for TimeSformer
|
{
"login": "mreraser",
"id": 33192762,
"node_id": "MDQ6VXNlcjMzMTkyNzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/33192762?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mreraser",
"html_url": "https://github.com/mreraser",
"followers_url": "https://api.github.com/users/mreraser/followers",
"following_url": "https://api.github.com/users/mreraser/following{/other_user}",
"gists_url": "https://api.github.com/users/mreraser/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mreraser/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mreraser/subscriptions",
"organizations_url": "https://api.github.com/users/mreraser/orgs",
"repos_url": "https://api.github.com/users/mreraser/repos",
"events_url": "https://api.github.com/users/mreraser/events{/privacy}",
"received_events_url": "https://api.github.com/users/mreraser/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-13T12:01:13
| 2025-09-18T17:23:52
| 2025-09-18T17:23:52
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40866",
"html_url": "https://github.com/huggingface/transformers/pull/40866",
"diff_url": "https://github.com/huggingface/transformers/pull/40866.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40866.patch",
"merged_at": null
}
|
# What does this PR do?
As suggested in this issue - https://github.com/huggingface/transformers/issues/36979#issue-2947704577 - this PR updates the documentation of the [TimeSformer](https://huggingface.co/docs/transformers/main/model_doc/timesformer) model, aligning it with the standardized format used across the docs.
## Check list
- [x] Include a brief description of the model
- [x] Ready to use code examples (`Pipeline` **not** available, `AutoModel` available, and `transformers-cli` **not** available)
- [x] For large models, provide a quantization example
- [ ] Include an attention mask visualizer
## Who can review?
Hello @stevhliu! 👋
I’ve completed the updates to this file.
Could you kindly review the changes and let me know if there’s anything that should be improved or corrected?
I’d be happy to make further adjustments based on your feedback.
Thank you very much for your time and guidance!
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40866/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40866/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40865
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40865/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40865/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40865/events
|
https://github.com/huggingface/transformers/pull/40865
| 3,413,237,731
|
PR_kwDOCUB6oc6oZDJM
| 40,865
|
Updated the model card for ViViT
|
{
"login": "mreraser",
"id": 33192762,
"node_id": "MDQ6VXNlcjMzMTkyNzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/33192762?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mreraser",
"html_url": "https://github.com/mreraser",
"followers_url": "https://api.github.com/users/mreraser/followers",
"following_url": "https://api.github.com/users/mreraser/following{/other_user}",
"gists_url": "https://api.github.com/users/mreraser/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mreraser/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mreraser/subscriptions",
"organizations_url": "https://api.github.com/users/mreraser/orgs",
"repos_url": "https://api.github.com/users/mreraser/repos",
"events_url": "https://api.github.com/users/mreraser/events{/privacy}",
"received_events_url": "https://api.github.com/users/mreraser/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-13T11:25:12
| 2025-09-18T17:23:38
| 2025-09-18T17:23:38
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40865",
"html_url": "https://github.com/huggingface/transformers/pull/40865",
"diff_url": "https://github.com/huggingface/transformers/pull/40865.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40865.patch",
"merged_at": null
}
|
# What does this PR do?
As suggested in this issue - https://github.com/huggingface/transformers/issues/36979#issue-2947704577 - this PR updates the documentation of the [ViViT](https://huggingface.co/docs/transformers/main/model_doc/vivit) model, aligning it with the standardized format used across the docs.
## Check list
- [x] Include a brief description of the model
- [x] Ready to use code examples (`Pipeline` **not** available, `AutoModel` available, and `transformers-cli` **not** available)
- [x] For large models, provide a quantization example
- [ ] Include an attention mask visualizer
## Who can review?
Hello @stevhliu! 👋
I’ve completed the updates to this file.
Could you kindly review the changes and let me know if there’s anything that should be improved or corrected?
I’d be happy to make further adjustments based on your feedback.
Thank you very much for your time and guidance!
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40865/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40865/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40864
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40864/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40864/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40864/events
|
https://github.com/huggingface/transformers/pull/40864
| 3,413,232,799
|
PR_kwDOCUB6oc6oZCDh
| 40,864
|
remove dummy EncodingFast
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-13T11:21:48
| 2025-09-16T13:05:13
| 2025-09-16T12:56:11
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40864",
"html_url": "https://github.com/huggingface/transformers/pull/40864",
"diff_url": "https://github.com/huggingface/transformers/pull/40864.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40864.patch",
"merged_at": "2025-09-16T12:56:11"
}
|
# What does this PR do?
Remove the dummy EncodingFast class. It's safer to always use the real EncodingFast.
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40864/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40864/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40863
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40863/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40863/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40863/events
|
https://github.com/huggingface/transformers/pull/40863
| 3,412,862,715
|
PR_kwDOCUB6oc6oX80r
| 40,863
|
[VisionEncoderDecoderModel] Update loss function
|
{
"login": "NielsRogge",
"id": 48327001,
"node_id": "MDQ6VXNlcjQ4MzI3MDAx",
"avatar_url": "https://avatars.githubusercontent.com/u/48327001?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NielsRogge",
"html_url": "https://github.com/NielsRogge",
"followers_url": "https://api.github.com/users/NielsRogge/followers",
"following_url": "https://api.github.com/users/NielsRogge/following{/other_user}",
"gists_url": "https://api.github.com/users/NielsRogge/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NielsRogge/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NielsRogge/subscriptions",
"organizations_url": "https://api.github.com/users/NielsRogge/orgs",
"repos_url": "https://api.github.com/users/NielsRogge/repos",
"events_url": "https://api.github.com/users/NielsRogge/events{/privacy}",
"received_events_url": "https://api.github.com/users/NielsRogge/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-13T08:12:18
| 2025-10-14T14:03:01
| 2025-10-14T14:03:00
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40863",
"html_url": "https://github.com/huggingface/transformers/pull/40863",
"diff_url": "https://github.com/huggingface/transformers/pull/40863.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40863.patch",
"merged_at": "2025-10-14T14:03:00"
}
|
# What does this PR do?
Models like Donut are currently broken on main: they can't be fine-tuned. In order to unblock users at #39473, this PR reverts #36753.
It looks like `ForCausalLMLoss` shifts the labels, whereas the VisionEncoderDecoderModel class does not expect shifted labels, as seen [here](https://github.com/huggingface/transformers/blob/d42e96a2a731c4a772e396baa0d915524c873ff0/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py#L547).
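To make the mismatch concrete, here is a minimal sketch of what label shifting means for a causal-LM loss (illustration only, not the actual `ForCausalLMLoss` code): logits at position t are scored against the token at position t+1, so a shifting loss drops the first label; a model whose labels are already aligned expects them consumed as-is, and shifting them again misaligns every target by one token.

```python
import numpy as np

# Toy label sequence for a 4-token target (illustration, not transformers code).
labels = np.array([10, 11, 12, 13])

# A causal-LM loss that shifts internally compares logits[..., :-1, :]
# against labels[..., 1:], i.e. it drops the first label:
shifted_targets = labels[1:]

# Per the PR description, VisionEncoderDecoderModel does not expect shifted
# labels, so its loss should consume them unshifted; shifting again would pair
# each prediction with the token one step too late.
unshifted_targets = labels

print(shifted_targets.tolist())  # [11, 12, 13]
```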
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40863/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40863/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40862
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40862/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40862/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40862/events
|
https://github.com/huggingface/transformers/pull/40862
| 3,412,784,458
|
PR_kwDOCUB6oc6oXrlx
| 40,862
|
Redirect MI355 CI results to dummy dataset
|
{
"login": "ahadnagy",
"id": 21314428,
"node_id": "MDQ6VXNlcjIxMzE0NDI4",
"avatar_url": "https://avatars.githubusercontent.com/u/21314428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ahadnagy",
"html_url": "https://github.com/ahadnagy",
"followers_url": "https://api.github.com/users/ahadnagy/followers",
"following_url": "https://api.github.com/users/ahadnagy/following{/other_user}",
"gists_url": "https://api.github.com/users/ahadnagy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ahadnagy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ahadnagy/subscriptions",
"organizations_url": "https://api.github.com/users/ahadnagy/orgs",
"repos_url": "https://api.github.com/users/ahadnagy/repos",
"events_url": "https://api.github.com/users/ahadnagy/events{/privacy}",
"received_events_url": "https://api.github.com/users/ahadnagy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-13T07:17:13
| 2025-09-14T16:42:50
| 2025-09-14T16:42:50
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40862",
"html_url": "https://github.com/huggingface/transformers/pull/40862",
"diff_url": "https://github.com/huggingface/transformers/pull/40862.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40862.patch",
"merged_at": "2025-09-14T16:42:50"
}
|
# What does this PR do?
This PR will temporarily redirect the MI355 CI results to a dummy dataset until the runners become stable and the isolation of results in the main dataset is sorted out.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
|
{
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40862/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40862/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40861
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40861/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40861/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40861/events
|
https://github.com/huggingface/transformers/pull/40861
| 3,412,667,357
|
PR_kwDOCUB6oc6oXS_F
| 40,861
|
Support n_groups>1 for mamba2
|
{
"login": "tdoublep",
"id": 7945038,
"node_id": "MDQ6VXNlcjc5NDUwMzg=",
"avatar_url": "https://avatars.githubusercontent.com/u/7945038?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tdoublep",
"html_url": "https://github.com/tdoublep",
"followers_url": "https://api.github.com/users/tdoublep/followers",
"following_url": "https://api.github.com/users/tdoublep/following{/other_user}",
"gists_url": "https://api.github.com/users/tdoublep/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tdoublep/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tdoublep/subscriptions",
"organizations_url": "https://api.github.com/users/tdoublep/orgs",
"repos_url": "https://api.github.com/users/tdoublep/repos",
"events_url": "https://api.github.com/users/tdoublep/events{/privacy}",
"received_events_url": "https://api.github.com/users/tdoublep/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-09-13T05:37:17
| 2025-09-15T11:04:07
| null |
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40861",
"html_url": "https://github.com/huggingface/transformers/pull/40861",
"diff_url": "https://github.com/huggingface/transformers/pull/40861.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40861.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
This fixes an issue for models like `mistralai/Mamba-Codestral-7B-v0.1` that use the `mamba2` architecture with `n_groups>1`. It is equivalent to the fix for Zamba from #35943.
This issue [prevents us](https://github.com/vllm-project/vllm/pull/24638#issuecomment-3285393389) from comparing against transformers as a baseline in vLLM CI for this model.
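As a hedged sketch of the shape logic involved (illustrative names, not the PR's actual code): with `n_groups > 1`, each group's SSM parameters have to be repeated across the heads belonging to that group, rather than treated as one shared group.

```python
def expand_groups_to_heads(grouped, n_heads):
    """Repeat per-group parameter values so each head sees its group's value.

    grouped: list of per-group values, length n_groups.
    Heads are assumed to map to groups in contiguous blocks.
    """
    n_groups = len(grouped)
    assert n_heads % n_groups == 0, "heads must divide evenly into groups"
    heads_per_group = n_heads // n_groups
    return [value for value in grouped for _ in range(heads_per_group)]

# 2 groups, 4 heads: each group's value covers a contiguous block of 2 heads
print(expand_groups_to_heads(["B0", "B1"], 4))  # ['B0', 'B0', 'B1', 'B1']
```

With `n_groups == 1` this degenerates to broadcasting a single value to every head, which is why the bug only surfaces for `n_groups > 1`.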
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@pglorio @vasqu @ArthurZucker @hmellor
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40861/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40861/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/40860
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40860/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40860/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40860/events
|
https://github.com/huggingface/transformers/pull/40860
| 3,412,355,199
|
PR_kwDOCUB6oc6oWO8_
| 40,860
|
Use torch.expm1 and torch.log1p for better numerical results
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-13T01:18:09
| 2025-09-15T12:13:39
| 2025-09-15T11:54:14
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40860",
"html_url": "https://github.com/huggingface/transformers/pull/40860",
"diff_url": "https://github.com/huggingface/transformers/pull/40860.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40860.patch",
"merged_at": "2025-09-15T11:54:14"
}
|
# What does this PR do?
Detected by TorchFix
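For context, a minimal standalone illustration of the numerical issue (using Python's `math` module, whose `expm1`/`log1p` mirror the torch ops): near zero, `exp(x) - 1` and `log(1 + x)` lose most of their significant digits to cancellation.

```python
import math

x = 1e-10

naive = math.exp(x) - 1   # 1 + 1e-10 is rounded first, so digits are lost
stable = math.expm1(x)    # dedicated function, accurate near zero

print(naive)   # ≈ 1.000000082740371e-10 — wrong from the 8th digit on
print(stable)  # ≈ 1.00000000005e-10 — accurate

# same story for log:
print(math.log(1 + x), math.log1p(x))
```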
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40860/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40860/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40859
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40859/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40859/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40859/events
|
https://github.com/huggingface/transformers/pull/40859
| 3,412,097,343
|
PR_kwDOCUB6oc6oVWXw
| 40,859
|
🚨 [lightglue] fix: matches order changed because of early stopped indices
|
{
"login": "sbucaille",
"id": 24275548,
"node_id": "MDQ6VXNlcjI0Mjc1NTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/24275548?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sbucaille",
"html_url": "https://github.com/sbucaille",
"followers_url": "https://api.github.com/users/sbucaille/followers",
"following_url": "https://api.github.com/users/sbucaille/following{/other_user}",
"gists_url": "https://api.github.com/users/sbucaille/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sbucaille/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sbucaille/subscriptions",
"organizations_url": "https://api.github.com/users/sbucaille/orgs",
"repos_url": "https://api.github.com/users/sbucaille/repos",
"events_url": "https://api.github.com/users/sbucaille/events{/privacy}",
"received_events_url": "https://api.github.com/users/sbucaille/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-12T22:39:30
| 2025-09-19T16:52:44
| 2025-09-19T15:41:22
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40859",
"html_url": "https://github.com/huggingface/transformers/pull/40859",
"diff_url": "https://github.com/huggingface/transformers/pull/40859.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40859.patch",
"merged_at": "2025-09-19T15:41:22"
}
|
# What does this PR do?
Fixes https://github.com/cvg/LightGlue/issues/171#issuecomment-3284295107
LightGlue has a bug when using batching: the way early-stopped indices were handled changed the order of matches.
Example:
```python
# Input being
[[image2, image0], [image2, image0], [image1, image1]]
# image2 and image0 are very dissimilar; image1 is a perfect match to itself
# Output would be similar to
[50, 900, 50]
# instead of
[50, 50, 900]
```
This PR fixes the bug by rearranging the early-stopped indices, which are then used to reorder the matches and the other tensors.
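The idea can be sketched in a few lines of plain Python (hypothetical names, not the model's actual code): results come back in the order items finished, and early-stopped items finish first, so they must be scattered back to their original batch positions.

```python
def restore_batch_order(results, finish_order):
    """Place each result back at the batch index it came from.

    results[i] is the output of the item whose original batch index is
    finish_order[i] (early-stopped items finish first).
    """
    out = [None] * len(results)
    for res, orig_idx in zip(results, finish_order):
        out[orig_idx] = res
    return out

# original batch: [dissimilar, dissimilar, identical]; items 0 and 2 stopped early
print(restore_batch_order([50, 900, 50], [0, 2, 1]))  # [50, 50, 900]
```

Without the scatter step, the results stay in finish order, which matches the wrong output in the example above.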
## Before submitting
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you write any new necessary tests?
## Who can review?
@qubvel
|
{
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40859/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40859/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40858
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40858/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40858/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40858/events
|
https://github.com/huggingface/transformers/issues/40858
| 3,411,882,835
|
I_kwDOCUB6oc7LXTNT
| 40,858
|
torch.no_grad() yields NaN values on mps device, 4D attention mask
|
{
"login": "AmitMY",
"id": 5757359,
"node_id": "MDQ6VXNlcjU3NTczNTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/5757359?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/AmitMY",
"html_url": "https://github.com/AmitMY",
"followers_url": "https://api.github.com/users/AmitMY/followers",
"following_url": "https://api.github.com/users/AmitMY/following{/other_user}",
"gists_url": "https://api.github.com/users/AmitMY/gists{/gist_id}",
"starred_url": "https://api.github.com/users/AmitMY/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/AmitMY/subscriptions",
"organizations_url": "https://api.github.com/users/AmitMY/orgs",
"repos_url": "https://api.github.com/users/AmitMY/repos",
"events_url": "https://api.github.com/users/AmitMY/events{/privacy}",
"received_events_url": "https://api.github.com/users/AmitMY/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
open
| false
| null |
[] | null |
[] | 2025-09-12T20:40:36
| 2025-10-13T11:29:17
| null |
NONE
| null | null | null | null |
### System Info
- `transformers` version: 4.56.0
- Platform: macOS-15.6.1-arm64-arm-64bit
- Python version: 3.12.2
- Huggingface_hub version: 0.34.3
- Safetensors version: 0.5.3
- Accelerate version: 1.10.1
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1 (NA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
### Who can help?
Don't know.
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
On macOS, define a padded 4D attention mask and call a model.
```py
@pytest.mark.parametrize("device", ["cpu", "mps"])
def test_isolated_language_model_mps(device):
model = AutoModelForCausalLM.from_pretrained("sbintuitions/tiny-lm").to(device)
input_ids = torch.tensor([[0, 1, 0, 0], [0, 1, 2, 3]], device=device)
attention_mask = torch.tensor([
[[[ True, False, False, False],
[ True, True, False, False],
[False, False, False, False],
[False, False, False, False]]],
[[[ True, False, False, False],
[ True, True, False, False],
[ True, True, True, False],
[ True, True, True, True]]]], device=device)
# attention_mask = torch.tensor([[1, 1, 0, 0], [1, 1, 1, 1]], dtype=torch.bool, device=device)
with torch.no_grad():
outputs = model(input_ids=input_ids, attention_mask=attention_mask)
assert not torch.isnan(outputs.logits).any(), "Logits contain NaN values"
```
### Expected behavior
No NaN values, same as on CPU.
Also, if `torch.no_grad()` is removed, the output contains no NaN either.
<img width="291" height="75" alt="Image" src="https://github.com/user-attachments/assets/9d99ff28-c3bd-4ca8-8515-29ba4434d9b2" />
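Not necessarily the root cause of the MPS-only discrepancy, but worth noting why this mask is delicate: the first batch item's mask has fully masked rows (all `False`), and a softmax over a fully masked row is undefined, so backends that don't special-case it produce NaN. A pure-Python sketch of the mechanism:

```python
import math

def softmax(scores):
    # standard max-subtracted softmax, as used in attention
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# a fully masked attention row: every score is -inf,
# as in the padded rows of the 4D mask above (rows of all False)
masked_row = [float("-inf")] * 4
print(softmax(masked_row))  # [nan, nan, nan, nan]

# a normal row is fine
print(softmax([0.0, 0.0]))  # [0.5, 0.5]
```

Whether that NaN then propagates into the logits depends on the attention backend; here the CPU path evidently avoids it while the `mps` + `torch.no_grad()` path does not.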
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40858/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40858/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/40857
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40857/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40857/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40857/events
|
https://github.com/huggingface/transformers/pull/40857
| 3,411,668,330
|
PR_kwDOCUB6oc6oT4bA
| 40,857
|
Token
|
{
"login": "ArkVex",
"id": 159469387,
"node_id": "U_kgDOCYFPSw",
"avatar_url": "https://avatars.githubusercontent.com/u/159469387?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArkVex",
"html_url": "https://github.com/ArkVex",
"followers_url": "https://api.github.com/users/ArkVex/followers",
"following_url": "https://api.github.com/users/ArkVex/following{/other_user}",
"gists_url": "https://api.github.com/users/ArkVex/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArkVex/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArkVex/subscriptions",
"organizations_url": "https://api.github.com/users/ArkVex/orgs",
"repos_url": "https://api.github.com/users/ArkVex/repos",
"events_url": "https://api.github.com/users/ArkVex/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArkVex/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-09-12T19:10:26
| 2025-09-19T06:12:33
| null |
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40857",
"html_url": "https://github.com/huggingface/transformers/pull/40857",
"diff_url": "https://github.com/huggingface/transformers/pull/40857.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40857.patch",
"merged_at": null
}
|
# What does this PR do?
This PR fixes the calculation of `train_tokens_per_second` when resuming training from a checkpoint. Previously, the metric was calculated using global state, which could result in unrealistically high values after resuming. Now, the timer and token counters are reset when resuming, so the metric reflects only the current training session.
Fixes #40560
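A minimal sketch of the session-relative accounting described above (hypothetical class, not the Trainer's actual code): reset both the timer and the token counter at resume time, so tokens already covered by the checkpoint are not counted against the new session's elapsed time.

```python
import time

class SessionThroughput:
    """Track tokens/sec relative to the current training session only."""

    def resume(self, tokens_seen_so_far):
        # reset both baselines when (re)starting a session
        self._session_start = time.perf_counter()
        self._tokens_at_resume = tokens_seen_so_far

    def tokens_per_second(self, total_tokens_seen):
        elapsed = time.perf_counter() - self._session_start
        session_tokens = total_tokens_seen - self._tokens_at_resume
        return session_tokens / elapsed
```

Counting from global state instead (total tokens over session wall time) is what inflated the metric: checkpointed tokens appeared to be processed in near-zero time.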
## Before submitting
- [x] This PR addresses a bug in the Trainer metrics.
- [x] Discussed in issue #40560.
- [x] No new dependencies.
- [x] No documentation changes required.
- [x] No new tests added, but existing metric logic is covered.
## Who can review?
@zach-huggingface @ArthurZucker ,
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40857/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40857/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/40856
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40856/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40856/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40856/events
|
https://github.com/huggingface/transformers/pull/40856
| 3,411,615,799
|
PR_kwDOCUB6oc6oTsup
| 40,856
|
🔴Make `center_crop` fast equivalent to slow
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-12T18:52:48
| 2025-09-16T16:01:39
| 2025-09-16T16:01:39
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40856",
"html_url": "https://github.com/huggingface/transformers/pull/40856",
"diff_url": "https://github.com/huggingface/transformers/pull/40856.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40856.patch",
"merged_at": "2025-09-16T16:01:39"
}
|
# What does this PR do?
Use a custom `center_crop` function to be equivalent to the one used in slow processors.
The only difference from the torchvision one is how `crop_top` and `crop_left` are rounded: instead of `int(round(...))`, which rounds halves to the nearest even number, we use `int(...)` to always round down.
Thanks @rootonchair for adding this first to the BridgeTower image processor!
Slightly breaking, as it will change current results from fast image processors that use `center_crop`.
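The rounding difference is easy to see in isolation (illustrative function, not the PR's exact code):

```python
def crop_offsets(img_h, img_w, crop_h, crop_w, round_half_to_even=False):
    """Top/left offsets for a center crop, under the two rounding rules."""
    top_f = (img_h - crop_h) / 2.0
    left_f = (img_w - crop_w) / 2.0
    if round_half_to_even:
        # torchvision-style: round() uses round-half-to-even
        return int(round(top_f)), int(round(left_f))
    # slow-processor style: truncate towards zero
    return int(top_f), int(left_f)

# odd margin of 3 pixels -> fractional offset 1.5
print(crop_offsets(10, 10, 7, 7))                           # (1, 1): truncation
print(crop_offsets(10, 10, 7, 7, round_half_to_even=True))  # (2, 2): 1.5 rounds to even 2
```

The two rules only diverge for odd margins where the half rounds up to an even integer (1.5 → 2 vs 1); for example 2.5 → 2 agrees with truncation.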
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40856/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/huggingface/transformers/issues/40856/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40855
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40855/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40855/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40855/events
|
https://github.com/huggingface/transformers/pull/40855
| 3,411,571,897
|
PR_kwDOCUB6oc6oTjUM
| 40,855
|
[`VaultGemma`] Update expectations in integration tests
|
{
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/followers",
"following_url": "https://api.github.com/users/vasqu/following{/other_user}",
"gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vasqu/subscriptions",
"organizations_url": "https://api.github.com/users/vasqu/orgs",
"repos_url": "https://api.github.com/users/vasqu/repos",
"events_url": "https://api.github.com/users/vasqu/events{/privacy}",
"received_events_url": "https://api.github.com/users/vasqu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-12T18:37:29
| 2025-09-15T10:46:32
| 2025-09-15T10:46:30
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40855",
"html_url": "https://github.com/huggingface/transformers/pull/40855",
"diff_url": "https://github.com/huggingface/transformers/pull/40855.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40855.patch",
"merged_at": "2025-09-15T10:46:30"
}
|
As per title
cc @Cyrilvallez @ArthurZucker
|
{
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/followers",
"following_url": "https://api.github.com/users/vasqu/following{/other_user}",
"gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vasqu/subscriptions",
"organizations_url": "https://api.github.com/users/vasqu/orgs",
"repos_url": "https://api.github.com/users/vasqu/repos",
"events_url": "https://api.github.com/users/vasqu/events{/privacy}",
"received_events_url": "https://api.github.com/users/vasqu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40855/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40855/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40854
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40854/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40854/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40854/events
|
https://github.com/huggingface/transformers/pull/40854
| 3,410,976,333
|
PR_kwDOCUB6oc6oRdYJ
| 40,854
|
[tests] move generative tests away from `test_modeling_common.py`
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-12T15:41:13
| 2025-09-12T16:15:16
| 2025-09-12T16:12:28
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40854",
"html_url": "https://github.com/huggingface/transformers/pull/40854",
"diff_url": "https://github.com/huggingface/transformers/pull/40854.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40854.patch",
"merged_at": "2025-09-12T16:12:27"
}
|
# What does this PR do?
skips 🔫 (104k -> 101k tests in `tests/models`)
Moves generative tests away from `test_modeling_common.py`, which should be reserved for generalist tests.
TL;DR:
- a test loops over `self.all_generative_model_classes` -> moved to `GenerationTesterMixin`
- a test relies on `AutoModelForCausalLM` -> moved to `CausalLMModelTester`
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40854/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40854/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40853
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40853/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40853/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40853/events
|
https://github.com/huggingface/transformers/issues/40853
| 3,410,975,603
|
I_kwDOCUB6oc7LT1tz
| 40,853
|
Top Three Models on Object Detection Leaderboard Won't Load on Newest Version
|
{
"login": "waylonflinn",
"id": 804108,
"node_id": "MDQ6VXNlcjgwNDEwOA==",
"avatar_url": "https://avatars.githubusercontent.com/u/804108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/waylonflinn",
"html_url": "https://github.com/waylonflinn",
"followers_url": "https://api.github.com/users/waylonflinn/followers",
"following_url": "https://api.github.com/users/waylonflinn/following{/other_user}",
"gists_url": "https://api.github.com/users/waylonflinn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/waylonflinn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/waylonflinn/subscriptions",
"organizations_url": "https://api.github.com/users/waylonflinn/orgs",
"repos_url": "https://api.github.com/users/waylonflinn/repos",
"events_url": "https://api.github.com/users/waylonflinn/events{/privacy}",
"received_events_url": "https://api.github.com/users/waylonflinn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2392046359,
"node_id": "MDU6TGFiZWwyMzkyMDQ2MzU5",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20Second%20Issue",
"name": "Good Second Issue",
"color": "dd935a",
"default": false,
"description": "Issues that are more difficult to do than \"Good First\" issues - give it a try if you want!"
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-12T15:41:02
| 2025-09-15T15:59:44
| 2025-09-15T15:23:14
|
NONE
| null | null | null | null |
### System Info
The top three models on the Object Detection leaderboard won't load in the latest version.
"jozhang97/deta-swin-large"
"jozhang97/deta-resnet-50-24-epochs"
"jozhang97/deta-resnet-50"
@Cyrilvallez
Versions 4.27.1, 4.49.0 and 4.50.3 are confirmed to work.
Version 4.51.0 is broken and gives the following error.
```python
File ~/<redacted>/venv-deta-hf/lib/python3.10/site-packages/transformers/modeling_utils.py:3736, in PreTrainedModel.get_init_context(cls, is_quantized, _is_ds_init_called)
3734 init_contexts.append(set_quantized_state())
3735 else:
-> 3736 init_contexts = [no_init_weights(), init_empty_weights()]
3738 return init_contexts
NameError: name 'init_empty_weights' is not defined
```
Versions 4.51.1, 4.51.3, 4.52.4, 4.53.3, 4.56.1 are confirmed broken and give the following error.
```python
File ~/<redacted>/venv-deta-hf/lib/python3.10/site-packages/transformers/models/deprecated/deta/modeling_deta.py:1859, in DetaForObjectDetection.__init__(self, config)
1857 prior_prob = 0.01
1858 bias_value = -math.log((1 - prior_prob) / prior_prob)
-> 1859 self.class_embed.bias.data = torch.ones(config.num_labels) * bias_value
1860 nn.init.constant_(self.bbox_embed.layers[-1].weight.data, 0)
1861 nn.init.constant_(self.bbox_embed.layers[-1].bias.data, 0)
RuntimeError: Attempted to call `variable.set_data(tensor)`, but `variable` and `tensor` have incompatible tensor type.
```
Minimal example to reproduce:
```python
from transformers import DetaForObjectDetection
model = DetaForObjectDetection.from_pretrained("jozhang97/deta-swin-large")
```
Expected result:
Model loads normally
Actual result:
Model fails to load giving the error above.
This may be the commit that caused the regression:
https://github.com/huggingface/transformers/commit/08f36771b33d246986d9338a729fc4ef258b999d
### Who can help?
@Cyrilvallez
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
```python
from transformers import DetaForObjectDetection
model = DetaForObjectDetection.from_pretrained("jozhang97/deta-swin-large")
```
### Expected behavior
Model loads normally
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40853/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40853/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40852
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40852/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40852/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40852/events
|
https://github.com/huggingface/transformers/pull/40852
| 3,410,791,535
|
PR_kwDOCUB6oc6oQ04v
| 40,852
|
[test] Fix test_eager_matches_sdpa incorrectly skipped
|
{
"login": "eustlb",
"id": 94853470,
"node_id": "U_kgDOBadZXg",
"avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eustlb",
"html_url": "https://github.com/eustlb",
"followers_url": "https://api.github.com/users/eustlb/followers",
"following_url": "https://api.github.com/users/eustlb/following{/other_user}",
"gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eustlb/subscriptions",
"organizations_url": "https://api.github.com/users/eustlb/orgs",
"repos_url": "https://api.github.com/users/eustlb/repos",
"events_url": "https://api.github.com/users/eustlb/events{/privacy}",
"received_events_url": "https://api.github.com/users/eustlb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-12T14:45:42
| 2025-09-12T16:13:31
| 2025-09-12T16:07:48
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40852",
"html_url": "https://github.com/huggingface/transformers/pull/40852",
"diff_url": "https://github.com/huggingface/transformers/pull/40852.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40852.patch",
"merged_at": "2025-09-12T16:07:48"
}
|
# What does this PR do?
After the introduction of `TransformersKwargs`, `test_eager_matches_sdpa` is incorrectly skipped in the output_attentions case because the current check `"output_attentions" in inspect.signature(model_sdpa.forward).parameters` is not enough to find `"output_attentions"` when it lives in typed kwargs.
Moreover, it seems to me that `GenericForTokenClassification` should also benefit from `TransformersKwargs` typing.
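A minimal sketch of why the signature check falls short once arguments move into typed kwargs (the `forward` below is a hypothetical stand-in, not the real model code):

```python
import inspect

def forward(input_ids, **kwargs):
    # output_attentions now arrives through **kwargs instead of being a
    # named parameter, so it no longer shows up in the signature by name.
    return kwargs.get("output_attentions", False)

params = inspect.signature(forward).parameters
naive_check = "output_attentions" in params  # False: swallowed by **kwargs

# A more robust check also accepts a VAR_KEYWORD (**kwargs) parameter:
robust_check = "output_attentions" in params or any(
    p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()
)
```

Here `naive_check` is `False` while `robust_check` is `True`, mirroring the fix: the presence of a `**kwargs` catch-all has to count as support for the flag.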
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40852/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40852/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40851
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40851/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40851/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40851/events
|
https://github.com/huggingface/transformers/pull/40851
| 3,410,754,020
|
PR_kwDOCUB6oc6oQsqH
| 40,851
|
add: differential privacy research model
|
{
"login": "RyanMullins",
"id": 868555,
"node_id": "MDQ6VXNlcjg2ODU1NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/868555?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RyanMullins",
"html_url": "https://github.com/RyanMullins",
"followers_url": "https://api.github.com/users/RyanMullins/followers",
"following_url": "https://api.github.com/users/RyanMullins/following{/other_user}",
"gists_url": "https://api.github.com/users/RyanMullins/gists{/gist_id}",
"starred_url": "https://api.github.com/users/RyanMullins/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RyanMullins/subscriptions",
"organizations_url": "https://api.github.com/users/RyanMullins/orgs",
"repos_url": "https://api.github.com/users/RyanMullins/repos",
"events_url": "https://api.github.com/users/RyanMullins/events{/privacy}",
"received_events_url": "https://api.github.com/users/RyanMullins/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-09-12T14:34:38
| 2025-09-12T15:36:04
| 2025-09-12T15:36:04
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40851",
"html_url": "https://github.com/huggingface/transformers/pull/40851",
"diff_url": "https://github.com/huggingface/transformers/pull/40851.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40851.patch",
"merged_at": "2025-09-12T15:36:04"
}
|
# What does this PR do?
This PR adds VaultGemma, an LLM trained with sequence-level differential privacy (DP).
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
@Cyrilvallez @ArthurZucker @vasqu
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40851/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40851/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40850
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40850/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40850/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40850/events
|
https://github.com/huggingface/transformers/pull/40850
| 3,410,619,954
|
PR_kwDOCUB6oc6oQPD6
| 40,850
|
Fix loading logic flaw with regards to unexpected and missing keys
|
{
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-12T13:57:58
| 2025-09-24T14:44:44
| 2025-09-24T14:44:43
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40850",
"html_url": "https://github.com/huggingface/transformers/pull/40850",
"diff_url": "https://github.com/huggingface/transformers/pull/40850.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40850.patch",
"merged_at": "2025-09-24T14:44:42"
}
|
As per the tests, some loading paths were currently failing when a checkpoint contains unrelated weights (which are well defined in the model's "unexpected weights on load" list). This PR makes sure that this flag is respected and that such weights are ignored.
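A pure-Python sketch of the bookkeeping involved (hypothetical names, not the actual `transformers` internals): checkpoint keys the model cannot place are "unexpected", and keys on the allow-list are silently ignored rather than reported.

```python
def partition_keys(model_keys, checkpoint_keys, unexpected_ok=()):
    # missing: parameters the model has but the checkpoint lacks;
    # unexpected: checkpoint entries the model cannot place, minus the
    # "unexpected weights on load" allow-list mentioned above.
    model_keys, checkpoint_keys = set(model_keys), set(checkpoint_keys)
    missing = sorted(model_keys - checkpoint_keys)
    unexpected = sorted(
        k for k in checkpoint_keys - model_keys if k not in set(unexpected_ok)
    )
    return missing, unexpected

missing, unexpected = partition_keys(
    model_keys=["encoder.weight", "head.weight"],
    checkpoint_keys=["encoder.weight", "rotary.inv_freq"],
    unexpected_ok=["rotary.inv_freq"],
)
```

With the allow-list respected, `rotary.inv_freq` is dropped silently instead of surfacing as an unexpected key.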
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40850/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40850/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40849
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40849/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40849/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40849/events
|
https://github.com/huggingface/transformers/issues/40849
| 3,410,506,076
|
I_kwDOCUB6oc7LSDFc
| 40,849
|
The chat prompt template for google/gemma-3-270m-it omits the system message.
|
{
"login": "umang-0801",
"id": 178144901,
"node_id": "U_kgDOCp5GhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/178144901?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/umang-0801",
"html_url": "https://github.com/umang-0801",
"followers_url": "https://api.github.com/users/umang-0801/followers",
"following_url": "https://api.github.com/users/umang-0801/following{/other_user}",
"gists_url": "https://api.github.com/users/umang-0801/gists{/gist_id}",
"starred_url": "https://api.github.com/users/umang-0801/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/umang-0801/subscriptions",
"organizations_url": "https://api.github.com/users/umang-0801/orgs",
"repos_url": "https://api.github.com/users/umang-0801/repos",
"events_url": "https://api.github.com/users/umang-0801/events{/privacy}",
"received_events_url": "https://api.github.com/users/umang-0801/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
open
| false
| null |
[] | null |
[] | 2025-09-12T13:24:13
| 2025-10-15T00:42:10
| null |
NONE
| null | null | null | null |
### System Info
Objective: To use [`google/gemma-3-270m-it`](https://huggingface.co/google/gemma-3-270m-it) for a chat application.
Problem Description: The tokenizer provides an `apply_chat_template` function which fails to include the system message in the prompt template with the right tag.
To reproduce this issue:
**Environment details:**
- `transformers` version: 4.56.1
- Platform: Linux-6.1.123+-x86_64-with-glibc2.35
- Python version: 3.12.11
- Huggingface_hub version: 0.34.4
- Safetensors version: 0.6.2
- Accelerate version: 1.10.1
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.8.0+cu126 (NA)
- Tensorflow version (GPU?): 2.19.0 (False)
- Flax version (CPU?/GPU?/TPU?): 0.10.6 (cpu)
- Jax version: 0.5.3
- JaxLib version: 0.5.3
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```
from transformers import AutoModelForCausalLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto", dtype=torch.bfloat16)
messages = [
{"role": "system", "content": "You are a friendly chatbot who always responds in the style of a pirate"},
{"role": "user", "content": "How many helicopters can a human eat in one sitting?"}
]
print(tokenizer.apply_chat_template(messages, tokenize=False))
```
Output
```
<bos><start_of_turn>user
You are a friendly chatbot who always responds in the style of a pirate
How many helicopters can a human eat in one sitting?<end_of_turn>
```
### Expected behavior
Output
```
<start_of_turn>system
You are a friendly chatbot who always responds in the style of a pirate<end_of_turn>
<start_of_turn>user
How many helicopters can a human eat in one sitting?<end_of_turn>
```
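For comparison, a minimal sketch that reproduces the observed rendering (an assumption inferred from the output above: the template appears to fold a leading system message into the first user turn; this is illustrative code, not the real Jinja template):

```python
def render_observed(messages):
    # Reproduces the reported output: a leading system message is merged
    # into the first user turn instead of getting its own system turn.
    out, pending_system = ["<bos>"], ""
    for m in messages:
        if m["role"] == "system":
            pending_system = m["content"] + "\n"
        else:
            out.append(
                f"<start_of_turn>{m['role']}\n"
                f"{pending_system}{m['content']}<end_of_turn>\n"
            )
            pending_system = ""
    return "".join(out)

rendered = render_observed([
    {"role": "system", "content": "You are a friendly chatbot who always responds in the style of a pirate"},
    {"role": "user", "content": "How many helicopters can a human eat in one sitting?"},
])
```

Note that `rendered` contains no `<start_of_turn>system` block, matching the reported behavior rather than the expected one.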
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40849/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/huggingface/transformers/issues/40849/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/40848
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40848/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40848/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40848/events
|
https://github.com/huggingface/transformers/pull/40848
| 3,410,424,390
|
PR_kwDOCUB6oc6oPjwJ
| 40,848
|
[Qwen3 Next] Use numerically stable `rsqrt`
|
{
"login": "thalahors",
"id": 178652170,
"node_id": "U_kgDOCqYECg",
"avatar_url": "https://avatars.githubusercontent.com/u/178652170?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/thalahors",
"html_url": "https://github.com/thalahors",
"followers_url": "https://api.github.com/users/thalahors/followers",
"following_url": "https://api.github.com/users/thalahors/following{/other_user}",
"gists_url": "https://api.github.com/users/thalahors/gists{/gist_id}",
"starred_url": "https://api.github.com/users/thalahors/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/thalahors/subscriptions",
"organizations_url": "https://api.github.com/users/thalahors/orgs",
"repos_url": "https://api.github.com/users/thalahors/repos",
"events_url": "https://api.github.com/users/thalahors/events{/privacy}",
"received_events_url": "https://api.github.com/users/thalahors/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-12T13:05:26
| 2025-09-15T13:06:09
| 2025-09-15T10:45:14
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40848",
"html_url": "https://github.com/huggingface/transformers/pull/40848",
"diff_url": "https://github.com/huggingface/transformers/pull/40848.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40848.patch",
"merged_at": "2025-09-15T10:45:14"
}
|
# What does this PR do?
Uses numerically stable `rsqrt`
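For illustration, a pure-Python sketch of rsqrt-style RMS normalization (the actual change operates on torch tensors; the epsilon added before the square root is what keeps the reciprocal square root numerically stable near zero):

```python
import math

def rms_norm(values, eps=1e-6):
    # Compute one reciprocal square root of the mean of squares
    # (rsqrt), then scale every element by it.
    mean_sq = sum(v * v for v in values) / len(values)
    inv_rms = 1.0 / math.sqrt(mean_sq + eps)  # rsqrt(mean_sq + eps)
    return [v * inv_rms for v in values]

normed = rms_norm([1.0, 2.0, 2.0])
```

Without the epsilon inside the root, an all-zero input would divide by zero; with it, the output stays finite.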
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
## Who can review?
@bozheng-hit @Cyrilvallez
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40848/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40848/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40847
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40847/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40847/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40847/events
|
https://github.com/huggingface/transformers/pull/40847
| 3,410,415,851
|
PR_kwDOCUB6oc6oPh2C
| 40,847
|
[config] accept non-full dtypes
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-12T13:03:44
| 2025-09-12T13:07:55
| 2025-09-12T13:07:45
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40847",
"html_url": "https://github.com/huggingface/transformers/pull/40847",
"diff_url": "https://github.com/huggingface/transformers/pull/40847.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40847.patch",
"merged_at": null
}
|
# What does this PR do?
Fixes the error raised by the following script:
(root cause: the existing logic was expecting values like `torch_dtype="torch.float32"`, as opposed to `torch_dtype="float32"`. With this PR, we now accept both)
```py
from transformers import AutoConfig
config = AutoConfig.from_pretrained("BAAI/Emu3-Chat-hf")
```
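A minimal sketch of the accepted forms (hypothetical helper, not the actual implementation):

```python
def normalize_dtype_string(value):
    # Strip an optional "torch." prefix so "torch.float32" and
    # "float32" both resolve to the same bare dtype name.
    if isinstance(value, str) and value.startswith("torch."):
        return value[len("torch."):]
    return value
```

Both `normalize_dtype_string("torch.float32")` and `normalize_dtype_string("float32")` yield `"float32"`, which can then be resolved with something like `getattr(torch, name)`.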
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40847/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40847/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40846
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40846/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40846/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40846/events
|
https://github.com/huggingface/transformers/pull/40846
| 3,410,112,269
|
PR_kwDOCUB6oc6oOf4G
| 40,846
|
[tests] re-enable aria fast tests
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-12T11:34:17
| 2025-09-12T16:43:58
| 2025-09-12T14:14:54
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40846",
"html_url": "https://github.com/huggingface/transformers/pull/40846",
"diff_url": "https://github.com/huggingface/transformers/pull/40846.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40846.patch",
"merged_at": "2025-09-12T14:14:54"
}
|
# What does this PR do?
Aria tests were decorated with `@slow` in #38615 because the test model was quite large, which made them slow.
This PR:
- reduces the size of the test aria model and removes `@slow` (non-slow test runtime reduced by 66% on my machine, ~1min -> ~20 secs)
- fixes things that broke on aria tests while they were disabled
- deletes skips that we no longer need 🔫
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40846/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40846/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40845
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40845/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40845/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40845/events
|
https://github.com/huggingface/transformers/pull/40845
| 3,410,085,005
|
PR_kwDOCUB6oc6oOaCl
| 40,845
|
Fix typos in src and tests
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-12T11:24:42
| 2025-09-19T13:22:02
| 2025-09-19T13:18:38
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40845",
"html_url": "https://github.com/huggingface/transformers/pull/40845",
"diff_url": "https://github.com/huggingface/transformers/pull/40845.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40845.patch",
"merged_at": "2025-09-19T13:18:38"
}
|
# What does this PR do?
Fix more typos
## Before submitting
- [X] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40845/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40845/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40844
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40844/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40844/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40844/events
|
https://github.com/huggingface/transformers/pull/40844
| 3,409,911,797
|
PR_kwDOCUB6oc6oN0D7
| 40,844
|
Use checkpoint in auto_class_docstring
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-12T10:26:10
| 2025-09-13T00:52:46
| 2025-09-13T00:49:19
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40844",
"html_url": "https://github.com/huggingface/transformers/pull/40844",
"diff_url": "https://github.com/huggingface/transformers/pull/40844.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40844.patch",
"merged_at": "2025-09-13T00:49:19"
}
|
# What does this PR do?
Using checkpoint in auto_class_docstring.
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40844/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40844/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40843
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40843/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40843/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40843/events
|
https://github.com/huggingface/transformers/pull/40843
| 3,409,772,966
|
PR_kwDOCUB6oc6oNVIx
| 40,843
|
Add VideoProcessors to auto-backend requirements
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-12T09:51:09
| 2025-09-12T10:21:14
| 2025-09-12T10:21:12
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40843",
"html_url": "https://github.com/huggingface/transformers/pull/40843",
"diff_url": "https://github.com/huggingface/transformers/pull/40843.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40843.patch",
"merged_at": "2025-09-12T10:21:12"
}
|
# What does this PR do?
As per the title. They have the same requirements as fast image processors.
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40843/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40843/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40842
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40842/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40842/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40842/events
|
https://github.com/huggingface/transformers/pull/40842
| 3,409,590,841
|
PR_kwDOCUB6oc6oMspB
| 40,842
|
Fix the misalignment between the l2norm in GDN of Qwen3-Next and the implementation in the FLA library.
|
{
"login": "bozheng-hit",
"id": 8787969,
"node_id": "MDQ6VXNlcjg3ODc5Njk=",
"avatar_url": "https://avatars.githubusercontent.com/u/8787969?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bozheng-hit",
"html_url": "https://github.com/bozheng-hit",
"followers_url": "https://api.github.com/users/bozheng-hit/followers",
"following_url": "https://api.github.com/users/bozheng-hit/following{/other_user}",
"gists_url": "https://api.github.com/users/bozheng-hit/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bozheng-hit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bozheng-hit/subscriptions",
"organizations_url": "https://api.github.com/users/bozheng-hit/orgs",
"repos_url": "https://api.github.com/users/bozheng-hit/repos",
"events_url": "https://api.github.com/users/bozheng-hit/events{/privacy}",
"received_events_url": "https://api.github.com/users/bozheng-hit/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-09-12T09:08:35
| 2025-09-12T12:08:01
| 2025-09-12T12:08:01
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/40842",
"html_url": "https://github.com/huggingface/transformers/pull/40842",
"diff_url": "https://github.com/huggingface/transformers/pull/40842.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/40842.patch",
"merged_at": "2025-09-12T12:08:01"
}
|
Fix the misalignment between the l2norm in GDN of Qwen3-Next and the implementation in the FLA library.
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40842/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40842/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/40841
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/40841/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/40841/comments
|
https://api.github.com/repos/huggingface/transformers/issues/40841/events
|
https://github.com/huggingface/transformers/issues/40841
| 3,409,482,415
|
I_kwDOCUB6oc7LOJKv
| 40,841
|
Add Support for Ovis2.5 Multi-Modal Model
|
{
"login": "xschen-beb",
"id": 61721839,
"node_id": "MDQ6VXNlcjYxNzIxODM5",
"avatar_url": "https://avatars.githubusercontent.com/u/61721839?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xschen-beb",
"html_url": "https://github.com/xschen-beb",
"followers_url": "https://api.github.com/users/xschen-beb/followers",
"following_url": "https://api.github.com/users/xschen-beb/following{/other_user}",
"gists_url": "https://api.github.com/users/xschen-beb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xschen-beb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xschen-beb/subscriptions",
"organizations_url": "https://api.github.com/users/xschen-beb/orgs",
"repos_url": "https://api.github.com/users/xschen-beb/repos",
"events_url": "https://api.github.com/users/xschen-beb/events{/privacy}",
"received_events_url": "https://api.github.com/users/xschen-beb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null |
[] | 2025-09-12T08:41:30
| 2025-09-16T11:18:58
| null |
NONE
| null | null | null | null |
### Model description
Key Features:
- Small Model Performance: Optimized training strategies enable small-scale models to achieve higher capability density, demonstrating cross-tier leading advantages.
- Enhanced Reasoning Capabilities: Significantly strengthens Chain-of-Thought (CoT) reasoning abilities through the combination of instruction tuning and preference learning.
- Video and Multi-Image Processing: Video and multi-image data are incorporated into training to enhance the ability to handle complex visual information across frames and images.
- Multilingual Support and OCR: Enhances multilingual OCR beyond English and Chinese and improves structured data extraction from complex visual elements like tables and charts.
### Open source status
- [x] The model implementation is available
- [x] The model weights are available
### Provide useful links for the implementation
[1] Arxiv: https://arxiv.org/abs/2508.11737
[2] Huggingface: https://huggingface.co/AIDC-AI/Ovis2-8B
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/40841/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/40841/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|