url
string
repository_url
string
labels_url
string
comments_url
string
events_url
string
html_url
string
id
int64
node_id
string
number
int64
title
string
user
dict
labels
list
state
string
locked
bool
assignee
dict
assignees
list
milestone
null
comments
list
created_at
timestamp[ms]
updated_at
timestamp[ms]
closed_at
timestamp[ms]
author_association
string
type
dict
active_lock_reason
null
draft
bool
pull_request
dict
body
string
closed_by
dict
reactions
dict
timeline_url
string
performed_via_github_app
null
state_reason
string
sub_issues_summary
dict
issue_dependencies_summary
dict
is_pull_request
bool
is_closed
bool
https://api.github.com/repos/huggingface/transformers/issues/39437
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39437/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39437/comments
https://api.github.com/repos/huggingface/transformers/issues/39437/events
https://github.com/huggingface/transformers/issues/39437
3,233,263,529
I_kwDOCUB6oc7At6-p
39,437
Support for per-token latency tracking in `generate()` (suggested options: using callback, profiler class, or using a config flag)
{ "login": "spsagar13", "id": 25451111, "node_id": "MDQ6VXNlcjI1NDUxMTEx", "avatar_url": "https://avatars.githubusercontent.com/u/25451111?v=4", "gravatar_id": "", "url": "https://api.github.com/users/spsagar13", "html_url": "https://github.com/spsagar13", "followers_url": "https://api.github.com/users/spsagar13/followers", "following_url": "https://api.github.com/users/spsagar13/following{/other_user}", "gists_url": "https://api.github.com/users/spsagar13/gists{/gist_id}", "starred_url": "https://api.github.com/users/spsagar13/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/spsagar13/subscriptions", "organizations_url": "https://api.github.com/users/spsagar13/orgs", "repos_url": "https://api.github.com/users/spsagar13/repos", "events_url": "https://api.github.com/users/spsagar13/events{/privacy}", "received_events_url": "https://api.github.com/users/spsagar13/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
[]
2025-07-15T18:15:51
2025-09-23T17:18:11
null
NONE
null
null
null
null
### Feature request

**Summary**

I would like to propose a feature to enable **per-token latency tracking** during generation in Hugging Face’s `generate()` loop (via `_beam_search`, `sample`, etc.). This is extremely useful for benchmarking models across hardware (e.g. CPU vs GPU, Arm vs x86), identifying bottlenecks, and understanding real-world inference performance at a granular level. I’m suggesting **three high-level designs** for adding this feature.

**Option 1: `per_token_latency_callback` (minimal, flexible)**

_This approach offers maximum flexibility with minimal intrusion: no config changes, just a simple hook for custom tracking._

Add a new argument to `generate()`:

```python
def generate(..., per_token_latency_callback: Optional[Callable[[int, float], None]] = None):
    for token_idx in range(max_length):
        start = time.perf_counter()
        # generation step
        end = time.perf_counter()
        if per_token_latency_callback:
            per_token_latency_callback(token_idx, end - start)
```

Example usage (for benchmarking):

```python
# Simple function to print per-token latency.
# Can be extended to log, store, or analyze latencies as needed.
def log_token_latency(token_idx, latency):
    print(f"Token {token_idx}: {latency * 1000:.2f} ms")

output = model.generate(
    input_ids,
    max_new_tokens=5,
    per_token_latency_callback=log_token_latency,  # usage with the proposed argument
)
```

This would allow users to easily plug in their own logic to log, store, or analyze latency **without touching the internal logic.**

**Option 2: profiler class along with `per_token_latency_callback` (structured, user-friendly)**

_This approach introduces a clean, reusable structure that can be extended to track other metrics like memory or FLOPs, ideal for advanced profiling._

Pass a reusable profiler object, for example:

```python
class TokenLatencyProfiler:
    def __init__(self):
        self.token_latencies = []

    def track(self, token_idx, latency):
        self.token_latencies.append(latency)

profiler = TokenLatencyProfiler()
output = model.generate(input_ids, per_token_latency_callback=profiler.track)
print(profiler.token_latencies)
```

**This structure makes it easy to support other metrics in the future** (e.g. memory, FLOPs, peak RAM/GPU usage).

**Option 3: `track_token_latency=True` in `GenerationConfig` (inspired by Intel's IPEX)**

_This approach adds convenience for quick profiling but introduces some coupling between the generation config and the return object structure._

As a higher-level alternative, conceptually inspired by Intel’s IPEX benchmarking approach, this method introduces a dedicated configuration flag within `GenerationConfig` to enable built-in per-token latency tracking. _Unlike IPEX, which sets this behavior via `setattr()` on the model config at runtime, this proposal integrates the flag directly into `GenerationConfig`._

- Update `GenerationConfig`:

  > ```python
  > track_token_latency: bool = field(
  >     default=False,
  >     metadata={"help": "If True, enables per-token latency tracking during generation."},
  > )
  > ```

- Use the flag internally:

  > ```python
  > def generate(self, input_ids, generation_config=None, max_length=20, ...):
  >     generation_config = generation_config or self.generation_config
  >     token_latencies = [] if generation_config.track_token_latency else None
  >
  >     for token_idx in range(max_length):
  >         start = time.perf_counter()
  >         # generation step (e.g., logits -> sampling -> input_ids update)
  >         end = time.perf_counter()
  >
  >         if token_latencies is not None:
  >             token_latencies.append(end - start)
  >
  >     output = GenerationOutput(...)  # or SampleDecoderOnlyOutput, etc.
  >     if token_latencies is not None:
  >         output.token_latencies = token_latencies
  >
  >     return output
  > ```

- Example usage:

  > ```python
  > from transformers import GenerationConfig
  >
  > gen_config = GenerationConfig(track_token_latency=True)
  > output = model.generate(input_ids, generation_config=gen_config)
  >
  > print(output.token_latencies)  # → [0.012, 0.010, 0.011, ...]
  > ```

### Motivation

As a performance engineer working with LLMs on hardware like Arm and x86 CPUs, I often need to measure **token-by-token latency** during generation (e.g., to analyze startup cost, cache reuse, SMT impact, or I/O stalls). However:

- There is currently **no easy way** to track **per-token latency** via Hugging Face’s `generate()` API.
- The existing `output.scores` provides logits but **not timing or performance hooks**.
- I discovered that Intel’s IPEX benchmark internally **tracks token latencies**, which was helpful for comparison, but it’s not easily accessible or extendable outside Intel’s stack.
- So I had to patch the generate logic locally to do something like this:

  ```python
  # inside _beam_search under transformers/src/transformers/generation/utils.py
  for token_idx in range(max_length):
      start = time.perf_counter()
      # generation
      end = time.perf_counter()
      latencies.append(end - start)
  ```

_But this patch could break with updates to the Transformers library, or with models that override `generate()` with custom loops._

Having official support would benefit researchers, profiling tools, downstream libraries, and anyone benchmarking LLMs on novel hardware. This also helps with:

- Debugging regressions in low-level compute kernels (e.g., matmul, softmax) that impact token generation performance
- Optimizing latency-critical use cases (e.g., serverless inference, streaming chat) by isolating startup cost, monitoring per-token response time, and diagnosing real-time slowdowns
- Analyzing time-to-first-token vs. steady-state generation performance

### Your contribution

Yes, I’m happy to submit a PR. I can start by adding **Option 1** (`per_token_latency_callback`) as a lightweight, non-breaking change to `GenerationMixin`. Optionally, I can follow up with Option 2 (profiler class) and/or Option 3 (`track_token_latency=True` in `GenerationConfig`) depending on maintainers’ preferences, and/or with additional recommendations/improvements. Please let me know if these changes are fine and, if so, which direction is preferred, and I’ll start implementing.
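The proposed callback (Option 1) and profiler (Option 2) can be exercised end-to-end without `transformers` by simulating the decoding loop. In the sketch below, `fake_generate` is a hypothetical stand-in for `generate()` (it only counts tokens, it does no real decoding); `TokenLatencyProfiler` is taken directly from Option 2:

```python
import time
from typing import Callable, List, Optional


class TokenLatencyProfiler:
    """Collects per-token latencies via the proposed callback (Option 2)."""

    def __init__(self) -> None:
        self.token_latencies: List[float] = []

    def track(self, token_idx: int, latency: float) -> None:
        self.token_latencies.append(latency)


def fake_generate(
    num_tokens: int,
    per_token_latency_callback: Optional[Callable[[int, float], None]] = None,
) -> List[int]:
    """Stand-in for generate(): times each 'generation step' (Option 1)."""
    tokens = []
    for token_idx in range(num_tokens):
        start = time.perf_counter()
        tokens.append(token_idx)  # placeholder for the real decoding step
        end = time.perf_counter()
        if per_token_latency_callback is not None:
            per_token_latency_callback(token_idx, end - start)
    return tokens


profiler = TokenLatencyProfiler()
fake_generate(5, per_token_latency_callback=profiler.track)
print(len(profiler.token_latencies))  # → 5
```

The same profiler object would plug into the real `generate()` unchanged if Option 1 were adopted, which is why the two options compose well.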
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39437/reactions", "total_count": 5, "+1": 5, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39437/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/39436
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39436/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39436/comments
https://api.github.com/repos/huggingface/transformers/issues/39436/events
https://github.com/huggingface/transformers/issues/39436
3,233,218,184
I_kwDOCUB6oc7Atv6I
39,436
Export LFM2 to ExecuTorch
{ "login": "guangy10", "id": 42389959, "node_id": "MDQ6VXNlcjQyMzg5OTU5", "avatar_url": "https://avatars.githubusercontent.com/u/42389959?v=4", "gravatar_id": "", "url": "https://api.github.com/users/guangy10", "html_url": "https://github.com/guangy10", "followers_url": "https://api.github.com/users/guangy10/followers", "following_url": "https://api.github.com/users/guangy10/following{/other_user}", "gists_url": "https://api.github.com/users/guangy10/gists{/gist_id}", "starred_url": "https://api.github.com/users/guangy10/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/guangy10/subscriptions", "organizations_url": "https://api.github.com/users/guangy10/orgs", "repos_url": "https://api.github.com/users/guangy10/repos", "events_url": "https://api.github.com/users/guangy10/events{/privacy}", "received_events_url": "https://api.github.com/users/guangy10/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-07-15T18:02:09
2025-08-24T08:03:09
2025-08-24T08:03:09
CONTRIBUTOR
null
null
null
null
### System Info

- `transformers` version: 4.54.0.dev0
- Platform: macOS-15.5-arm64-arm-64bit
- Python version: 3.11.11
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.9.0.dev20250706 (NA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>

### Who can help?

_No response_

### Information

- [ ] The official example scripts
- [ ] My own modified scripts

### Tasks

- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)

### Reproduction

I tried out LiquidAI/LFM2-350M in optimum-executorch quickly (with the latest `transformers` installed from trunk) by running:

```
optimum-cli export executorch --model LiquidAI/LFM2-350M --task text-generation --recipe xnnpack --use_custom_sdpa --use_custom_kv_cache --qlinear --qembedding --output_dir lfm2
```

The model failed to export due to some data-dependent control flow in `slow_forward`:

```
E File "/Users/guangyang/transformers/src/transformers/models/lfm2/modeling_lfm2.py", line 485, in forward
E   return self.slow_forward(hidden_states, past_key_value, cache_position, attention_mask)
E File "/Users/guangyang/transformers/src/transformers/models/lfm2/modeling_lfm2.py", line 453, in slow_forward
E   if past_key_value is not None and cache_position[0] > 0:
```

which needs to be rewritten in order to make the model exportable to ExecuTorch.

You can reproduce this export issue without optimum-executorch by hacking `test_static_cache_exportability` in tests/utils/test_cache_utils.py: simply replace the model_id with "LiquidAI/LFM2-350M", then run the test:

```
RUN_SLOW=1 pytest tests/utils/test_cache_utils.py -vvv -s -k test_static_cache_exportability
```

### Expected behavior

Expect to export the model by rewriting the data-dependent control flow.
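For context, a common way to make a branch like `if cache_position[0] > 0:` traceable is to compute both paths and select between them with a 0/1 mask (what `torch.where` expresses on tensors; `torch.cond` is another option when both paths are too expensive to always run). The pure-Python sketch below only illustrates the mask-select idea; the names and values are illustrative, not the actual LFM2 fix:

```python
# Data-dependent branch (not export-friendly):
#     if cache_position[0] > 0:
#         state = decode_path(state)
#     else:
#         state = prefill_path(state)
#
# Branch-free rewrite: compute both paths and blend with a 0/1 mask,
# mirroring elementwise torch.where(mask, decode, prefill).

def select(mask, a, b):
    # mask is 0 or 1; picks a when mask == 1, b when mask == 0
    return [mask * x + (1 - mask) * y for x, y in zip(a, b)]


prefill = [1.0, 2.0, 3.0]   # hypothetical prefill-path result
decode = [10.0, 20.0, 30.0]  # hypothetical decode-path result

print(select(1, decode, prefill))  # → [10.0, 20.0, 30.0]
print(select(0, decode, prefill))  # → [1.0, 2.0, 3.0]
```

The trade-off is that both paths are always evaluated, which is usually acceptable when they are cheap relative to the rest of the forward pass.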
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers", "following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}", "gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}", "starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions", "organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs", "repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos", "events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}", "received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events", "type": "Bot", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39436/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39436/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39435
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39435/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39435/comments
https://api.github.com/repos/huggingface/transformers/issues/39435/events
https://github.com/huggingface/transformers/pull/39435
3,233,159,998
PR_kwDOCUB6oc6fCY01
39,435
Add a unit test for BartModel to compare eager, sdpa on one particular set of inputs
{ "login": "xadupre", "id": 22452781, "node_id": "MDQ6VXNlcjIyNDUyNzgx", "avatar_url": "https://avatars.githubusercontent.com/u/22452781?v=4", "gravatar_id": "", "url": "https://api.github.com/users/xadupre", "html_url": "https://github.com/xadupre", "followers_url": "https://api.github.com/users/xadupre/followers", "following_url": "https://api.github.com/users/xadupre/following{/other_user}", "gists_url": "https://api.github.com/users/xadupre/gists{/gist_id}", "starred_url": "https://api.github.com/users/xadupre/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/xadupre/subscriptions", "organizations_url": "https://api.github.com/users/xadupre/orgs", "repos_url": "https://api.github.com/users/xadupre/repos", "events_url": "https://api.github.com/users/xadupre/events{/privacy}", "received_events_url": "https://api.github.com/users/xadupre/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-07-15T17:46:01
2025-08-07T10:35:32
null
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39435", "html_url": "https://github.com/huggingface/transformers/pull/39435", "diff_url": "https://github.com/huggingface/transformers/pull/39435.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39435.patch", "merged_at": null }
# What does this PR do?

Fixes #39365. Not a fix yet, but introduces a unit test that fails for the reason explained in that issue: either the inputs are wrong, or the fix from issue #39365 is needed.

## Before submitting

- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [X] Did you write any new necessary tests?

## Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.

<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier

Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1

Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber

Documentation: @stevhliu

HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)

Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39435/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39435/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/39434
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39434/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39434/comments
https://api.github.com/repos/huggingface/transformers/issues/39434/events
https://github.com/huggingface/transformers/pull/39434
3,233,070,952
PR_kwDOCUB6oc6fCE19
39,434
[serve] Add speech to text (`/v1/audio/transcriptions`)
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/followers", "following_url": "https://api.github.com/users/gante/following{/other_user}", "gists_url": "https://api.github.com/users/gante/gists{/gist_id}", "starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gante/subscriptions", "organizations_url": "https://api.github.com/users/gante/orgs", "repos_url": "https://api.github.com/users/gante/repos", "events_url": "https://api.github.com/users/gante/events{/privacy}", "received_events_url": "https://api.github.com/users/gante/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T17:22:00
2025-07-24T09:41:08
2025-07-17T14:29:58
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39434", "html_url": "https://github.com/huggingface/transformers/pull/39434", "diff_url": "https://github.com/huggingface/transformers/pull/39434.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39434.patch", "merged_at": "2025-07-17T14:29:58" }
# What does this PR do?

This PR:
- Adds `/v1/audio/transcriptions`, the speech-to-text (transcription) OAI API
- Adds `TimedModel`: models get deleted from memory after a certain amount of time (e.g. try launching the server as `transformers serve --enable-cors --log-level debug --model-timeout 10`)
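The time-based eviction idea behind `TimedModel` can be sketched as an entry that tracks its last use and reports when it has been idle past a timeout. Note this is an illustrative sketch of the concept only; `TimedEntry` and its methods are hypothetical names, not the PR's actual `TimedModel` API:

```python
import time


class TimedEntry:
    """Sketch of time-based eviction: a value that is considered expired
    once it has been idle longer than `timeout_s`."""

    def __init__(self, value, timeout_s: float):
        self.value = value
        self.timeout_s = timeout_s
        self.last_used = time.monotonic()

    def get(self):
        self.last_used = time.monotonic()  # touching resets the idle timer
        return self.value

    def expired(self, now: float = None) -> bool:
        now = time.monotonic() if now is None else now
        return (now - self.last_used) > self.timeout_s


entry = TimedEntry("model-weights", timeout_s=10.0)
print(entry.expired())                           # → False (just created)
print(entry.expired(now=time.monotonic() + 11))  # → True (idle past timeout)
```

A background task (or a check on each request) would then drop expired entries so idle models free their memory.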
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/followers", "following_url": "https://api.github.com/users/gante/following{/other_user}", "gists_url": "https://api.github.com/users/gante/gists{/gist_id}", "starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gante/subscriptions", "organizations_url": "https://api.github.com/users/gante/orgs", "repos_url": "https://api.github.com/users/gante/repos", "events_url": "https://api.github.com/users/gante/events{/privacy}", "received_events_url": "https://api.github.com/users/gante/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39434/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39434/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39433
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39433/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39433/comments
https://api.github.com/repos/huggingface/transformers/issues/39433/events
https://github.com/huggingface/transformers/pull/39433
3,232,925,630
PR_kwDOCUB6oc6fBknr
39,433
Fix placeholders replacement logic in auto_docstring
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T16:33:49
2025-07-18T22:56:23
2025-07-18T22:56:23
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39433", "html_url": "https://github.com/huggingface/transformers/pull/39433", "diff_url": "https://github.com/huggingface/transformers/pull/39433.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39433.patch", "merged_at": "2025-07-18T22:56:23" }
# What does this PR do?

The placeholder-replacement logic in `auto_docstring` is currently broken, so placeholders like `{image_processor_class}` are not replaced properly. This PR greatly simplifies the logic and fixes the issue.
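The kind of placeholder substitution the PR refers to can be sketched with a small regex pass that fills in known placeholders and leaves unknown ones untouched. This is a hypothetical illustration of the behavior, not the actual `auto_docstring` implementation; `fill_placeholders` and the field names are made up for the example:

```python
import re


def fill_placeholders(doc: str, values: dict) -> str:
    # Replace each {name} whose name is in `values`; leave others as-is.
    return re.sub(
        r"\{(\w+)\}",
        lambda m: str(values.get(m.group(1), m.group(0))),
        doc,
    )


doc = "Uses {image_processor_class} with {unknown_field}."
print(fill_placeholders(doc, {"image_processor_class": "AutoImageProcessor"}))
# → Uses AutoImageProcessor with {unknown_field}.
```

Leaving unknown placeholders intact (rather than raising, as `str.format` would) makes partial substitution safe when different docstrings use different subsets of placeholders.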
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39433/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39433/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39432
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39432/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39432/comments
https://api.github.com/repos/huggingface/transformers/issues/39432/events
https://github.com/huggingface/transformers/pull/39432
3,232,911,449
PR_kwDOCUB6oc6fBhgY
39,432
Fix typo in generation configuration for Janus model weight conversion
{ "login": "thisisiron", "id": 23303033, "node_id": "MDQ6VXNlcjIzMzAzMDMz", "avatar_url": "https://avatars.githubusercontent.com/u/23303033?v=4", "gravatar_id": "", "url": "https://api.github.com/users/thisisiron", "html_url": "https://github.com/thisisiron", "followers_url": "https://api.github.com/users/thisisiron/followers", "following_url": "https://api.github.com/users/thisisiron/following{/other_user}", "gists_url": "https://api.github.com/users/thisisiron/gists{/gist_id}", "starred_url": "https://api.github.com/users/thisisiron/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/thisisiron/subscriptions", "organizations_url": "https://api.github.com/users/thisisiron/orgs", "repos_url": "https://api.github.com/users/thisisiron/repos", "events_url": "https://api.github.com/users/thisisiron/events{/privacy}", "received_events_url": "https://api.github.com/users/thisisiron/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T16:28:33
2025-07-23T17:36:58
2025-07-16T12:28:03
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39432", "html_url": "https://github.com/huggingface/transformers/pull/39432", "diff_url": "https://github.com/huggingface/transformers/pull/39432.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39432.patch", "merged_at": "2025-07-16T12:28:02" }
# What does this PR do?

Fixes an AttributeError caused by attempting to access `generation_kwargs` from a `GenerationConfig` object in `convert_janus_weights_to_hf.py`. When running the conversion script, the following error occurs:

```bash
Traceback (most recent call last):
  File "/home/infidea/xxx/code/tmp/transformers/src/transformers/models/janus/convert_janus_weights_to_hf.py", line 504, in <module>
    main()
  File "/home/infidea/xxx/code/tmp/transformers/src/transformers/models/janus/convert_janus_weights_to_hf.py", line 492, in main
    convert_model(
  File "/home/infidea/xxx/code/tmp/transformers/src/transformers/models/janus/convert_janus_weights_to_hf.py", line 413, in convert_model
    model.generation_config.generation_kwargs["boi_token_id"] = tokenizer.vocab.get("<begin_of_image>")
AttributeError: 'GenerationConfig' object has no attribute 'generation_kwargs'
```

Changed the code from:

```
model.generation_config.generation_kwargs["boi_token_id"] = tokenizer.vocab.get("<begin_of_image>")
```

to:

```
model.generation_config.boi_token_id = tokenizer.vocab.get("<begin_of_image>")
```

This resolves the error by directly assigning the custom token ID as an attribute of the `GenerationConfig` object.

## Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. @amyeroberts, @qubvel
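The failure mode and the fix can be reproduced with a minimal stand-in: an attribute container with no `generation_kwargs` dict raises `AttributeError` on dict-style access, while direct attribute assignment works. `MiniGenerationConfig` and the token id value below are illustrative, not the real `GenerationConfig` or Janus vocabulary:

```python
class MiniGenerationConfig:
    """Stand-in for GenerationConfig: a plain attribute container that,
    like the real class here, has no `generation_kwargs` attribute."""

    def __init__(self):
        self.bos_token_id = 0


cfg = MiniGenerationConfig()

# The original code assumed a dict attribute that does not exist:
try:
    cfg.generation_kwargs["boi_token_id"] = 100016  # illustrative id
except AttributeError as e:
    print(type(e).__name__)  # → AttributeError

# The fix assigns the custom token id directly as an attribute:
cfg.boi_token_id = 100016
print(cfg.boi_token_id)  # → 100016
```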
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39432/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39432/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39431
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39431/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39431/comments
https://api.github.com/repos/huggingface/transformers/issues/39431/events
https://github.com/huggingface/transformers/pull/39431
3,232,767,566
PR_kwDOCUB6oc6fBCAt
39,431
🔴 Fix EnCodec internals and integration tests
{ "login": "ebezzam", "id": 4757445, "node_id": "MDQ6VXNlcjQ3NTc0NDU=", "avatar_url": "https://avatars.githubusercontent.com/u/4757445?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ebezzam", "html_url": "https://github.com/ebezzam", "followers_url": "https://api.github.com/users/ebezzam/followers", "following_url": "https://api.github.com/users/ebezzam/following{/other_user}", "gists_url": "https://api.github.com/users/ebezzam/gists{/gist_id}", "starred_url": "https://api.github.com/users/ebezzam/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ebezzam/subscriptions", "organizations_url": "https://api.github.com/users/ebezzam/orgs", "repos_url": "https://api.github.com/users/ebezzam/repos", "events_url": "https://api.github.com/users/ebezzam/events{/privacy}", "received_events_url": "https://api.github.com/users/ebezzam/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 6470596964, "node_id": "LA_kwDOCUB6oc8AAAABga15ZA", "url": "https://api.github.com/repos/huggingface/transformers/labels/Audio", "name": "Audio", "color": "760453", "default": false, "description": "" } ]
closed
false
null
[]
null
[]
2025-07-15T15:36:42
2025-07-23T17:39:32
2025-07-23T17:39:27
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39431", "html_url": "https://github.com/huggingface/transformers/pull/39431", "diff_url": "https://github.com/huggingface/transformers/pull/39431.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39431.patch", "merged_at": "2025-07-23T17:39:27" }
# What does this PR do? Fixes [integration tests](https://github.com/huggingface/transformers/actions/runs/16282903241/job/45976159004) for EnCodec, which were all failing: - Two because of a mismatch with the expected output - One doesn't even compute outputs because of bad padding ``` FAILED tests/models/encodec/test_modeling_encodec.py::EncodecIntegrationTest::test_batch_48kHz - NotImplementedError: Only 2D, 3D, 4D, 5D padding with non-constant padding are supported for now ``` cc @eustlb --- # The following changes were necessary: - Use chunking from [original model](https://github.com/facebookresearch/encodec/blob/0e2d0aed29362c8e8f52494baf3e6f99056b214f/encodec/model.py#L142). This seems to resolve a padding [issue](https://github.com/huggingface/transformers/pull/23655#issuecomment-1585724859) with the 48kHz model that was raised in the model addition PR but apparently never resolved. This also required adding a new output `last_frame_pad_length`. - [This PR](https://github.com/huggingface/transformers/pull/34756) introduced an error by adding a dimension to the padding mask. This would cause an issue when calling the `forward` method, as `padding_mask` would be [initialized](https://github.com/huggingface/transformers/blob/53c9dcd6fd31cb9e8a10248693a905d0223b8316/src/transformers/models/encodec/modeling_encodec.py#L761) with the correct shape. Scripts for new expected outputs can be found here: https://gist.github.com/ebezzam/2a34e249e729881130d1f5a42229d31f --- # 🔴 Breaking change (minimal) Previous code will break if people use `return_dict=False` when calling [encode](https://github.com/huggingface/transformers/blob/4b4f04fccaaa3020c5462cf31d286d83fbfc6d38/src/transformers/models/encodec/modeling_encodec.py#L527) and they expect just TWO outputs, as now there are THREE. 
``` # old (which would break with current change) audio_codes, audio_scales = model.encode( input_values, padding_mask, bandwidth, return_dict=False ) # new audio_codes, audio_scales, last_frame_pad_length = model.encode( input_values, padding_mask, bandwidth, return_dict=False ) # works for both, and there would be new entry "last_frame_pad_length" in output dict encoded_frames = model.encode( input_values, padding_mask, bandwidth, return_dict=True ) ``` If users don't pass (or forget to pass) the new output `last_frame_pad_length` to [decode](https://github.com/ebezzam/transformers/blob/15d44de835d8024472e34713250fc5a403a4d220/src/transformers/models/encodec/modeling_encodec.py#L662), it won't break, but it may lead to slightly different outputs, though only at the end of the audio.
{ "login": "ebezzam", "id": 4757445, "node_id": "MDQ6VXNlcjQ3NTc0NDU=", "avatar_url": "https://avatars.githubusercontent.com/u/4757445?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ebezzam", "html_url": "https://github.com/ebezzam", "followers_url": "https://api.github.com/users/ebezzam/followers", "following_url": "https://api.github.com/users/ebezzam/following{/other_user}", "gists_url": "https://api.github.com/users/ebezzam/gists{/gist_id}", "starred_url": "https://api.github.com/users/ebezzam/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ebezzam/subscriptions", "organizations_url": "https://api.github.com/users/ebezzam/orgs", "repos_url": "https://api.github.com/users/ebezzam/repos", "events_url": "https://api.github.com/users/ebezzam/events{/privacy}", "received_events_url": "https://api.github.com/users/ebezzam/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39431/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39431/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39430
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39430/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39430/comments
https://api.github.com/repos/huggingface/transformers/issues/39430/events
https://github.com/huggingface/transformers/issues/39430
3,232,704,083
I_kwDOCUB6oc7AryZT
39,430
Add DiCoW: Diarization-Conditioned Whisper
{ "login": "Lakoc", "id": 61480290, "node_id": "MDQ6VXNlcjYxNDgwMjkw", "avatar_url": "https://avatars.githubusercontent.com/u/61480290?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Lakoc", "html_url": "https://github.com/Lakoc", "followers_url": "https://api.github.com/users/Lakoc/followers", "following_url": "https://api.github.com/users/Lakoc/following{/other_user}", "gists_url": "https://api.github.com/users/Lakoc/gists{/gist_id}", "starred_url": "https://api.github.com/users/Lakoc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lakoc/subscriptions", "organizations_url": "https://api.github.com/users/Lakoc/orgs", "repos_url": "https://api.github.com/users/Lakoc/repos", "events_url": "https://api.github.com/users/Lakoc/events{/privacy}", "received_events_url": "https://api.github.com/users/Lakoc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 1843244711, "node_id": "MDU6TGFiZWwxODQzMjQ0NzEx", "url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model", "name": "New model", "color": "fbca04", "default": false, "description": "" }, { "id": 6470596964, "node_id": "LA_kwDOCUB6oc8AAAABga15ZA", "url": "https://api.github.com/repos/huggingface/transformers/labels/Audio", "name": "Audio", "color": "760453", "default": false, "description": "" } ]
closed
false
null
[]
null
[]
2025-07-15T15:20:21
2025-09-19T22:07:07
2025-09-19T22:07:07
NONE
null
null
null
null
### Model description We (BUT Speech@FIT) have recently developed DiCoW (Diarization-Conditioned Whisper), a target-speaker ASR model that enhances OpenAI’s Whisper by integrating speaker diarization for multi-talker, speaker-attributed ASR. Unlike previous approaches, DiCoW directly conditions on diarization outputs and achieves state-of-the-art performance on multi-talker benchmarks such as AMI and Libri2Mix. The model recently secured second place in the Challenge and Workshop on Multilingual Conversational Speech Language Model (MLC-SLM) and received a jury award at the CHIME-8 challenge. DiCoW employs Frame-Level Diarization-Dependent Transformations (FDDT), applying frame-wise projections of different embeddings—Silence, Target speaker, Non-target speaker, and Overlap with target—based on diarization outputs. Designed for long-form, multi-speaker transcription tasks, DiCoW excels in scenarios such as meetings, interviews, and spontaneous conversations. It also performs well for single-speaker ASR, achieving Word Error Rates (WER) of 2.1 on LibriSpeech test-clean, 4.3 on test-other, 5.3 on TED-LIUM, and 11.2 on VoxPopuli. The model is based on Whisper and the [v3.2](https://huggingface.co/BUT-FIT/DiCoW_v3_2) version is already integrated with the Hugging Face Transformers AutoClasses. 
### Open source status - [x] The model implementation is available - [x] The model weights are available ### Provide useful links for the implementation Source Repositories * [Training Code: TS-ASR-Whisper](https://github.com/BUTSpeechFIT/TS-ASR-Whisper) * [Inference Code: DiCoW](https://github.com/BUTSpeechFIT/DiCoW) Related Publications * *DiCoW: Diarization-Conditioned Whisper for Target Speaker Automatic Speech Recognition* [Computer Speech & Language, 2025](https://www.sciencedirect.com/science/article/pii/S088523082500066X) * *Target Speaker ASR with Whisper* [IEEE ICASSP 2025](https://doi.org/10.1109/ICASSP49660.2025.10887683) * *BUT/JHU System Description for CHiME-8 NOTSOFAR-1 Challenge* [CHiME 2024 Proceedings](https://doi.org/10.21437/CHiME.2024-4) * *BUT System for the MLC-SLM* [arXiv:2506.13414](https://arxiv.org/abs/2506.13414)
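The Frame-Level Diarization-Dependent Transformations (FDDT) described above can be sketched as a per-frame transformation selected by the diarization class of each frame. The sketch below is purely schematic and is not the actual DiCoW implementation: the function name, the affine form, and the use of scalar features (instead of learned matrix projections on embeddings) are illustrative assumptions.

```python
def fddt_transform(frame_features, frame_classes, weights, biases):
    """Schematic FDDT: transform each frame's feature by a map chosen
    from its diarization class ('silence', 'target', 'non_target', or
    'overlap'). Scalar features and affine maps stand in for the
    learned per-class projections in the real model."""
    return [weights[c] * x + biases[c] for x, c in zip(frame_features, frame_classes)]
```

For example, with identity-like weights for target speech and a zeroing weight for silence, target frames pass through while silence frames are suppressed.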
{ "login": "Lakoc", "id": 61480290, "node_id": "MDQ6VXNlcjYxNDgwMjkw", "avatar_url": "https://avatars.githubusercontent.com/u/61480290?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Lakoc", "html_url": "https://github.com/Lakoc", "followers_url": "https://api.github.com/users/Lakoc/followers", "following_url": "https://api.github.com/users/Lakoc/following{/other_user}", "gists_url": "https://api.github.com/users/Lakoc/gists{/gist_id}", "starred_url": "https://api.github.com/users/Lakoc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lakoc/subscriptions", "organizations_url": "https://api.github.com/users/Lakoc/orgs", "repos_url": "https://api.github.com/users/Lakoc/repos", "events_url": "https://api.github.com/users/Lakoc/events{/privacy}", "received_events_url": "https://api.github.com/users/Lakoc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39430/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39430/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39429
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39429/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39429/comments
https://api.github.com/repos/huggingface/transformers/issues/39429/events
https://github.com/huggingface/transformers/pull/39429
3,232,683,668
PR_kwDOCUB6oc6fAvZT
39,429
Add voxtral
{ "login": "eustlb", "id": 94853470, "node_id": "U_kgDOBadZXg", "avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4", "gravatar_id": "", "url": "https://api.github.com/users/eustlb", "html_url": "https://github.com/eustlb", "followers_url": "https://api.github.com/users/eustlb/followers", "following_url": "https://api.github.com/users/eustlb/following{/other_user}", "gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}", "starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/eustlb/subscriptions", "organizations_url": "https://api.github.com/users/eustlb/orgs", "repos_url": "https://api.github.com/users/eustlb/repos", "events_url": "https://api.github.com/users/eustlb/events{/privacy}", "received_events_url": "https://api.github.com/users/eustlb/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 1843244711, "node_id": "MDU6TGFiZWwxODQzMjQ0NzEx", "url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model", "name": "New model", "color": "fbca04", "default": false, "description": "" } ]
closed
false
null
[]
null
[]
2025-07-15T15:14:21
2025-07-18T08:53:00
2025-07-18T00:02:05
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39429", "html_url": "https://github.com/huggingface/transformers/pull/39429", "diff_url": "https://github.com/huggingface/transformers/pull/39429.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39429.patch", "merged_at": "2025-07-18T00:02:05" }
# What does this PR do? Adds mistral's audio-text-to-text model 🤗 ## EVALS Ran transcriptions mode eval on librispeech test-clean ([this gist](https://gist.github.com/eustlb/fc899dd9ff7fe357065c35cb0120d145)) - 3B: 1.9% wer - 24B: 1.6% wer
{ "login": "eustlb", "id": 94853470, "node_id": "U_kgDOBadZXg", "avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4", "gravatar_id": "", "url": "https://api.github.com/users/eustlb", "html_url": "https://github.com/eustlb", "followers_url": "https://api.github.com/users/eustlb/followers", "following_url": "https://api.github.com/users/eustlb/following{/other_user}", "gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}", "starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/eustlb/subscriptions", "organizations_url": "https://api.github.com/users/eustlb/orgs", "repos_url": "https://api.github.com/users/eustlb/repos", "events_url": "https://api.github.com/users/eustlb/events{/privacy}", "received_events_url": "https://api.github.com/users/eustlb/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39429/reactions", "total_count": 8, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 8, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39429/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39428
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39428/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39428/comments
https://api.github.com/repos/huggingface/transformers/issues/39428/events
https://github.com/huggingface/transformers/pull/39428
3,232,645,339
PR_kwDOCUB6oc6fAm_6
39,428
Improve grammar and clarity in perf_hardware.md
{ "login": "ridima11", "id": 153719980, "node_id": "U_kgDOCSmUrA", "avatar_url": "https://avatars.githubusercontent.com/u/153719980?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ridima11", "html_url": "https://github.com/ridima11", "followers_url": "https://api.github.com/users/ridima11/followers", "following_url": "https://api.github.com/users/ridima11/following{/other_user}", "gists_url": "https://api.github.com/users/ridima11/gists{/gist_id}", "starred_url": "https://api.github.com/users/ridima11/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ridima11/subscriptions", "organizations_url": "https://api.github.com/users/ridima11/orgs", "repos_url": "https://api.github.com/users/ridima11/repos", "events_url": "https://api.github.com/users/ridima11/events{/privacy}", "received_events_url": "https://api.github.com/users/ridima11/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T15:04:15
2025-07-16T19:15:16
2025-07-16T19:15:16
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39428", "html_url": "https://github.com/huggingface/transformers/pull/39428", "diff_url": "https://github.com/huggingface/transformers/pull/39428.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39428.patch", "merged_at": "2025-07-16T19:15:16" }
This PR improves the readability and corrects grammatical issues in the perf_hardware.md documentation file: - Fixed subject-verb agreement and pluralization issues. - Rephrased awkward sentences for better clarity. - Improved formatting of temperature ranges and technical terminology. - Simplified the explanation about pigtail cables. These changes enhance the documentation’s professional tone and make it easier for readers to follow.
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39428/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39428/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39427
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39427/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39427/comments
https://api.github.com/repos/huggingface/transformers/issues/39427/events
https://github.com/huggingface/transformers/issues/39427
3,232,610,783
I_kwDOCUB6oc7Arbnf
39,427
Gemma 3 Compilation Issues During Generation
{ "login": "mitchelldehaven", "id": 47208251, "node_id": "MDQ6VXNlcjQ3MjA4MjUx", "avatar_url": "https://avatars.githubusercontent.com/u/47208251?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mitchelldehaven", "html_url": "https://github.com/mitchelldehaven", "followers_url": "https://api.github.com/users/mitchelldehaven/followers", "following_url": "https://api.github.com/users/mitchelldehaven/following{/other_user}", "gists_url": "https://api.github.com/users/mitchelldehaven/gists{/gist_id}", "starred_url": "https://api.github.com/users/mitchelldehaven/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mitchelldehaven/subscriptions", "organizations_url": "https://api.github.com/users/mitchelldehaven/orgs", "repos_url": "https://api.github.com/users/mitchelldehaven/repos", "events_url": "https://api.github.com/users/mitchelldehaven/events{/privacy}", "received_events_url": "https://api.github.com/users/mitchelldehaven/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-07-15T14:55:06
2025-09-23T08:03:26
2025-09-23T08:03:26
NONE
null
null
null
null
### System Info When using `google/gemma-3-1b-it`, I run into frequent recompilation issues, eventually resulting in the following error message: ``` torch._dynamo.exc.FailOnRecompileLimitHit: recompile_limit reached with one_graph=True. Excessive recompilations can degrade performance due to the compilation overhead of each recompilation. To monitor recompilations, enable TORCH_LOGS=recompiles. If recompilations are expected, consider increasing torch._dynamo.config.cache_size_limit to an appropriate value ``` There is a simple workaround of sorting the inputs by length (longest first, trying to minimize graph recompilation due to increasing input sizes). However, that seems like a hacky workaround. Is this expected behavior, or can the compilation be disabled? ### Who can help? _No response_ ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction Steps to reproduce: ``` from transformers import Gemma3ForConditionalGeneration, AutoTokenizer, AutoModelForCausalLM, GenerationConfig import datasets import json import torch from tqdm import tqdm def tools_format_python(tool_dicts): functions_formatted = [] tab_s = " " for tool_dict in tool_dicts: function_name = tool_dict["name"] function_args = [f"{k}: {v['type']}" for k, v in tool_dict["parameters"].items()] function_formatted = ( f"def {function_name}({', '.join(function_args)})\n" + f"{tab_s}{tool_dict['description']}\n\n" + f"{tab_s}Args:\n" + "\n".join(f"{tab_s*2}{k}: {v['description']}" for k, v in tool_dict["parameters"].items()) ) functions_formatted.append(function_formatted) tools_prompt = ( "```python\n" + "\n\n".join(functions_formatted) + "\n```\n\n" ) return tools_prompt llm_model_name = "google/gemma-3-1b-it" device = "cuda" if torch.cuda.is_available() else "cpu" model = 
AutoModelForCausalLM.from_pretrained(llm_model_name).to(device) tokenizer = AutoTokenizer.from_pretrained(llm_model_name) dataset = datasets.load_dataset("Salesforce/xlam-function-calling-60k")["train"] system_message = "You are a helpful function calling AI assistant. Below are the Python functions accessible to you:\n" for sample in tqdm(dataset): sample_tools = json.loads(sample["tools"]) tools_prompt = tools_format_python(sample_tools) gemma_prompt_format = ( "<bos><start_of_turn>user\n" + system_message + tools_prompt + "User: " + sample["query"] + "<end_of_turn>\n<start_of_turn>model\n```tool_call\n" ) tokenized_prompt = tokenizer(gemma_prompt_format, return_tensors="pt", add_special_tokens=False)["input_ids"].to(device) with torch.no_grad(), torch.amp.autocast(device, dtype=torch.bfloat16): outputs = model.generate( tokenized_prompt, max_new_tokens=256, use_cache=True, ) ``` ### Expected behavior That dynamic input shapes should not result in the program eventually failing.
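A common alternative to the sort-by-length workaround mentioned in the report is to pad tokenized inputs up to a small fixed set of "bucket" lengths, so the compiler only ever sees a handful of distinct shapes. This is a sketch of the idea, not an officially documented fix; the bucket sizes below are illustrative assumptions.

```python
def pad_to_bucket(length, buckets=(128, 256, 512, 1024)):
    """Return the smallest bucket >= length. Padding every prompt to a
    bucketed length caps the number of distinct input shapes at
    len(buckets), so torch.compile recompiles at most that many times
    instead of once per new sequence length."""
    for b in buckets:
        if length <= b:
            return b
    return buckets[-1]  # clamp (and truncate) anything longer than the largest bucket
```

Tokenizing with `tokenizer(prompt, padding="max_length", max_length=pad_to_bucket(raw_len), truncation=True, return_tensors="pt")` would then produce at most four distinct input shapes across the whole dataset.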
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers", "following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}", "gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}", "starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions", "organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs", "repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos", "events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}", "received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events", "type": "Bot", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39427/reactions", "total_count": 6, "+1": 6, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39427/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39426
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39426/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39426/comments
https://api.github.com/repos/huggingface/transformers/issues/39426/events
https://github.com/huggingface/transformers/issues/39426
3,232,288,269
I_kwDOCUB6oc7AqM4N
39,426
object detection: matching outputs.last_hidden_state with results
{ "login": "fenaux", "id": 44709416, "node_id": "MDQ6VXNlcjQ0NzA5NDE2", "avatar_url": "https://avatars.githubusercontent.com/u/44709416?v=4", "gravatar_id": "", "url": "https://api.github.com/users/fenaux", "html_url": "https://github.com/fenaux", "followers_url": "https://api.github.com/users/fenaux/followers", "following_url": "https://api.github.com/users/fenaux/following{/other_user}", "gists_url": "https://api.github.com/users/fenaux/gists{/gist_id}", "starred_url": "https://api.github.com/users/fenaux/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/fenaux/subscriptions", "organizations_url": "https://api.github.com/users/fenaux/orgs", "repos_url": "https://api.github.com/users/fenaux/repos", "events_url": "https://api.github.com/users/fenaux/events{/privacy}", "received_events_url": "https://api.github.com/users/fenaux/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
[]
2025-07-15T13:34:08
2025-07-22T11:08:23
null
NONE
null
null
null
null
### Feature request It seems to me that this would be possible with a small modification to the function post_process_object_detection: ``` for score, label, box, index in zip(scores, labels, boxes, indexes): results.append( { "scores": score[score > threshold], "labels": label[score > threshold], "boxes": box[score > threshold], "indexes": index[score > threshold], } ) ``` and then `outputs.last_hidden_state[0][results[0]['indexes']]` gives me the desired feature vectors. Am I right, or is there a better way to obtain this matching? Thanks for your help ### Motivation I would like to use outputs.last_hidden_state as features for auxiliary tasks, so I need to know the label and the bounding box associated with a given vector of outputs.last_hidden_state. ### Your contribution I am not a top coder and do not know how to submit a PR
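The index-tracking idea in this request boils down to remembering which query positions survive the score threshold, so each kept detection can be matched back to its row of `last_hidden_state`. A minimal pure-Python sketch of that logic (the function name is illustrative; the real post-processing operates on tensors):

```python
def filter_detections(scores, labels, boxes, threshold):
    """Keep detections whose score exceeds the threshold, along with the
    original query indices. The indices let a caller look up the hidden
    state (feature vector) corresponding to each surviving detection."""
    kept = [i for i, s in enumerate(scores) if s > threshold]
    return {
        "scores": [scores[i] for i in kept],
        "labels": [labels[i] for i in kept],
        "boxes": [boxes[i] for i in kept],
        "indexes": kept,
    }
```

With tensors, the same `indexes` list (or mask) would index into `outputs.last_hidden_state[batch_idx]` to recover per-detection features.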
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39426/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39426/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/39425
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39425/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39425/comments
https://api.github.com/repos/huggingface/transformers/issues/39425/events
https://github.com/huggingface/transformers/pull/39425
3,232,112,957
PR_kwDOCUB6oc6e-wSy
39,425
Fix bugs from pipeline preprocessor overhaul
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T12:43:28
2025-07-15T13:29:01
2025-07-15T13:29:00
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39425", "html_url": "https://github.com/huggingface/transformers/pull/39425", "diff_url": "https://github.com/huggingface/transformers/pull/39425.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39425.patch", "merged_at": "2025-07-15T13:29:00" }
I overhauled pipeline processor loading in #38947, which caused some test failures. They should mostly be fixed by the user PR at #39411, but the remaining issues are resolved here.
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39425/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39425/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39424
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39424/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39424/comments
https://api.github.com/repos/huggingface/transformers/issues/39424/events
https://github.com/huggingface/transformers/issues/39424
3,232,079,234
I_kwDOCUB6oc7ApZ2C
39,424
[YosoConfig] Missing `architectures` field
{ "login": "SankethHonavar", "id": 125574395, "node_id": "U_kgDOB3wc-w", "avatar_url": "https://avatars.githubusercontent.com/u/125574395?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SankethHonavar", "html_url": "https://github.com/SankethHonavar", "followers_url": "https://api.github.com/users/SankethHonavar/followers", "following_url": "https://api.github.com/users/SankethHonavar/following{/other_user}", "gists_url": "https://api.github.com/users/SankethHonavar/gists{/gist_id}", "starred_url": "https://api.github.com/users/SankethHonavar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SankethHonavar/subscriptions", "organizations_url": "https://api.github.com/users/SankethHonavar/orgs", "repos_url": "https://api.github.com/users/SankethHonavar/repos", "events_url": "https://api.github.com/users/SankethHonavar/events{/privacy}", "received_events_url": "https://api.github.com/users/SankethHonavar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T12:32:04
2025-07-16T12:04:04
2025-07-16T12:04:04
NONE
null
null
null
null
While contributing to the project and running the `add_missing_architectures.py` utility, I found that `YosoConfig` was missing the `architectures` field. This issue tracks the fix to add the following field to: 📄 `src/transformers/models/yoso/configuration_yoso.py` ```python architectures = ["YosoForMaskedLM"] ```
{ "login": "SankethHonavar", "id": 125574395, "node_id": "U_kgDOB3wc-w", "avatar_url": "https://avatars.githubusercontent.com/u/125574395?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SankethHonavar", "html_url": "https://github.com/SankethHonavar", "followers_url": "https://api.github.com/users/SankethHonavar/followers", "following_url": "https://api.github.com/users/SankethHonavar/following{/other_user}", "gists_url": "https://api.github.com/users/SankethHonavar/gists{/gist_id}", "starred_url": "https://api.github.com/users/SankethHonavar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SankethHonavar/subscriptions", "organizations_url": "https://api.github.com/users/SankethHonavar/orgs", "repos_url": "https://api.github.com/users/SankethHonavar/repos", "events_url": "https://api.github.com/users/SankethHonavar/events{/privacy}", "received_events_url": "https://api.github.com/users/SankethHonavar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39424/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39424/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39423
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39423/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39423/comments
https://api.github.com/repos/huggingface/transformers/issues/39423/events
https://github.com/huggingface/transformers/pull/39423
3,231,967,793
PR_kwDOCUB6oc6e-Qzx
39,423
🚨🚨 Fix and simplify attention implementation dispatch and subconfigs handling
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T11:53:24
2025-07-18T11:41:56
2025-07-18T11:41:54
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39423", "html_url": "https://github.com/huggingface/transformers/pull/39423", "diff_url": "https://github.com/huggingface/transformers/pull/39423.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39423.patch", "merged_at": "2025-07-18T11:41:54" }
# What does this PR do? A few issues: - the checks and attention switch should be performed recursively for all submodels, not only the "top-most" model - flash can dispatch even with cpu/disk offloading, as the module is put back on device prior to computation - some models with the old class attention interface should not be allowed to switch Because some checks need to be performed when the model is fully instantiated (i.e. params on GPU for FA2/3, presence of BetterTransformer or not), we now perform 2 checks independently: - first check very early, at __init__ time -> crash immediately if the attn is not supported/not installed, instead of waiting 20s to instantiate the model and then crashing because a static flag is False - full check when switching attention dynamically, as the model is then already initialized correctly Also, when switching dynamically, let's not crash when something is not available -> a warning seems much more appropriate, as otherwise we crash everything for not much. Also simplified the internal workings of the `attn_implementation` for the Configs -> removed some internal flags as they are basically useless and bloat the code a lot 🚨🚨🥵🥵 ALSO, DISCOVERED A HUGE BUG: due to deepcopy of `config` in `_from_config`, all our getters/setters when doing things such as `model.config.get_text_config()` would be wrong, because the internal model has copied the config, which is thus not the same object as the external config's subconfig... This results in wrong dispatch/usage of attn_implementation everywhere we have submodels. This PR removes the deepcopy, thus fixing it -> so using `_from_config`/`from_config` is similar to `model_class(config)`, in the sense that it does not copy the config (which is better anyway, as it's more coherent). 
`from_pretrained` still deepcopies the config if passed To better understand the issue at hand, consider the following: ```python from transformers import Qwen2_5_VLForConditionalGeneration, AutoConfig model = Qwen2_5_VLForConditionalGeneration.from_pretrained("Qwen/Qwen2.5-VL-3B-Instruct") print(model.model.language_model.config._attn_implementation) >>> sdpa print(model.config.get_text_config()._attn_implementation) >>> eager # should be sdpa here!!!! ``` That is, we have an inconsistency between the 2 objects (because due to the deepcopy, they are no longer the same). This is MOST NOTABLY an issue when we create the mask, where we use `get_text_config` to know which attn to use, but then we will use the one based on the model's internal config. By pure luck, sdpa can actually use an eager mask as torch supports both format, so it's not an issue with our defaults. It would be with other `attn_implementation` though, if trying to set it explicitly!!!
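The deepcopy inconsistency described above can be reproduced with plain Python objects (no transformers needed; `Config` and `SubModel` below are illustrative stand-ins, not the real classes): once the inner model deep-copies the config, updates made through the outer config no longer reach it.

```python
# Toy reproduction of the deepcopy bug: a deep-copied sub-config diverges
# from the outer config, while a shared sub-config stays consistent.
# Config / SubModel are illustrative stand-ins, not transformers classes.
import copy

class Config:
    def __init__(self):
        self._attn_implementation = "eager"

class SubModel:
    def __init__(self, config, deep_copy: bool):
        # Before the fix: _from_config deep-copied the config, so the
        # submodel's config was a *different* object from the outer one.
        self.config = copy.deepcopy(config) if deep_copy else config

outer = Config()
buggy = SubModel(outer, deep_copy=True)   # old behavior
fixed = SubModel(outer, deep_copy=False)  # behavior after this PR

outer._attn_implementation = "sdpa"  # e.g. set when dispatching attention

print(buggy.config._attn_implementation)  # "eager" - stale copy, inconsistent
print(fixed.config._attn_implementation)  # "sdpa"  - same shared object
```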
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39423/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/huggingface/transformers/issues/39423/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39422
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39422/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39422/comments
https://api.github.com/repos/huggingface/transformers/issues/39422/events
https://github.com/huggingface/transformers/pull/39422
3,231,927,121
PR_kwDOCUB6oc6e-H-4
39,422
refactor: remove `set_tracer_provider` and `set_meter_provider` calls
{ "login": "McPatate", "id": 9112841, "node_id": "MDQ6VXNlcjkxMTI4NDE=", "avatar_url": "https://avatars.githubusercontent.com/u/9112841?v=4", "gravatar_id": "", "url": "https://api.github.com/users/McPatate", "html_url": "https://github.com/McPatate", "followers_url": "https://api.github.com/users/McPatate/followers", "following_url": "https://api.github.com/users/McPatate/following{/other_user}", "gists_url": "https://api.github.com/users/McPatate/gists{/gist_id}", "starred_url": "https://api.github.com/users/McPatate/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/McPatate/subscriptions", "organizations_url": "https://api.github.com/users/McPatate/orgs", "repos_url": "https://api.github.com/users/McPatate/repos", "events_url": "https://api.github.com/users/McPatate/events{/privacy}", "received_events_url": "https://api.github.com/users/McPatate/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 8103865784, "node_id": "LA_kwDOCUB6oc8AAAAB4wctuA", "url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch", "name": "for patch", "color": "D93F0B", "default": false, "description": "Tag issues / labels that should be included in the next patch" } ]
closed
false
null
[]
null
[]
2025-07-15T11:38:26
2025-07-21T09:42:40
2025-07-15T12:22:12
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39422", "html_url": "https://github.com/huggingface/transformers/pull/39422", "diff_url": "https://github.com/huggingface/transformers/pull/39422.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39422.patch", "merged_at": "2025-07-15T12:22:12" }
# What does this PR do? As we are a library, we shouldn't set an OT tracer or a meter, given you can only set one per application. Removing calls to `set_{tracer|meter}_provider`. Closes #39115, #39143
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39422/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39422/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39421
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39421/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39421/comments
https://api.github.com/repos/huggingface/transformers/issues/39421/events
https://github.com/huggingface/transformers/issues/39421
3,231,922,248
I_kwDOCUB6oc7AozhI
39,421
Speculative Decoding (do_sample=False) gets different outputs
{ "login": "nighty8", "id": 78900223, "node_id": "MDQ6VXNlcjc4OTAwMjIz", "avatar_url": "https://avatars.githubusercontent.com/u/78900223?v=4", "gravatar_id": "", "url": "https://api.github.com/users/nighty8", "html_url": "https://github.com/nighty8", "followers_url": "https://api.github.com/users/nighty8/followers", "following_url": "https://api.github.com/users/nighty8/following{/other_user}", "gists_url": "https://api.github.com/users/nighty8/gists{/gist_id}", "starred_url": "https://api.github.com/users/nighty8/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nighty8/subscriptions", "organizations_url": "https://api.github.com/users/nighty8/orgs", "repos_url": "https://api.github.com/users/nighty8/repos", "events_url": "https://api.github.com/users/nighty8/events{/privacy}", "received_events_url": "https://api.github.com/users/nighty8/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T11:36:31
2025-07-19T03:11:04
2025-07-18T13:27:49
NONE
null
null
null
null
> @transcend-0 hey! > > > > The issue was solved in [#30068](https://github.com/huggingface/transformers/pull/30068). You can install transformers from `main` with the following line for the correct generation with assisted decoding: > > > > `!pip install --upgrade git+https://github.com/huggingface/transformers.git` _Originally posted by @zucchini-nlp in [#30608](https://github.com/huggingface/transformers/issues/30608#issuecomment-2089846816)_ ### **System Info** Python 3.10.11 transformers 4.49.0 torch 2.6.0+cu124 ### **Same Reproduction** Target_Model = Qwen2.5-32B-Instruct Draft_Model = Qwen2.5-7B-Instruct `question = "Dienes are organic compounds with two adjacent double bonds in their structure, and they exhibit unique reactivity due to their conjugated pi-electron system. They play a significant role in organic chemistry and are involved in various chemical reactions and natural processes.\nAmong the given options which one is the possible reactant (A) for the given reaction also mention the correct sequence of the dienes according to their reactivity ( most reactive to least reactive) B.\nCyclohexene + A ---> 8,8-diiodobicyclo[4.2.0]octan-7-one\n(B) 1. 2,3-dimethylbuta-1,3-diene, 2. (2E,4E)-hexa-2,4-diene, 3. (2E,4E)-hexa-2,4-diene, 4. (2Z,4Z)-hexa-2,4-diene\n\n\nA. A = 2,2-diiodoethen-1-one, B = 3, 1, 2, 4\nB. A = 2,2-diiodoethen-1-one, B = 4, 2, 1, 3\nC. A = 4,4-diiodocyclobut-2-en-1-one, B = 3, 1, 2, 4\nD. A = 4,4-diiodocyclobut-2-en-1-one, B = 4, 2, 1, 3\n\n"` `prompt = '<|im_start|>user' + question + 'Please reason step-by-step and put your choice letter without any other text with \\boxed{} in the end.'` `['userDienes are organic compounds with two adjacent double bonds in their structure, and they exhibit unique reactivity due to their conjugated pi-electron system. 
They play a significant role in organic chemistry and are involved in various chemical reactions and natural processes.\nAmong the given options which one is the possible reactant (A) for the given reaction also mention the correct sequence of the dienes according to their reactivity ( most reactive to least reactive) B.\nCyclohexene + A ---> 8,8-diiodobicyclo[4.2.0]octan-7-one\n(B) 1. 2,3-dimethylbuta-1,3-diene, 2. (2E,4E)-hexa-2,4-diene, 3. (2E,4E)-hexa-2,4-diene, 4. (2Z,4Z)-hexa-2,4-diene\n\n\nA. A = 2,2-diiodoethen-1-one, B = 3, 1, 2, 4\nB. A = 2,2-diiodoethen-1-one, B = 4, 2, 1, 3\nC. A = 4,4-diiodocyclobut-2-en-1-one, B = 3, 1, 2, 4\nD. A = 4,4-diiodocyclobut-2-en-1-one, B = 4, 2, 1, 3\n\nPlease reason step-by-step and put your choice letter without any other text with \\boxed{} in the end. To solve this problem, we need to identify the reactant \\( A \\) that can react with cyclohexene to form 8,8-diiodobicyclo[4.2.0]octan-7-one. We also need to determine the correct sequence of the dienes according to their reactivity from most reactive to least reactive.\n\n### Step-by-Step Reasoning:\n\n1. **Identify the Product:**\n - The product is 8,8-diiodobicyclo[4.2.0]octan-7-one. This suggests that the reactant \\( A \\) must be a compound that can undergo a Diels-Alder reaction with cyclohexene to form the bicyclic structure and then iodination at the appropriate positions.\n\n2. **Reactant Identification:**\n - The reactant \\( A \\) should be a dienophile (a compound with a double bond that can participate in a Diels-Alder reaction). Among the given options, the possible candidates are:\n - 2,2-diiodoethen-1-one\n - 4,4-diiodocyclobut-2-en-1-one\n\n3. **Diels-Alder Reaction:**\n - Cyclohexene is a diene, and it will react with a dienophile to form a bicyclic structure. 
The dienophile should have a double bond that can react with the diene to form the desired product.\n - 2,2-diiodoethen-1-one has a double bond and iodine substituents, making it a suitable dienophile.\n - 4,4-diiodocyclobut-2-en-1-one also has a double bond but is more complex and less likely to form the desired product directly.\n\n4. **Sequence of Dienes According to Reactivity:**\n - The reactivity of dienes depends on the stability of the conjugated pi-electron system.\n - Generally, the order of reactivity from most reactive to least reactive is:\n 1. (2E,4E)-hexa-2,4-diene (most stable and reactive)\n 2. (2E,4E)-hexa-2,4-diene (same as above)\n 3. 2,3-dimethylbuta-1,3-diene (less stable due to steric hindrance)\n 4. (2Z,4Z)-hexa-2,4-diene (least stable due to cis configuration)\n\n5. **Matching Options:**\n - Option A: \\( A = 2,2 \\)-diiodoethen-1-one, B = 3, 1, 2, 4\n - Option B: \\( A = 2,2 \\)-diiodoethen-1-one, B = 4, 2, 1, 3\n - Option C: \\( A = 4,4 \\)-diiodocyclobut-2-en-1-one, B = 3, 1, 2, 4\n - Option D: \\( A = 4,4 \\)-diiodocyclobut-2-en-1-one, B = 4, 2, 1, 3\n\nGiven the correct sequence of dienes and the suitable dienophile, the correct option is:\n\n\\boxed{A}']` - targetDecoding - Running time: 41.82 s` `['userDienes are organic compounds with two adjacent double bonds in their structure, and they exhibit unique reactivity due to their conjugated pi-electron system. They play a significant role in organic chemistry and are involved in various chemical reactions and natural processes.\nAmong the given options which one is the possible reactant (A) for the given reaction also mention the correct sequence of the dienes according to their reactivity ( most reactive to least reactive) B.\nCyclohexene + A ---> 8,8-diiodobicyclo[4.2.0]octan-7-one\n(B) 1. 2,3-dimethylbuta-1,3-diene, 2. (2E,4E)-hexa-2,4-diene, 3. (2E,4E)-hexa-2,4-diene, 4. (2Z,4Z)-hexa-2,4-diene\n\n\nA. A = 2,2-diiodoethen-1-one, B = 3, 1, 2, 4\nB. 
A = 2,2-diiodoethen-1-one, B = 4, 2, 1, 3\nC. A = 4,4-diiodocyclobut-2-en-1-one, B = 3, 1, 2, 4\nD. A = 4,4-diiodocyclobut-2-en-1-one, B = 4, 2, 1, 3\n\nPlease reason step-by-step and put your choice letter without any other text with \\boxed{} in the end. To solve this problem, we need to identify the reactant \\( A \\) that would lead to the formation of 8,8-diiodobicyclo[4.2.0]octan-7-one when reacted with cyclohexene. We also need to determine the correct sequence of the dienes according to their reactivity from most reactive to least reactive.\n\n### Step-by-Step Reasoning:\n\n1. **Identify the Product:**\n - The product is 8,8-diiodobicyclo[4.2.0]octan-7-one. This suggests that the reactant \\( A \\) must be a compound that can form a bicyclic structure with cyclohexene and incorporate iodine atoms.\n\n2. **Reactant Identification:**\n - Given the options, we need to choose between 2,2-diiodoethen-1-one and 4,4-diiodocyclobut-2-en-1-one.\n - 2,2-diiodoethen-1-one is a simpler molecule compared to 4,4-diiodocyclobut-2-en-1-one.\n - 4,4-diiodocyclobut-2-en-1-one has a cyclic structure which could potentially form a bicyclic structure with cyclohexene more readily.\n\n3. **Sequence of Dienes According to Reactivity:**\n - The dienes provided are:\n 1. 2,3-dimethylbuta-1,3-diene\n 2. (2E,4E)-hexa-2,4-diene\n 3. (2E,4E)-hexa-2,4-diene (repeated)\n 4. (2Z,4Z)-hexa-2,4-diene\n - Generally, trans (E) isomers are more reactive than cis (Z) isomers due to better overlap of the p-orbitals.\n - Therefore, the sequence from most reactive to least reactive should be:\n 1. (2E,4E)-hexa-2,4-diene (twice)\n 2. 2,3-dimethylbuta-1,3-diene\n 3. (2Z,4Z)-hexa-2,4-diene\n\nGiven these points, the correct reactant \\( A \\) is 4,4-diiodocyclobut-2-en-1-one, and the sequence of dienes according to reactivity is 3, 1, 2, 4.\n\nThus, the correct answer is:\n\n\\boxed{C}']` - speculativeDecoding - Running time: 29.14 s
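For context on why matching outputs are expected here: with `do_sample=False`, assisted decoding verifies the draft model's tokens against the target model's greedy choices and should therefore reproduce plain greedy decoding (up to numerical differences). A toy sketch of that verification loop — hand-made inputs, not the actual transformers implementation:

```python
# Toy sketch of greedy speculative-decoding verification (not the actual
# transformers implementation). The draft proposes tokens; the target keeps
# them only while they match its own argmax, then emits its own token.

def verify_greedy(draft_tokens, target_argmax):
    """target_argmax[i] is the target model's greedy choice at position i."""
    accepted = []
    for d, t in zip(draft_tokens, target_argmax):
        if d != t:               # first mismatch: reject the rest of the draft
            accepted.append(t)   # ...and take the target's own token instead
            return accepted
        accepted.append(d)
    return accepted

# Draft agrees on the first two tokens, then diverges.
print(verify_greedy([5, 9, 3, 3], [5, 9, 7, 1]))  # [5, 9, 7]
```

Under exact arithmetic this yields the same sequence as pure greedy decoding of the target model; divergences like the one reported above typically come from numerical differences, not from the acceptance rule itself.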
{ "login": "nighty8", "id": 78900223, "node_id": "MDQ6VXNlcjc4OTAwMjIz", "avatar_url": "https://avatars.githubusercontent.com/u/78900223?v=4", "gravatar_id": "", "url": "https://api.github.com/users/nighty8", "html_url": "https://github.com/nighty8", "followers_url": "https://api.github.com/users/nighty8/followers", "following_url": "https://api.github.com/users/nighty8/following{/other_user}", "gists_url": "https://api.github.com/users/nighty8/gists{/gist_id}", "starred_url": "https://api.github.com/users/nighty8/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nighty8/subscriptions", "organizations_url": "https://api.github.com/users/nighty8/orgs", "repos_url": "https://api.github.com/users/nighty8/repos", "events_url": "https://api.github.com/users/nighty8/events{/privacy}", "received_events_url": "https://api.github.com/users/nighty8/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39421/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39421/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39420
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39420/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39420/comments
https://api.github.com/repos/huggingface/transformers/issues/39420/events
https://github.com/huggingface/transformers/pull/39420
3,231,759,005
PR_kwDOCUB6oc6e9ji3
39,420
[autodocstring] add video and audio inputs
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T10:41:47
2025-07-16T07:41:51
2025-07-16T07:41:51
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39420", "html_url": "https://github.com/huggingface/transformers/pull/39420", "diff_url": "https://github.com/huggingface/transformers/pull/39420.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39420.patch", "merged_at": "2025-07-16T07:41:51" }
# What does this PR do? As per the title, I think these inputs are common enough to belong in the auto docstring
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39420/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39420/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39419
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39419/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39419/comments
https://api.github.com/repos/huggingface/transformers/issues/39419/events
https://github.com/huggingface/transformers/pull/39419
3,231,715,263
PR_kwDOCUB6oc6e9aEm
39,419
add test scanner
{ "login": "molbap", "id": 39954772, "node_id": "MDQ6VXNlcjM5OTU0Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/molbap", "html_url": "https://github.com/molbap", "followers_url": "https://api.github.com/users/molbap/followers", "following_url": "https://api.github.com/users/molbap/following{/other_user}", "gists_url": "https://api.github.com/users/molbap/gists{/gist_id}", "starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/molbap/subscriptions", "organizations_url": "https://api.github.com/users/molbap/orgs", "repos_url": "https://api.github.com/users/molbap/repos", "events_url": "https://api.github.com/users/molbap/events{/privacy}", "received_events_url": "https://api.github.com/users/molbap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 1834088753, "node_id": "MDU6TGFiZWwxODM0MDg4NzUz", "url": "https://api.github.com/repos/huggingface/transformers/labels/Tests", "name": "Tests", "color": "a6fcca", "default": false, "description": "Related to tests" } ]
closed
false
null
[]
null
[]
2025-07-15T10:26:03
2025-07-16T10:45:48
2025-07-16T10:45:46
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39419", "html_url": "https://github.com/huggingface/transformers/pull/39419", "diff_url": "https://github.com/huggingface/transformers/pull/39419.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39419.patch", "merged_at": "2025-07-16T10:45:46" }
# What does this PR do? Another day, another tool: when adding a model, tests fail as they should, and sometimes `test_modeling_common` feels irreconcilable with the peculiarities of our brand new model. But how can we be sure we're not breaking everything by adding a seemingly innocent `skip`? Well I'm not 100% sure, but this: - scans all test_modeling_common methods - looks for times where a method is skipped - returns a summary json you can load as a DataFrame/inspect For instance `test_inputs_embeds` is skipped in a whopping 39% of cases! <img width="1880" height="772" alt="image" src="https://github.com/user-attachments/assets/8a0977a2-dc48-4649-9f2c-40b0f029984d" /> So if you're skipping a test that has never been skipped before, maybe ponder a moment. I'm pushing this because I always end up eyeballing in my IDE the number of hits a given test has repo-wide; this should be slightly more robust. Attaching an example scan result as well. cc @ArthurZucker @ydshieh ! [all_tests_scan_result.json](https://github.com/user-attachments/files/21230848/all_tests_scan_result.json)
{ "login": "molbap", "id": 39954772, "node_id": "MDQ6VXNlcjM5OTU0Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/molbap", "html_url": "https://github.com/molbap", "followers_url": "https://api.github.com/users/molbap/followers", "following_url": "https://api.github.com/users/molbap/following{/other_user}", "gists_url": "https://api.github.com/users/molbap/gists{/gist_id}", "starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/molbap/subscriptions", "organizations_url": "https://api.github.com/users/molbap/orgs", "repos_url": "https://api.github.com/users/molbap/repos", "events_url": "https://api.github.com/users/molbap/events{/privacy}", "received_events_url": "https://api.github.com/users/molbap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39419/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39419/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39418
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39418/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39418/comments
https://api.github.com/repos/huggingface/transformers/issues/39418/events
https://github.com/huggingface/transformers/pull/39418
3,231,699,566
PR_kwDOCUB6oc6e9WqF
39,418
Fix: Add missing 'architectures' field to YosoConfig
{ "login": "SankethHonavar", "id": 125574395, "node_id": "U_kgDOB3wc-w", "avatar_url": "https://avatars.githubusercontent.com/u/125574395?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SankethHonavar", "html_url": "https://github.com/SankethHonavar", "followers_url": "https://api.github.com/users/SankethHonavar/followers", "following_url": "https://api.github.com/users/SankethHonavar/following{/other_user}", "gists_url": "https://api.github.com/users/SankethHonavar/gists{/gist_id}", "starred_url": "https://api.github.com/users/SankethHonavar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SankethHonavar/subscriptions", "organizations_url": "https://api.github.com/users/SankethHonavar/orgs", "repos_url": "https://api.github.com/users/SankethHonavar/repos", "events_url": "https://api.github.com/users/SankethHonavar/events{/privacy}", "received_events_url": "https://api.github.com/users/SankethHonavar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T10:20:24
2025-08-11T16:47:55
2025-07-16T11:38:56
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39418", "html_url": "https://github.com/huggingface/transformers/pull/39418", "diff_url": "https://github.com/huggingface/transformers/pull/39418.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39418.patch", "merged_at": null }
This PR adds the missing `architectures` field to the `YosoConfig` class, as part of the issue [#39208](https://github.com/huggingface/transformers/issues/39208). **Added:** ```python architectures = ["YosoForMaskedLM"] ``` This helps ensure proper model auto-mapping and improves consistency across configuration classes. Fixes #39424 Tagging @Narsil as the maintainer of this part of the repo. Please let me know if any changes are needed!
{ "login": "SankethHonavar", "id": 125574395, "node_id": "U_kgDOB3wc-w", "avatar_url": "https://avatars.githubusercontent.com/u/125574395?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SankethHonavar", "html_url": "https://github.com/SankethHonavar", "followers_url": "https://api.github.com/users/SankethHonavar/followers", "following_url": "https://api.github.com/users/SankethHonavar/following{/other_user}", "gists_url": "https://api.github.com/users/SankethHonavar/gists{/gist_id}", "starred_url": "https://api.github.com/users/SankethHonavar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SankethHonavar/subscriptions", "organizations_url": "https://api.github.com/users/SankethHonavar/orgs", "repos_url": "https://api.github.com/users/SankethHonavar/repos", "events_url": "https://api.github.com/users/SankethHonavar/events{/privacy}", "received_events_url": "https://api.github.com/users/SankethHonavar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39418/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39418/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39416
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39416/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39416/comments
https://api.github.com/repos/huggingface/transformers/issues/39416/events
https://github.com/huggingface/transformers/pull/39416
3,231,548,759
PR_kwDOCUB6oc6e81lq
39,416
fix `kyutai` tests
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T09:36:24
2025-07-25T16:42:05
2025-07-25T16:42:04
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39416", "html_url": "https://github.com/huggingface/transformers/pull/39416", "diff_url": "https://github.com/huggingface/transformers/pull/39416.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39416.patch", "merged_at": "2025-07-25T16:42:04" }
# What does this PR do? > HF_HOME=/mnt/cache HF_HUB_READ_TOKEN=hf_XXX RUN_SLOW=1 python3 -m pytest -v --flake-finder --flake-runs=10 tests/models/kyutai_speech_to_text/test_modeling_kyutai_speech_to_text.py::KyutaiSpeechToTextForConditionalGenerationIntegrationTests::test_generation_batched It might stem from a numerical stability issue; only 3 elements differ. ``` (Pdb) EXPECTED_TOKENS[2, 157:164] tensor([3232, 3, 0, 269, 3, 3, 0]) (Pdb) out.cpu()[2, 157:164] tensor([3232, 3, 3, 0, 269, 3, 0]) (Pdb) ```
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39416/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39416/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39415
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39415/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39415/comments
https://api.github.com/repos/huggingface/transformers/issues/39415/events
https://github.com/huggingface/transformers/pull/39415
3,231,153,479
PR_kwDOCUB6oc6e7fhS
39,415
[chat template] add a testcase for kwargs
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T07:33:07
2025-07-16T09:31:35
2025-07-16T09:31:35
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39415", "html_url": "https://github.com/huggingface/transformers/pull/39415", "diff_url": "https://github.com/huggingface/transformers/pull/39415.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39415.patch", "merged_at": "2025-07-16T09:31:35" }
# What does this PR do? Just adds a test case; I didn't know this was supported, and I don't want it to break in the future.
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39415/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39415/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39414
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39414/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39414/comments
https://api.github.com/repos/huggingface/transformers/issues/39414/events
https://github.com/huggingface/transformers/pull/39414
3,231,031,605
PR_kwDOCUB6oc6e7E20
39,414
Fix typo in `/v1/models` output payload
{ "login": "alvarobartt", "id": 36760800, "node_id": "MDQ6VXNlcjM2NzYwODAw", "avatar_url": "https://avatars.githubusercontent.com/u/36760800?v=4", "gravatar_id": "", "url": "https://api.github.com/users/alvarobartt", "html_url": "https://github.com/alvarobartt", "followers_url": "https://api.github.com/users/alvarobartt/followers", "following_url": "https://api.github.com/users/alvarobartt/following{/other_user}", "gists_url": "https://api.github.com/users/alvarobartt/gists{/gist_id}", "starred_url": "https://api.github.com/users/alvarobartt/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/alvarobartt/subscriptions", "organizations_url": "https://api.github.com/users/alvarobartt/orgs", "repos_url": "https://api.github.com/users/alvarobartt/repos", "events_url": "https://api.github.com/users/alvarobartt/events{/privacy}", "received_events_url": "https://api.github.com/users/alvarobartt/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T06:51:28
2025-07-15T07:59:25
2025-07-15T07:59:25
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39414", "html_url": "https://github.com/huggingface/transformers/pull/39414", "diff_url": "https://github.com/huggingface/transformers/pull/39414.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39414.patch", "merged_at": "2025-07-15T07:59:25" }
# What does this PR do? This PR fixes a typo affecting the output payload of the `/v1/models` endpoint in `transformers serve`. ## Before submitting - [X] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a GitHub issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. @ArthurZucker or @gante
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/followers", "following_url": "https://api.github.com/users/gante/following{/other_user}", "gists_url": "https://api.github.com/users/gante/gists{/gist_id}", "starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gante/subscriptions", "organizations_url": "https://api.github.com/users/gante/orgs", "repos_url": "https://api.github.com/users/gante/repos", "events_url": "https://api.github.com/users/gante/events{/privacy}", "received_events_url": "https://api.github.com/users/gante/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39414/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39414/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39413
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39413/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39413/comments
https://api.github.com/repos/huggingface/transformers/issues/39413/events
https://github.com/huggingface/transformers/issues/39413
3,230,706,385
I_kwDOCUB6oc7AkKrR
39,413
Exception 3 type mismatch
{ "login": "Smirkkkk", "id": 73322513, "node_id": "MDQ6VXNlcjczMzIyNTEz", "avatar_url": "https://avatars.githubusercontent.com/u/73322513?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Smirkkkk", "html_url": "https://github.com/Smirkkkk", "followers_url": "https://api.github.com/users/Smirkkkk/followers", "following_url": "https://api.github.com/users/Smirkkkk/following{/other_user}", "gists_url": "https://api.github.com/users/Smirkkkk/gists{/gist_id}", "starred_url": "https://api.github.com/users/Smirkkkk/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Smirkkkk/subscriptions", "organizations_url": "https://api.github.com/users/Smirkkkk/orgs", "repos_url": "https://api.github.com/users/Smirkkkk/repos", "events_url": "https://api.github.com/users/Smirkkkk/events{/privacy}", "received_events_url": "https://api.github.com/users/Smirkkkk/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-07-15T04:22:44
2025-08-22T08:02:55
2025-08-22T08:02:55
NONE
null
null
null
null
### System Info https://github.com/huggingface/transformers/blob/6017f5e8ed33d48096cdf8630d1cc7cbf2550c90/src/transformers/generation/utils.py#L482 The types of cache_position[-1] and input_ids.shape[1] do not match when compared: one is a tensor and the other is an int. ### Who can help? _No response_ ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction https://github.com/huggingface/transformers/blob/6017f5e8ed33d48096cdf8630d1cc7cbf2550c90/src/transformers/generation/utils.py#L482 Just the exception 3 ### Expected behavior TypeError
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers", "following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}", "gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}", "starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions", "organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs", "repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos", "events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}", "received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events", "type": "Bot", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39413/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39413/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39412
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39412/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39412/comments
https://api.github.com/repos/huggingface/transformers/issues/39412/events
https://github.com/huggingface/transformers/pull/39412
3,230,606,872
PR_kwDOCUB6oc6e5olp
39,412
use the enable_gqa param in torch.nn.functional.scaled_dot_product_at…
{ "login": "sywangyi", "id": 36058628, "node_id": "MDQ6VXNlcjM2MDU4NjI4", "avatar_url": "https://avatars.githubusercontent.com/u/36058628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sywangyi", "html_url": "https://github.com/sywangyi", "followers_url": "https://api.github.com/users/sywangyi/followers", "following_url": "https://api.github.com/users/sywangyi/following{/other_user}", "gists_url": "https://api.github.com/users/sywangyi/gists{/gist_id}", "starred_url": "https://api.github.com/users/sywangyi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sywangyi/subscriptions", "organizations_url": "https://api.github.com/users/sywangyi/orgs", "repos_url": "https://api.github.com/users/sywangyi/repos", "events_url": "https://api.github.com/users/sywangyi/events{/privacy}", "received_events_url": "https://api.github.com/users/sywangyi/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T03:10:52
2025-07-22T23:43:26
2025-07-21T12:46:43
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39412", "html_url": "https://github.com/huggingface/transformers/pull/39412", "diff_url": "https://github.com/huggingface/transformers/pull/39412.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39412.patch", "merged_at": "2025-07-21T12:46:43" }
…tention GQA can be accelerated via torch.nn.functional.scaled_dot_product_attention; this PyTorch API offers a param to enable GQA. See https://docs.pytorch.org/docs/2.7/generated/torch.nn.functional.scaled_dot_product_attention.html#torch-nn-functional-scaled-dot-product-attention
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39412/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39412/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39411
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39411/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39411/comments
https://api.github.com/repos/huggingface/transformers/issues/39411/events
https://github.com/huggingface/transformers/pull/39411
3,230,536,562
PR_kwDOCUB6oc6e5ZtG
39,411
set document_question_answering pipeline _load_tokenizer to True
{ "login": "jiqing-feng", "id": 107918818, "node_id": "U_kgDOBm614g", "avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jiqing-feng", "html_url": "https://github.com/jiqing-feng", "followers_url": "https://api.github.com/users/jiqing-feng/followers", "following_url": "https://api.github.com/users/jiqing-feng/following{/other_user}", "gists_url": "https://api.github.com/users/jiqing-feng/gists{/gist_id}", "starred_url": "https://api.github.com/users/jiqing-feng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jiqing-feng/subscriptions", "organizations_url": "https://api.github.com/users/jiqing-feng/orgs", "repos_url": "https://api.github.com/users/jiqing-feng/repos", "events_url": "https://api.github.com/users/jiqing-feng/events{/privacy}", "received_events_url": "https://api.github.com/users/jiqing-feng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T02:22:40
2025-07-15T12:06:30
2025-07-15T12:05:50
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39411", "html_url": "https://github.com/huggingface/transformers/pull/39411", "diff_url": "https://github.com/huggingface/transformers/pull/39411.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39411.patch", "merged_at": "2025-07-15T12:05:50" }
Hi @Rocketknight1. The document-question-answering pipeline requires a tokenizer in the preprocessing step. script: [impira/layoutlm-document-qa](https://huggingface.co/impira/layoutlm-document-qa) ```python from transformers import pipeline nlp = pipeline( "document-question-answering", model="impira/layoutlm-document-qa", ) nlp( "https://templates.invoicehome.com/invoice-template-us-neat-750px.png", "What is the invoice number?" ) ``` error: ``` Traceback (most recent call last): File "/home/jiqing/transformers/test_dqa.py", line 8, in <module> nlp( File "/home/jiqing/transformers/src/transformers/pipelines/document_question_answering.py", line 308, in __call__ return super().__call__(inputs, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jiqing/transformers/src/transformers/pipelines/base.py", line 1450, in __call__ return next( ^^^^^ File "/home/jiqing/transformers/src/transformers/pipelines/pt_utils.py", line 124, in __next__ item = next(self.iterator) ^^^^^^^^^^^^^^^^^^^ File "/home/jiqing/transformers/src/transformers/pipelines/pt_utils.py", line 269, in __next__ processed = self.infer(next(self.iterator), **self.params) ^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/dist-packages/torch/utils/data/dataloader.py", line 734, in __next__ data = self._next_data() ^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/dist-packages/torch/utils/data/dataloader.py", line 790, in _next_data data = self._dataset_fetcher.fetch(index) # may raise StopIteration ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/dist-packages/torch/utils/data/_utils/fetch.py", line 33, in fetch data.append(next(self.dataset_iter)) ^^^^^^^^^^^^^^^^^^^^^^^ File "/home/jiqing/transformers/src/transformers/pipelines/pt_utils.py", line 186, in __next__ processed = next(self.subiterator) ^^^^^^^^^^^^^^^^^^^^^^ File "/home/jiqing/transformers/src/transformers/pipelines/document_question_answering.py", line 324, in preprocess max_seq_len = self.tokenizer.model_max_length ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ AttributeError: 'NoneType' object has no attribute 'model_max_length' ``` Please review this PR. Thanks!
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39411/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39411/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39410
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39410/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39410/comments
https://api.github.com/repos/huggingface/transformers/issues/39410/events
https://github.com/huggingface/transformers/issues/39410
3,230,522,847
I_kwDOCUB6oc7Ajd3f
39,410
FP8 training support for Model Parallel / Tensor Parallel (MP/TP)
{ "login": "edgeinfinity1", "id": 133833590, "node_id": "U_kgDOB_ojdg", "avatar_url": "https://avatars.githubusercontent.com/u/133833590?v=4", "gravatar_id": "", "url": "https://api.github.com/users/edgeinfinity1", "html_url": "https://github.com/edgeinfinity1", "followers_url": "https://api.github.com/users/edgeinfinity1/followers", "following_url": "https://api.github.com/users/edgeinfinity1/following{/other_user}", "gists_url": "https://api.github.com/users/edgeinfinity1/gists{/gist_id}", "starred_url": "https://api.github.com/users/edgeinfinity1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/edgeinfinity1/subscriptions", "organizations_url": "https://api.github.com/users/edgeinfinity1/orgs", "repos_url": "https://api.github.com/users/edgeinfinity1/repos", "events_url": "https://api.github.com/users/edgeinfinity1/events{/privacy}", "received_events_url": "https://api.github.com/users/edgeinfinity1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
[]
2025-07-15T02:13:05
2025-07-15T13:30:27
null
NONE
null
null
null
null
### Feature request I receive the message "ValueError: The model you are trying to fine-tune is quantized with QuantizationMethod.FP8 but that quantization method do not support training. Please open an issue on GitHub: https://github.com/huggingface/transformers to request the support for training support for QuantizationMethod.FP8" when trying to finetune an FP8 model. I have learned from the documentation that FP8 models can be trained with DDP, ZeRO, or FSDP. Is there a way to do it with MP/TP for huge FP8 models? ### Motivation Enable finetuning huge FP8 models, like Qwen/Qwen3-235B-A22B-FP8 ### Your contribution I'm afraid it's too tough for me, but I'll do whatever I can if you need.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39410/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39410/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/39409
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39409/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39409/comments
https://api.github.com/repos/huggingface/transformers/issues/39409/events
https://github.com/huggingface/transformers/issues/39409
3,230,492,239
I_kwDOCUB6oc7AjWZP
39,409
TypeError: couldn't find storage object Float8_e4m3fnStorage - which version is needed for this?
{ "login": "FurkanGozukara", "id": 19240467, "node_id": "MDQ6VXNlcjE5MjQwNDY3", "avatar_url": "https://avatars.githubusercontent.com/u/19240467?v=4", "gravatar_id": "", "url": "https://api.github.com/users/FurkanGozukara", "html_url": "https://github.com/FurkanGozukara", "followers_url": "https://api.github.com/users/FurkanGozukara/followers", "following_url": "https://api.github.com/users/FurkanGozukara/following{/other_user}", "gists_url": "https://api.github.com/users/FurkanGozukara/gists{/gist_id}", "starred_url": "https://api.github.com/users/FurkanGozukara/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/FurkanGozukara/subscriptions", "organizations_url": "https://api.github.com/users/FurkanGozukara/orgs", "repos_url": "https://api.github.com/users/FurkanGozukara/repos", "events_url": "https://api.github.com/users/FurkanGozukara/events{/privacy}", "received_events_url": "https://api.github.com/users/FurkanGozukara/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-07-15T01:51:08
2025-08-02T12:06:59
2025-08-02T12:06:59
NONE
null
null
null
null
Tested so many versions but can't find a version that won't give this error ``` !pip install bitsandbytes==0.45.0 --upgrade !pip install insightface --upgrade !pip install huggingface_hub==0.25.1 hf_transfer diffusers==0.31.0 transformers==4.36.0 !pip uninstall xformers triton --yes !pip install torch==2.2.0+cu121 torchvision --index-url https://download.pytorch.org/whl/cu121 !pip install xformers==0.0.24 --index-url https://download.pytorch.org/whl/cu121 ``` ``` File "/kaggle/temp/InstantID/gradio_demo/web-ui-multicontrolnet.py", line 975, in generate_image reload_pipe(model_input, model_dropdown, scheduler, adapter_strength_ratio, enable_LCM, depth_type, lora_model_dropdown, lora_scale,test_all_loras,single_lora) File "/kaggle/temp/InstantID/gradio_demo/web-ui-multicontrolnet.py", line 654, in reload_pipe pipe = load_model(_pretrained_model_folder, model_to_load) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/kaggle/temp/InstantID/gradio_demo/web-ui-multicontrolnet.py", line 528, in load_model pipeline = StableDiffusionPipeline.from_pretrained( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/dist-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn return fn(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/dist-packages/diffusers/pipelines/pipeline_utils.py", line 896, in from_pretrained loaded_sub_model = load_sub_model( ^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/dist-packages/diffusers/pipelines/pipeline_loading_utils.py", line 704, in load_sub_model loaded_sub_model = load_method(os.path.join(cached_folder, name), **loading_kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/dist-packages/transformers/modeling_utils.py", line 4027, in from_pretrained dtype_orig = cls._set_default_torch_dtype(torch_dtype) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/dist-packages/transformers/modeling_utils.py", line 1584, in _set_default_torch_dtype torch.set_default_dtype(dtype) File "/usr/local/lib/python3.11/dist-packages/torch/__init__.py", line 1009, in set_default_dtype _C._set_default_dtype(d) TypeError: couldn't find storage object Float8_e4m3fnStorage ```
{ "login": "FurkanGozukara", "id": 19240467, "node_id": "MDQ6VXNlcjE5MjQwNDY3", "avatar_url": "https://avatars.githubusercontent.com/u/19240467?v=4", "gravatar_id": "", "url": "https://api.github.com/users/FurkanGozukara", "html_url": "https://github.com/FurkanGozukara", "followers_url": "https://api.github.com/users/FurkanGozukara/followers", "following_url": "https://api.github.com/users/FurkanGozukara/following{/other_user}", "gists_url": "https://api.github.com/users/FurkanGozukara/gists{/gist_id}", "starred_url": "https://api.github.com/users/FurkanGozukara/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/FurkanGozukara/subscriptions", "organizations_url": "https://api.github.com/users/FurkanGozukara/orgs", "repos_url": "https://api.github.com/users/FurkanGozukara/repos", "events_url": "https://api.github.com/users/FurkanGozukara/events{/privacy}", "received_events_url": "https://api.github.com/users/FurkanGozukara/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39409/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39409/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39408
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39408/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39408/comments
https://api.github.com/repos/huggingface/transformers/issues/39408/events
https://github.com/huggingface/transformers/issues/39408
3,230,357,763
I_kwDOCUB6oc7Ai1kD
39,408
Off-by-one error when using flash_attention with a sliding window
{ "login": "tyler-romero", "id": 22687428, "node_id": "MDQ6VXNlcjIyNjg3NDI4", "avatar_url": "https://avatars.githubusercontent.com/u/22687428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/tyler-romero", "html_url": "https://github.com/tyler-romero", "followers_url": "https://api.github.com/users/tyler-romero/followers", "following_url": "https://api.github.com/users/tyler-romero/following{/other_user}", "gists_url": "https://api.github.com/users/tyler-romero/gists{/gist_id}", "starred_url": "https://api.github.com/users/tyler-romero/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/tyler-romero/subscriptions", "organizations_url": "https://api.github.com/users/tyler-romero/orgs", "repos_url": "https://api.github.com/users/tyler-romero/repos", "events_url": "https://api.github.com/users/tyler-romero/events{/privacy}", "received_events_url": "https://api.github.com/users/tyler-romero/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null }, { "id": 6202871275, "node_id": "LA_kwDOCUB6oc8AAAABcbhN6w", "url": "https://api.github.com/repos/huggingface/transformers/labels/Flash%20Attention", "name": "Flash Attention", "color": "201FF8", "default": false, "description": "" } ]
closed
false
null
[]
null
[]
2025-07-15T00:20:55
2025-08-20T12:23:15
2025-08-20T12:23:15
NONE
null
null
null
null
### System Info

Looking at https://github.com/huggingface/transformers/commit/6017f5e8ed33d48096cdf8630d1cc7cbf2550c90

### Who can help?

@ArthurZucker

### Information

- [ ] The official example scripts
- [x] My own modified scripts

### Tasks

- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)

### Reproduction

From the [flash attention documentation](https://github.com/Dao-AILab/flash-attention/blob/main/flash_attn/flash_attn_interface.py#L1016) for the window_size argument:

```
If window_size != (-1, -1), implements sliding window local attention. Query at position i will only attend to keys between [i - window_size[0], i + window_size[1]] inclusive.
```

Transformers [currently sets window_size like so](https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_flash_attention_utils.py#L489) when using flash attention:

```python
flash_kwargs = {"window_size": (sliding_window, sliding_window)} if use_sliding_windows else {}
```

This means that from a given token, we will look backward by `sliding_window` and then forward by `sliding_window`, resulting in a total window size of `2 * sliding_window + 1`. If causal masking is applied, it is equivalent to setting `(sliding_window, 0)`, and the total window size is `sliding_window + 1`.

### Expected behavior

The expected behavior is for the flash attention implementation of sliding window attention to match the other implementations, which currently use [this function](https://github.com/huggingface/transformers/blob/6017f5e8ed33d48096cdf8630d1cc7cbf2550c90/src/transformers/masking_utils.py#L76C1-L85C22) to apply a sliding window (note that transformers' flash attention usage explicitly bypasses modifying an attention mask with this function):

```python
def sliding_window_overlay(sliding_window: int) -> Callable:
    """
    This is an overlay depicting a sliding window pattern.
    Add it on top of a causal mask for a proper sliding window mask.
    """

    def inner_mask(batch_idx: int, head_idx: int, q_idx: int, kv_idx: int) -> bool:
        return kv_idx > q_idx - sliding_window

    return inner_mask
```

When combined with a causal mask as intended, this function retains `sliding_window - 1` tokens to the left, plus the current token, for a total window size of `sliding_window`.

Suggested fix:

```python
flash_kwargs = {"window_size": (sliding_window - 1, sliding_window)} if use_sliding_windows else {}
```
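The off-by-one described above can be checked with a small standalone sketch (the helper names are hypothetical, not transformers code) that counts how many key positions a causal query can attend to under each convention:

```python
# Hypothetical sketch, not transformers code: count visible key positions for a
# causal query under the two sliding-window conventions discussed above.

def visible_keys_fa2(q_idx: int, left: int) -> int:
    # FlashAttention with window_size=(left, right) under causal masking:
    # the query attends to keys in [q_idx - left, q_idx] inclusive.
    lo = max(0, q_idx - left)
    return q_idx - lo + 1

def visible_keys_mask(q_idx: int, sliding_window: int) -> int:
    # transformers mask overlay combined with causal masking:
    # kv_idx > q_idx - sliding_window and kv_idx <= q_idx.
    lo = max(0, q_idx - sliding_window + 1)
    return q_idx - lo + 1

sliding_window, q = 4, 10
print(visible_keys_fa2(q, sliding_window))      # 5: total window of sliding_window + 1
print(visible_keys_mask(q, sliding_window))     # 4: total window of sliding_window
print(visible_keys_fa2(q, sliding_window - 1))  # 4: matches after the suggested fix
```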
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39408/reactions", "total_count": 3, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 2 }
https://api.github.com/repos/huggingface/transformers/issues/39408/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39407
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39407/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39407/comments
https://api.github.com/repos/huggingface/transformers/issues/39407/events
https://github.com/huggingface/transformers/pull/39407
3,230,344,085
PR_kwDOCUB6oc6e4xBO
39,407
docs: update LightGlue docs
{ "login": "sbucaille", "id": 24275548, "node_id": "MDQ6VXNlcjI0Mjc1NTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/24275548?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sbucaille", "html_url": "https://github.com/sbucaille", "followers_url": "https://api.github.com/users/sbucaille/followers", "following_url": "https://api.github.com/users/sbucaille/following{/other_user}", "gists_url": "https://api.github.com/users/sbucaille/gists{/gist_id}", "starred_url": "https://api.github.com/users/sbucaille/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sbucaille/subscriptions", "organizations_url": "https://api.github.com/users/sbucaille/orgs", "repos_url": "https://api.github.com/users/sbucaille/repos", "events_url": "https://api.github.com/users/sbucaille/events{/privacy}", "received_events_url": "https://api.github.com/users/sbucaille/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T00:11:21
2025-07-15T19:46:04
2025-07-15T19:40:51
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39407", "html_url": "https://github.com/huggingface/transformers/pull/39407", "diff_url": "https://github.com/huggingface/transformers/pull/39407.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39407.patch", "merged_at": "2025-07-15T19:40:51" }
# What does this PR do? Updates LightGlue model card for #36979 ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). ## Who can review? @stevhliu
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39407/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39407/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39406
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39406/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39406/comments
https://api.github.com/repos/huggingface/transformers/issues/39406/events
https://github.com/huggingface/transformers/pull/39406
3,230,343,606
PR_kwDOCUB6oc6e4w6b
39,406
docs: update SuperGlue docs
{ "login": "sbucaille", "id": 24275548, "node_id": "MDQ6VXNlcjI0Mjc1NTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/24275548?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sbucaille", "html_url": "https://github.com/sbucaille", "followers_url": "https://api.github.com/users/sbucaille/followers", "following_url": "https://api.github.com/users/sbucaille/following{/other_user}", "gists_url": "https://api.github.com/users/sbucaille/gists{/gist_id}", "starred_url": "https://api.github.com/users/sbucaille/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sbucaille/subscriptions", "organizations_url": "https://api.github.com/users/sbucaille/orgs", "repos_url": "https://api.github.com/users/sbucaille/repos", "events_url": "https://api.github.com/users/sbucaille/events{/privacy}", "received_events_url": "https://api.github.com/users/sbucaille/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-15T00:11:01
2025-07-15T19:45:42
2025-07-15T19:40:26
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39406", "html_url": "https://github.com/huggingface/transformers/pull/39406", "diff_url": "https://github.com/huggingface/transformers/pull/39406.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39406.patch", "merged_at": "2025-07-15T19:40:26" }
# What does this PR do? Updates SuperGlue model card for #36979 ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). ## Who can review? @stevhliu
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39406/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39406/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39405
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39405/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39405/comments
https://api.github.com/repos/huggingface/transformers/issues/39405/events
https://github.com/huggingface/transformers/issues/39405
3,230,150,968
I_kwDOCUB6oc7AiDE4
39,405
breaking changes in ESM model classes
{ "login": "adrienchaton", "id": 35500385, "node_id": "MDQ6VXNlcjM1NTAwMzg1", "avatar_url": "https://avatars.githubusercontent.com/u/35500385?v=4", "gravatar_id": "", "url": "https://api.github.com/users/adrienchaton", "html_url": "https://github.com/adrienchaton", "followers_url": "https://api.github.com/users/adrienchaton/followers", "following_url": "https://api.github.com/users/adrienchaton/following{/other_user}", "gists_url": "https://api.github.com/users/adrienchaton/gists{/gist_id}", "starred_url": "https://api.github.com/users/adrienchaton/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/adrienchaton/subscriptions", "organizations_url": "https://api.github.com/users/adrienchaton/orgs", "repos_url": "https://api.github.com/users/adrienchaton/repos", "events_url": "https://api.github.com/users/adrienchaton/events{/privacy}", "received_events_url": "https://api.github.com/users/adrienchaton/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-07-14T22:01:12
2025-07-17T14:23:01
2025-07-17T14:23:01
NONE
null
null
null
null
### System Info

Hello, I had finetuned a model based on the ESM class https://huggingface.co/facebook/esm2_t30_150M_UR50D; at the time I had `transformers.__version__ == '4.38.1'`.

With this version, if I run the common import commands

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("facebook/esm2_t30_150M_UR50D")
model = AutoModelForMaskedLM.from_pretrained("facebook/esm2_t30_150M_UR50D")
```

it doesn't raise any warning, and the architecture is:

```
print(model.esm.embeddings)
EsmEmbeddings(
  (word_embeddings): Embedding(33, 640, padding_idx=1)
  (dropout): Dropout(p=0.0, inplace=False)
  (position_embeddings): Embedding(1026, 640, padding_idx=1)
)
```

However, if I try to load my trained model with an updated version of transformers, e.g. `transformers.__version__ == '4.53.2'`, it triggers errors:

```
Unexpected key(s) in state_dict: "model.esm.embeddings.position_embeddings.weight".
```

and I found that the updated class had breaking changes:

```
print(model.esm.embeddings)
EsmEmbeddings(
  (word_embeddings): Embedding(33, 640, padding_idx=1)
  (dropout): Dropout(p=0.0, inplace=False)
)
```

The position embedding layer has been removed; it also causes warnings when loading the original checkpoints:

```
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained("facebook/esm2_t30_150M_UR50D")
model = AutoModelForMaskedLM.from_pretrained("facebook/esm2_t30_150M_UR50D")

Some weights of the model checkpoint at facebook/esm2_t30_150M_UR50D were not used when initializing EsmForMaskedLM: ['esm.embeddings.position_embeddings.weight']
- This IS expected if you are initializing EsmForMaskedLM from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing EsmForMaskedLM from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
```

So currently it seems that the checkpoints in the model hub and the transformers class aren't matching anymore, and in the same way, the checkpoints I finetuned earlier can't be loaded correctly in the updated ESM class. How should the `position_embeddings` weight be provided to the updated ESM class, please?

### Who can help?

@ArthurZucker

### Information

- [x] The official example scripts
- [x] My own modified scripts

### Tasks

- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)

### Reproduction

Install `transformers.__version__ == '4.53.2'` and run the example from https://huggingface.co/facebook/esm2_t30_150M_UR50D?library=transformers. Model loading triggers:

```
Some weights of the model checkpoint at facebook/esm2_t30_150M_UR50D were not used when initializing EsmForMaskedLM: ['esm.embeddings.position_embeddings.weight']
- This IS expected if you are initializing EsmForMaskedLM from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing EsmForMaskedLM from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
```

### Expected behavior

The official pretrained checkpoint (or e.g. some of my finetuned checkpoints from earlier transformers versions) can load into the new class without bypassing the `position_embeddings` weight.
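As a stopgap while this is unresolved, one possible workaround sketch (the helper name and file path are hypothetical, and whether discarding the weight is acceptable depends on the checkpoint's position embedding type) is to filter the stale key out of the old state dict before loading with `strict=False`:

```python
# Hypothetical workaround sketch, not an official API: strip the position-embedding
# weight that the updated EsmEmbeddings class no longer defines from an old
# checkpoint's state dict, then load the remaining weights with strict=False.

def drop_stale_esm_keys(state_dict: dict) -> dict:
    """Remove keys the updated EsmEmbeddings class no longer expects."""
    return {
        k: v
        for k, v in state_dict.items()
        if "embeddings.position_embeddings" not in k
    }

# Assumed usage (path and objects are illustrative):
#   state_dict = torch.load("my_finetuned_esm.pt", map_location="cpu")
#   model.load_state_dict(drop_stale_esm_keys(state_dict), strict=False)
```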
{ "login": "adrienchaton", "id": 35500385, "node_id": "MDQ6VXNlcjM1NTAwMzg1", "avatar_url": "https://avatars.githubusercontent.com/u/35500385?v=4", "gravatar_id": "", "url": "https://api.github.com/users/adrienchaton", "html_url": "https://github.com/adrienchaton", "followers_url": "https://api.github.com/users/adrienchaton/followers", "following_url": "https://api.github.com/users/adrienchaton/following{/other_user}", "gists_url": "https://api.github.com/users/adrienchaton/gists{/gist_id}", "starred_url": "https://api.github.com/users/adrienchaton/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/adrienchaton/subscriptions", "organizations_url": "https://api.github.com/users/adrienchaton/orgs", "repos_url": "https://api.github.com/users/adrienchaton/repos", "events_url": "https://api.github.com/users/adrienchaton/events{/privacy}", "received_events_url": "https://api.github.com/users/adrienchaton/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39405/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39405/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39404
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39404/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39404/comments
https://api.github.com/repos/huggingface/transformers/issues/39404/events
https://github.com/huggingface/transformers/issues/39404
3,229,815,847
I_kwDOCUB6oc7AgxQn
39,404
Whisper `return_language` with pipeline no longer working
{ "login": "Metric-Void", "id": 21335640, "node_id": "MDQ6VXNlcjIxMzM1NjQw", "avatar_url": "https://avatars.githubusercontent.com/u/21335640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Metric-Void", "html_url": "https://github.com/Metric-Void", "followers_url": "https://api.github.com/users/Metric-Void/followers", "following_url": "https://api.github.com/users/Metric-Void/following{/other_user}", "gists_url": "https://api.github.com/users/Metric-Void/gists{/gist_id}", "starred_url": "https://api.github.com/users/Metric-Void/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Metric-Void/subscriptions", "organizations_url": "https://api.github.com/users/Metric-Void/orgs", "repos_url": "https://api.github.com/users/Metric-Void/repos", "events_url": "https://api.github.com/users/Metric-Void/events{/privacy}", "received_events_url": "https://api.github.com/users/Metric-Void/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null }, { "id": 6470596964, "node_id": "LA_kwDOCUB6oc8AAAABga15ZA", "url": "https://api.github.com/repos/huggingface/transformers/labels/Audio", "name": "Audio", "color": "760453", "default": false, "description": "" } ]
open
false
{ "login": "eustlb", "id": 94853470, "node_id": "U_kgDOBadZXg", "avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4", "gravatar_id": "", "url": "https://api.github.com/users/eustlb", "html_url": "https://github.com/eustlb", "followers_url": "https://api.github.com/users/eustlb/followers", "following_url": "https://api.github.com/users/eustlb/following{/other_user}", "gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}", "starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/eustlb/subscriptions", "organizations_url": "https://api.github.com/users/eustlb/orgs", "repos_url": "https://api.github.com/users/eustlb/repos", "events_url": "https://api.github.com/users/eustlb/events{/privacy}", "received_events_url": "https://api.github.com/users/eustlb/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "eustlb", "id": 94853470, "node_id": "U_kgDOBadZXg", "avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4", "gravatar_id": "", "url": "https://api.github.com/users/eustlb", "html_url": "https://github.com/eustlb", "followers_url": "https://api.github.com/users/eustlb/followers", "following_url": "https://api.github.com/users/eustlb/following{/other_user}", "gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}", "starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/eustlb/subscriptions", "organizations_url": "https://api.github.com/users/eustlb/orgs", "repos_url": "https://api.github.com/users/eustlb/repos", "events_url": "https://api.github.com/users/eustlb/events{/privacy}", "received_events_url": "https://api.github.com/users/eustlb/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
[]
2025-07-14T19:36:46
2025-10-06T17:08:19
null
NONE
null
null
null
null
### System Info

Platform: Initially discovered on Nvidia. Can be reproduced on CPU and in Google Colab (see attached gist).

- `transformers` version: 4.53.2
- Platform: Linux-6.6.87.2-microsoft-standard-WSL2-x86_64-with-glibc2.39
- Python version: 3.12.3
- Huggingface_hub version: 0.33.4
- Safetensors version: 0.5.3
- Accelerate version: 1.8.1
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1+cu126 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: No
- Using GPU in script?: Yes and No.
- GPU type: NVIDIA GeForce RTX 3090

### Who can help?

@eustlb @ArthurZucker

### Information

- [ ] The official example scripts
- [x] My own modified scripts

### Tasks

- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)

### Reproduction

<s>Sometime between `transformers==4.46.3` and `transformers==4.53.2` (latest as of now),</s> At #34135, the `return_language` argument for the pipeline stopped working. The ending timestamp for the last word is also missing.

Example (exported from Google Colab): https://gist.github.com/Metric-Void/ce2b9fe2faed0cdf6e5fd328599fd4c7

Code for testing:

```python
import torch
from transformers import pipeline
from transformers.configuration_utils import PretrainedConfig

pipeline = pipeline(
    task="automatic-speech-recognition",
    model="openai/whisper-tiny",
    torch_dtype=torch.float16,
    config=PretrainedConfig(attn_implementation="flash_attention_2"),
)

result = pipeline(
    "https://huggingface.co/datasets/Narsil/asr_dummy/resolve/main/mlk.flac",
    return_language=True,
    return_timestamps="word",
)
result["chunks"]
```

Before (`transformers==4.46.3`):

```
[{'text': ' I', 'timestamp': (1.04, 1.36), 'language': 'english'}, {'text': ' have', 'timestamp': (1.36, 1.68), 'language': 'english'}, {'text': ' a', 'timestamp': (1.68, 1.94), 'language': 'english'}, {'text': ' dream.', 'timestamp': (1.94, 3.82), 'language': 'english'}, {'text': ' Good', 'timestamp': (3.82, 3.98), 'language': 'english'}, {'text': ' one', 'timestamp': (3.98, 4.16), 'language': 'english'}, {'text': ' day.', 'timestamp': (4.16, 6.4), 'language': 'english'}, {'text': ' This', 'timestamp': (6.4, 6.58), 'language': 'english'}, {'text': ' nation', 'timestamp': (6.58, 7.24), 'language': 'english'}, {'text': ' will', 'timestamp': (7.24, 7.82), 'language': 'english'}, {'text': ' rise', 'timestamp': (7.82, 8.3), 'language': 'english'}, {'text': ' up.', 'timestamp': (8.3, 10.3), 'language': 'english'}, {'text': ' Live', 'timestamp': (10.3, 10.56), 'language': 'english'}, {'text': ' out', 'timestamp': (10.56, 10.98), 'language': 'english'}, {'text': ' the', 'timestamp': (10.98, 11.02), 'language': 'english'}, {'text': ' true', 'timestamp': (11.02, 11.3), 'language': 'english'}, {'text': ' meaning', 'timestamp': (11.3, 11.6), 'language': 'english'}, {'text': ' of', 'timestamp': (11.6, 11.86), 'language': 'english'}, {'text': ' its', 'timestamp': (11.86, 12.08), 'language': 'english'}, {'text': ' dream.', 'timestamp': (12.08, 12.98), 'language': 'english'}]
```

After (`transformers==4.53.2`):

```
[{'text': ' I', 'timestamp': (1.04, 1.36), 'language': None}, {'text': ' have', 'timestamp': (1.36, 1.68), 'language': None}, {'text': ' a', 'timestamp': (1.68, 1.94), 'language': None}, {'text': ' dream.', 'timestamp': (1.94, 3.82), 'language': None}, {'text': ' But', 'timestamp': (3.82, 3.96), 'language': None}, {'text': ' one', 'timestamp': (3.96, 4.18), 'language': None}, {'text': ' day,', 'timestamp': (4.18, 6.22), 'language': None}, {'text': ' this', 'timestamp': (6.22, 6.58), 'language': None}, {'text': ' nation', 'timestamp': (6.58, 7.22), 'language': None}, {'text': ' will', 'timestamp': (7.22, 7.82), 'language': None}, {'text': ' rise', 'timestamp': (7.82, 8.3), 'language': None}, {'text': ' up,', 'timestamp': (8.3, 10.2), 'language': None}, {'text': ' live', 'timestamp': (10.2, 10.56), 'language': None}, {'text': ' out', 'timestamp': (10.56, 10.98), 'language': None}, {'text': ' the', 'timestamp': (10.98, 11.02), 'language': None}, {'text': ' true', 'timestamp': (11.02, 11.3), 'language': None}, {'text': ' meaning', 'timestamp': (11.3, 11.6), 'language': None}, {'text': ' of', 'timestamp': (11.6, 11.86), 'language': None}, {'text': ' its', 'timestamp': (11.86, 12.08), 'language': None}, {'text': ' dream.', 'timestamp': (12.08, None), 'language': None}]
```

### Expected behavior

The old behaviour was correct. Maybe related: #21311, #21427, #25138, #27604, #29520, #31572
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39404/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39404/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/39403
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39403/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39403/comments
https://api.github.com/repos/huggingface/transformers/issues/39403/events
https://github.com/huggingface/transformers/pull/39403
3,229,651,328
PR_kwDOCUB6oc6e2bKC
39,403
Add Vocos model
{ "login": "Manalelaidouni", "id": 25346345, "node_id": "MDQ6VXNlcjI1MzQ2MzQ1", "avatar_url": "https://avatars.githubusercontent.com/u/25346345?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Manalelaidouni", "html_url": "https://github.com/Manalelaidouni", "followers_url": "https://api.github.com/users/Manalelaidouni/followers", "following_url": "https://api.github.com/users/Manalelaidouni/following{/other_user}", "gists_url": "https://api.github.com/users/Manalelaidouni/gists{/gist_id}", "starred_url": "https://api.github.com/users/Manalelaidouni/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Manalelaidouni/subscriptions", "organizations_url": "https://api.github.com/users/Manalelaidouni/orgs", "repos_url": "https://api.github.com/users/Manalelaidouni/repos", "events_url": "https://api.github.com/users/Manalelaidouni/events{/privacy}", "received_events_url": "https://api.github.com/users/Manalelaidouni/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 1843244711, "node_id": "MDU6TGFiZWwxODQzMjQ0NzEx", "url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model", "name": "New model", "color": "fbca04", "default": false, "description": "" }, { "id": 6470596964, "node_id": "LA_kwDOCUB6oc8AAAABga15ZA", "url": "https://api.github.com/repos/huggingface/transformers/labels/Audio", "name": "Audio", "color": "760453", "default": false, "description": "" } ]
open
false
null
[]
null
[]
2025-07-14T18:25:37
2025-10-29T20:35:35
null
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39403", "html_url": "https://github.com/huggingface/transformers/pull/39403", "diff_url": "https://github.com/huggingface/transformers/pull/39403.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39403.patch", "merged_at": null }
# What does this PR do? This PR integrates the `Vocos` model into `transformers`. Vocos is a neural vocoder designed for high-quality audio synthesis in TTS pipelines and related tasks; it outperforms `HifiGan` and is significantly faster. It has two main variants: - `VocosModel` can be used as a standalone vocoder in an audio generation pipeline; the goal is to use it as a drop-in vocoder in the YuE model. It can also be used together with `VocosFeatureExtractor` to synthesize audio from mel-spectrogram features. - `VocosWithEncodecModel`: integrates the EnCodec neural audio codec model into Vocos for end-to-end audio compression and reconstruction. This is a continuation of integrating model components for the new YuE model (mentioned in #36784). ## Who can review? Anyone in the community is free to review the PR once the tests have passed. @ArthurZucker @eustlb @ylacombe
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39403/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39403/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/39402
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39402/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39402/comments
https://api.github.com/repos/huggingface/transformers/issues/39402/events
https://github.com/huggingface/transformers/pull/39402
3,229,550,573
PR_kwDOCUB6oc6e2Ev7
39,402
No repeat kv
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-07-14T17:51:23
2025-07-16T15:42:24
null
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39402", "html_url": "https://github.com/huggingface/transformers/pull/39402", "diff_url": "https://github.com/huggingface/transformers/pull/39402.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39402.patch", "merged_at": null }
Draft PR for now, please don't merge! The goal of this PR is to experiment with an `einsum` approach for `eager_attention_forward` that avoids a lot of reshapes and transposes that introduce intermediate tensors and unnecessary copies. In notebook benchmarks it seems to be 25-50% faster, but I'm making a proper PR so I can test more thoroughly.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39402/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 2 }
https://api.github.com/repos/huggingface/transformers/issues/39402/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/39401
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39401/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39401/comments
https://api.github.com/repos/huggingface/transformers/issues/39401/events
https://github.com/huggingface/transformers/issues/39401
3,228,950,168
I_kwDOCUB6oc7Add6Y
39,401
Qwen3 tokenizer wrong offset_mapping
{ "login": "contribcode", "id": 24355946, "node_id": "MDQ6VXNlcjI0MzU1OTQ2", "avatar_url": "https://avatars.githubusercontent.com/u/24355946?v=4", "gravatar_id": "", "url": "https://api.github.com/users/contribcode", "html_url": "https://github.com/contribcode", "followers_url": "https://api.github.com/users/contribcode/followers", "following_url": "https://api.github.com/users/contribcode/following{/other_user}", "gists_url": "https://api.github.com/users/contribcode/gists{/gist_id}", "starred_url": "https://api.github.com/users/contribcode/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/contribcode/subscriptions", "organizations_url": "https://api.github.com/users/contribcode/orgs", "repos_url": "https://api.github.com/users/contribcode/repos", "events_url": "https://api.github.com/users/contribcode/events{/privacy}", "received_events_url": "https://api.github.com/users/contribcode/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-07-14T14:21:08
2025-07-16T09:59:35
2025-07-16T09:59:35
NONE
null
null
null
null
### System Info transformers 4.53.2, Ubuntu 22.04.4, python 3.11.13 ### Who can help? @ArthurZucker and @itazap There must be a problem with the `offset_mapping` of the Qwen3 `tokenizer`. The starting point in the text for each token, except the first and the last, is one position behind. I compared it with BERT's `tokenizer`, which produces what is expected: ### Information - [ ] The official example scripts - [x] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction ``` sample_text='A girl is styling her hair.' bert_tokenizer = BertTokenizerFast.from_pretrained('google-bert/bert-base-cased') bert_encoding = bert_tokenizer( text=sample_text, add_special_tokens=False, return_offsets_mapping=True ) print(bert_encoding['offset_mapping']) qwen_tokenizer = AutoTokenizer.from_pretrained('Qwen/Qwen3-Embedding-0.6B') qwen_encoding = qwen_tokenizer( text=sample_text, add_special_tokens=False, return_offsets_mapping=True ) print(qwen_encoding['offset_mapping']) ``` ### Expected behavior [(0, 1), (2, 6), (7, 9), (10, 17), (18, 21), (22, 26), (26, 27)] [(0, 1), (1, 6), (6, 9), (9, 17), (17, 21), (21, 26), (26, 27)]
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39401/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39401/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39400
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39400/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39400/comments
https://api.github.com/repos/huggingface/transformers/issues/39400/events
https://github.com/huggingface/transformers/issues/39400
3,228,573,689
I_kwDOCUB6oc7AcB_5
39,400
Causal mask is not compatible with Qwen2-VL when using padding-free training
{ "login": "hiyouga", "id": 16256802, "node_id": "MDQ6VXNlcjE2MjU2ODAy", "avatar_url": "https://avatars.githubusercontent.com/u/16256802?v=4", "gravatar_id": "", "url": "https://api.github.com/users/hiyouga", "html_url": "https://github.com/hiyouga", "followers_url": "https://api.github.com/users/hiyouga/followers", "following_url": "https://api.github.com/users/hiyouga/following{/other_user}", "gists_url": "https://api.github.com/users/hiyouga/gists{/gist_id}", "starred_url": "https://api.github.com/users/hiyouga/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hiyouga/subscriptions", "organizations_url": "https://api.github.com/users/hiyouga/orgs", "repos_url": "https://api.github.com/users/hiyouga/repos", "events_url": "https://api.github.com/users/hiyouga/events{/privacy}", "received_events_url": "https://api.github.com/users/hiyouga/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-07-14T12:21:16
2025-07-21T10:19:16
2025-07-21T10:19:16
CONTRIBUTOR
null
null
null
null
### System Info transformers @ 4.53.2 ### Who can help? @amyeroberts @zucchini-nlp ### Information - [ ] The official example scripts - [x] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [x] My own task or dataset (give details below) ### Reproduction https://github.com/huggingface/transformers/blob/903944a411c35b8f7b2b51ada77a7c2a80e7fb88/src/transformers/masking_utils.py#L708-L713 Qwen2-VL's position ids have a shape of `[3, batch_size, seq_len]`, while the current code cannot process 3D position ids. https://github.com/hiyouga/EasyR1/issues/413 ### Expected behavior There should be a special case for Qwen2-VL's 3D MRope
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39400/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/huggingface/transformers/issues/39400/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39399
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39399/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39399/comments
https://api.github.com/repos/huggingface/transformers/issues/39399/events
https://github.com/huggingface/transformers/issues/39399
3,228,495,991
I_kwDOCUB6oc7AbvB3
39,399
Qwen2.5-VL Sharding error when using Tensor Parallelism
{ "login": "rupert404", "id": 52383375, "node_id": "MDQ6VXNlcjUyMzgzMzc1", "avatar_url": "https://avatars.githubusercontent.com/u/52383375?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rupert404", "html_url": "https://github.com/rupert404", "followers_url": "https://api.github.com/users/rupert404/followers", "following_url": "https://api.github.com/users/rupert404/following{/other_user}", "gists_url": "https://api.github.com/users/rupert404/gists{/gist_id}", "starred_url": "https://api.github.com/users/rupert404/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rupert404/subscriptions", "organizations_url": "https://api.github.com/users/rupert404/orgs", "repos_url": "https://api.github.com/users/rupert404/repos", "events_url": "https://api.github.com/users/rupert404/events{/privacy}", "received_events_url": "https://api.github.com/users/rupert404/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-07-14T11:54:01
2025-09-30T23:07:34
2025-09-17T08:03:10
NONE
null
null
null
null
### System Info - transformers version: 4.53.2 - Platform: Linux-6.11.0-29-generic-x86_64-with-glibc2.39 - Python version: 3.12.3 - Huggingface_hub version: 0.33.2 - Safetensors version: 0.5.3 - Accelerate version: 1.8.1 - PyTorch version: 2.7.1+cu128 (CUDA) - Using distributed or parallel set-up in script?: Yes, Tensor Parallelism via PyTorch Elastic (torchrun) - Using GPU in script?: Yes - Tested GPU types: - 1x/2x/4xA100 - 1xRTX4090 - 1x5090 ### Who can help? @zucchini-nlp @jla524 @VladOS95-cyber (Tagging because of related commit-history) ### Information - [x] The official example scripts - [x] My own modified scripts ### Tasks - [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [x] My own task or dataset (give details below) ### Reproduction When adapting the [example script for distributed GPU inference from the documentation](https://huggingface.co/docs/transformers/perf_infer_gpu_multi) using tensor parallelism for the Qwen-2.5-VL-family, the following errors arise. 
``` [rank0]: Traceback (most recent call last): [rank0]: File "/home/rupert/qwen2.5-vl-test/.venv/lib/python3.12/site-packages/torch/distributed/tensor/_sharding_prop.py", line 447, in propagate_op_sharding_non_cached [rank0]: output_sharding = sharding_prop_func(op_schema) [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: File "/home/rupert/qwen2.5-vl-test/.venv/lib/python3.12/site-packages/torch/distributed/tensor/_ops/_conv_ops.py", line 29, in convolution_rules [rank0]: assert isinstance(bias_spec, DTensorSpec) [rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ [rank0]: AssertionError ``` from this exception (callstack abridged): ``` [rank0]: File "/home/rupert/qwen2.5-vl-test/.venv/lib/python3.12/site-packages/torch/distributed/tensor/_sharding_prop.py", line 451, in propagate_op_sharding_non_cached [rank0]: raise RuntimeError( [rank0]: RuntimeError: Sharding propagation failed on op Op(op=aten.convolution.default, args_schema=Spec(R on (1064, 3, 2, 14, 14)), Spec(R on (1280, 3, 2, 14, 14)), None, [2, 14, 14], [0, 0, 0], [1, 1, 1], False, [0, 0, 0], 1 @ mesh: (1,)). ``` This seems independent of the number of processes specified. 
Minimal example: ```` from transformers import Qwen2_5_VLForConditionalGeneration, AutoProcessor, AutoTokenizer from qwen_vl_utils import process_vision_info model_id = "Qwen/Qwen2.5-VL-3B-Instruct" model = Qwen2_5_VLForConditionalGeneration.from_pretrained(model_id, tp_plan="auto") processor = AutoProcessor.from_pretrained(model_id) tokenizer = AutoTokenizer.from_pretrained(model_id) msgs = [{"role": "user","content": [ {"type": "image", "image": "https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo.jpeg"}, {"type": "text", "text": "What is shown in this picture?"}]}] text = processor.apply_chat_template(msgs, tokenize=False, add_generation_prompt=True) image_inputs, video_inputs = process_vision_info(msgs) inputs = processor(text=[text], images=image_inputs, videos=video_inputs, padding=True, return_tensors="pt") inputs.to(model.device) model.generate(**inputs) ```` Ran with `torchrun --nproc-per-node=1 test.py` Same issue persists when actually using 2 or 4 GPUs and specifying `--nproc-per-node=2` or `--nproc-per-node=4`. The issue seems to be with the sharding of the `conv3d-layers` and reproducible every time. I can also provide callstacks, if preferred (not sure because of the issue template). When removing the image input from `msgs`, the issue seems to be mixing `Tensors` and `DTensors` instead. Not sure if this is related, but might be a similar issue. I've tried to adapt the `base_model_tp_plan` that is [specified in transformers for these models](https://github.com/huggingface/transformers/blob/903944a411c35b8f7b2b51ada77a7c2a80e7fb88/src/transformers/models/qwen2_vl/configuration_qwen2_vl.py#L173) myself (and rebuilding transformers with that). To my understanding it might not be wanted to shard the ViT - which probably is not the issue here, but I'm kinda lost at this point. This issue seems to also exist for the Qwen-2-VL-family, which the `base_model_tp_plan` is derived from. 
### Expected behavior Normally, the outputs would be generated.
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers", "following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}", "gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}", "starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions", "organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs", "repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos", "events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}", "received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events", "type": "Bot", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39399/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 2 }
https://api.github.com/repos/huggingface/transformers/issues/39399/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39398
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39398/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39398/comments
https://api.github.com/repos/huggingface/transformers/issues/39398/events
https://github.com/huggingface/transformers/pull/39398
3,228,114,619
PR_kwDOCUB6oc6exJiA
39,398
Fix Lfm2 and common tests
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-14T09:46:58
2025-07-14T10:03:00
2025-07-14T10:02:59
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39398", "html_url": "https://github.com/huggingface/transformers/pull/39398", "diff_url": "https://github.com/huggingface/transformers/pull/39398.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39398.patch", "merged_at": "2025-07-14T10:02:59" }
# What does this PR do? cc @ydshieh
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39398/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39398/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39397
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39397/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39397/comments
https://api.github.com/repos/huggingface/transformers/issues/39397/events
https://github.com/huggingface/transformers/pull/39397
3,227,931,311
PR_kwDOCUB6oc6ewhdg
39,397
[RoPE] allow models to configure local RoPE
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-14T08:51:45
2025-08-07T13:36:24
2025-08-07T13:36:16
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39397", "html_url": "https://github.com/huggingface/transformers/pull/39397", "diff_url": "https://github.com/huggingface/transformers/pull/39397.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39397.patch", "merged_at": null }
# What does this PR do? Failing tests reported by CI bot after removing deprecations in https://github.com/huggingface/transformers/pull/38838 The main fixes are in test files and in ModernBert/Llama. The test was failing because RoPE needs to use local theta in local attention layers. I think at this point we can allow `local_rope_theta`, looks like ModernBert isn't the only arch using two different RoPE configurations
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39397/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39397/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39396
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39396/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39396/comments
https://api.github.com/repos/huggingface/transformers/issues/39396/events
https://github.com/huggingface/transformers/pull/39396
3,227,684,408
PR_kwDOCUB6oc6evrc_
39,396
[gemma3] fix bidirectional image mask
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-14T07:31:05
2025-07-22T08:04:57
2025-07-22T08:04:56
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39396", "html_url": "https://github.com/huggingface/transformers/pull/39396", "diff_url": "https://github.com/huggingface/transformers/pull/39396.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39396.patch", "merged_at": "2025-07-22T08:04:56" }
# What does this PR do? Fixes https://github.com/huggingface/transformers/issues/39389. There were two issues. First, using `getattr` on a dict will never look up a dict key, thus we were skipping `token_type_ids_mask`. Second, `kv_idx` can go beyond the `token_type_ids` length because it includes the new tokens to be generated. Using a simple check on `ids.shape` fails in vmap due to dynamic control flow. One option would be to create dummy padded token type ids using `kv_idx`, but I think a simple try-except is also fine
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39396/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39396/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39395
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39395/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39395/comments
https://api.github.com/repos/huggingface/transformers/issues/39395/events
https://github.com/huggingface/transformers/pull/39395
3,227,590,486
PR_kwDOCUB6oc6evXWV
39,395
🚨🚨🚨 [Trainer] Enable `average_tokens_across_devices` by default in `TrainingArguments`
{ "login": "Krish0909", "id": 134591243, "node_id": "U_kgDOCAWzCw", "avatar_url": "https://avatars.githubusercontent.com/u/134591243?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Krish0909", "html_url": "https://github.com/Krish0909", "followers_url": "https://api.github.com/users/Krish0909/followers", "following_url": "https://api.github.com/users/Krish0909/following{/other_user}", "gists_url": "https://api.github.com/users/Krish0909/gists{/gist_id}", "starred_url": "https://api.github.com/users/Krish0909/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Krish0909/subscriptions", "organizations_url": "https://api.github.com/users/Krish0909/orgs", "repos_url": "https://api.github.com/users/Krish0909/repos", "events_url": "https://api.github.com/users/Krish0909/events{/privacy}", "received_events_url": "https://api.github.com/users/Krish0909/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-14T06:52:43
2025-07-21T12:11:41
2025-07-21T12:11:20
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39395", "html_url": "https://github.com/huggingface/transformers/pull/39395", "diff_url": "https://github.com/huggingface/transformers/pull/39395.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39395.patch", "merged_at": "2025-07-21T12:11:20" }
Fixes #39392 This change improves loss calculation correctness for multi-GPU training by enabling proper token averaging across devices by default. ## What does this PR do? Changes the default value of `average_tokens_across_devices` from `False` to `True` in `TrainingArguments`. This ensures more accurate loss calculation in multi-GPU training scenarios by properly averaging tokens across devices. As noted in the original issue, this feature provides reproducibility and correctness benefits with no downsides, so there's no reason to keep it disabled by default. ## Before submitting - [x] Was this discussed/approved via a Github issue? Yes - #39392 - [x] Did you make sure to update the documentation with your changes? Yes - updated docstring ## Who can review? @zach-huggingface @SunMarc @qgallouedec (trainer maintainers)
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39395/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39395/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39393
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39393/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39393/comments
https://api.github.com/repos/huggingface/transformers/issues/39393/events
https://github.com/huggingface/transformers/pull/39393
3,227,353,412
PR_kwDOCUB6oc6eukVS
39,393
GLM-4 Update
{ "login": "zRzRzRzRzRzRzR", "id": 93239683, "node_id": "U_kgDOBY65gw", "avatar_url": "https://avatars.githubusercontent.com/u/93239683?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zRzRzRzRzRzRzR", "html_url": "https://github.com/zRzRzRzRzRzRzR", "followers_url": "https://api.github.com/users/zRzRzRzRzRzRzR/followers", "following_url": "https://api.github.com/users/zRzRzRzRzRzRzR/following{/other_user}", "gists_url": "https://api.github.com/users/zRzRzRzRzRzRzR/gists{/gist_id}", "starred_url": "https://api.github.com/users/zRzRzRzRzRzRzR/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zRzRzRzRzRzRzR/subscriptions", "organizations_url": "https://api.github.com/users/zRzRzRzRzRzRzR/orgs", "repos_url": "https://api.github.com/users/zRzRzRzRzRzRzR/repos", "events_url": "https://api.github.com/users/zRzRzRzRzRzRzR/events{/privacy}", "received_events_url": "https://api.github.com/users/zRzRzRzRzRzRzR/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-14T04:52:15
2025-08-02T20:08:38
2025-07-21T11:24:34
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39393", "html_url": "https://github.com/huggingface/transformers/pull/39393", "diff_url": "https://github.com/huggingface/transformers/pull/39393.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39393.patch", "merged_at": "2025-07-21T11:24:34" }
@ArthurZucker You can view the full version of this model
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39393/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39393/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39392
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39392/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39392/comments
https://api.github.com/repos/huggingface/transformers/issues/39392/events
https://github.com/huggingface/transformers/issues/39392
3,227,282,305
I_kwDOCUB6oc7AXGuB
39,392
Enabling `average_tokens_across_devices` by default in Trainer
{ "login": "MilkClouds", "id": 26109705, "node_id": "MDQ6VXNlcjI2MTA5NzA1", "avatar_url": "https://avatars.githubusercontent.com/u/26109705?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MilkClouds", "html_url": "https://github.com/MilkClouds", "followers_url": "https://api.github.com/users/MilkClouds/followers", "following_url": "https://api.github.com/users/MilkClouds/following{/other_user}", "gists_url": "https://api.github.com/users/MilkClouds/gists{/gist_id}", "starred_url": "https://api.github.com/users/MilkClouds/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MilkClouds/subscriptions", "organizations_url": "https://api.github.com/users/MilkClouds/orgs", "repos_url": "https://api.github.com/users/MilkClouds/repos", "events_url": "https://api.github.com/users/MilkClouds/events{/privacy}", "received_events_url": "https://api.github.com/users/MilkClouds/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-14T04:01:47
2025-07-21T12:11:22
2025-07-21T12:11:21
CONTRIBUTOR
null
null
null
null
Originally, https://github.com/huggingface/transformers/issues/34242 pointed out that the GA loss calculation was wrong for device sum. The fix https://github.com/huggingface/transformers/pull/34373 was merged into main, but the feature is disabled by default, with `average_tokens_across_devices` defaulting to False. I think the advantage of enabling it is reproducibility/correctness and the disadvantage is none, so there's no reason to disable it by default. (As long as the implementation of `average_tokens_across_devices` is correct!) Consider a case where this feature is strictly needed: - assume on GPU 0 the label was [1,-100,-100,....-100] - assume on GPU 1 the label was [1,1,1,...,1] Since the loss calculation gives the two GPUs equal weight, the unstable loss signal from GPU 0 (it comes from only a single item) will dominate (50%) much more than it should.
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39392/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39392/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39391
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39391/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39391/comments
https://api.github.com/repos/huggingface/transformers/issues/39391/events
https://github.com/huggingface/transformers/pull/39391
3,227,200,548
PR_kwDOCUB6oc6euD3U
39,391
[Docs] Fix typo in CustomTrainer compute_loss method and adjust loss reduction logic
{ "login": "MilkClouds", "id": 26109705, "node_id": "MDQ6VXNlcjI2MTA5NzA1", "avatar_url": "https://avatars.githubusercontent.com/u/26109705?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MilkClouds", "html_url": "https://github.com/MilkClouds", "followers_url": "https://api.github.com/users/MilkClouds/followers", "following_url": "https://api.github.com/users/MilkClouds/following{/other_user}", "gists_url": "https://api.github.com/users/MilkClouds/gists{/gist_id}", "starred_url": "https://api.github.com/users/MilkClouds/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MilkClouds/subscriptions", "organizations_url": "https://api.github.com/users/MilkClouds/orgs", "repos_url": "https://api.github.com/users/MilkClouds/repos", "events_url": "https://api.github.com/users/MilkClouds/events{/privacy}", "received_events_url": "https://api.github.com/users/MilkClouds/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-14T03:00:21
2025-07-14T16:25:15
2025-07-14T16:25:06
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39391", "html_url": "https://github.com/huggingface/transformers/pull/39391", "diff_url": "https://github.com/huggingface/transformers/pull/39391.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39391.patch", "merged_at": "2025-07-14T16:25:06" }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes an error in the loss reduction logic in the docs. The source implementation is correct but the docs were wrong; you may refer to the following. https://github.com/huggingface/transformers/blob/9bec2654ed5b4ac43e880dc7e3cb2c18aeae70a9/src/transformers/loss/loss_utils.py#L28-L42 https://github.com/huggingface/transformers/blob/a1ad9197c5756858e9014a0e01fe5fb1791efdf2/docs/source/en/trainer.md#L189-L201 ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? 
Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? @stevhliu may be able to check this! <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface, @SunMarc and @qgallouedec - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39391/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39391/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39390
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39390/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39390/comments
https://api.github.com/repos/huggingface/transformers/issues/39390/events
https://github.com/huggingface/transformers/pull/39390
3,227,177,666
PR_kwDOCUB6oc6et_AC
39,390
fix CI bug for shieldgemma2 model
{ "login": "kaixuanliu", "id": 13268042, "node_id": "MDQ6VXNlcjEzMjY4MDQy", "avatar_url": "https://avatars.githubusercontent.com/u/13268042?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kaixuanliu", "html_url": "https://github.com/kaixuanliu", "followers_url": "https://api.github.com/users/kaixuanliu/followers", "following_url": "https://api.github.com/users/kaixuanliu/following{/other_user}", "gists_url": "https://api.github.com/users/kaixuanliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/kaixuanliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kaixuanliu/subscriptions", "organizations_url": "https://api.github.com/users/kaixuanliu/orgs", "repos_url": "https://api.github.com/users/kaixuanliu/repos", "events_url": "https://api.github.com/users/kaixuanliu/events{/privacy}", "received_events_url": "https://api.github.com/users/kaixuanliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-14T02:43:59
2025-07-14T07:18:23
2025-07-14T07:18:22
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39390", "html_url": "https://github.com/huggingface/transformers/pull/39390", "diff_url": "https://github.com/huggingface/transformers/pull/39390.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39390.patch", "merged_at": null }
@ydshieh pls help review, thx
{ "login": "kaixuanliu", "id": 13268042, "node_id": "MDQ6VXNlcjEzMjY4MDQy", "avatar_url": "https://avatars.githubusercontent.com/u/13268042?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kaixuanliu", "html_url": "https://github.com/kaixuanliu", "followers_url": "https://api.github.com/users/kaixuanliu/followers", "following_url": "https://api.github.com/users/kaixuanliu/following{/other_user}", "gists_url": "https://api.github.com/users/kaixuanliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/kaixuanliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kaixuanliu/subscriptions", "organizations_url": "https://api.github.com/users/kaixuanliu/orgs", "repos_url": "https://api.github.com/users/kaixuanliu/repos", "events_url": "https://api.github.com/users/kaixuanliu/events{/privacy}", "received_events_url": "https://api.github.com/users/kaixuanliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39390/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39390/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39389
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39389/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39389/comments
https://api.github.com/repos/huggingface/transformers/issues/39389/events
https://github.com/huggingface/transformers/issues/39389
3,226,814,806
I_kwDOCUB6oc7AVUlW
39,389
Gemma3 bidirectional mask for image tokens isn't reaching attention forward
{ "login": "brb-nv", "id": 169953907, "node_id": "U_kgDOCiFKcw", "avatar_url": "https://avatars.githubusercontent.com/u/169953907?v=4", "gravatar_id": "", "url": "https://api.github.com/users/brb-nv", "html_url": "https://github.com/brb-nv", "followers_url": "https://api.github.com/users/brb-nv/followers", "following_url": "https://api.github.com/users/brb-nv/following{/other_user}", "gists_url": "https://api.github.com/users/brb-nv/gists{/gist_id}", "starred_url": "https://api.github.com/users/brb-nv/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/brb-nv/subscriptions", "organizations_url": "https://api.github.com/users/brb-nv/orgs", "repos_url": "https://api.github.com/users/brb-nv/repos", "events_url": "https://api.github.com/users/brb-nv/events{/privacy}", "received_events_url": "https://api.github.com/users/brb-nv/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-07-13T20:55:44
2025-07-22T08:04:57
2025-07-22T08:04:57
NONE
null
null
null
null
### System Info Transformers version: 4.53.2 Ubuntu x86, Nvidia H100 - I've noticed a previous [issue](https://github.com/huggingface/transformers/issues/38053) about the bidirectional mask for image tokens, and PRs fixing it were merged[[1](https://github.com/huggingface/transformers/pull/38080)][[2](https://github.com/huggingface/transformers/pull/38295)]. - I was looking at the attention mask being passed into attention forward at prefill for a single image and noticed that the mask is purely causal, i.e., there is no bidirectional mask for image tokens. - I'm afraid there might be a plumbing issue. Can someone please take a look? ### Who can help? @zucchini-nlp @ArthurZucker @Cyrilvallez ### Reproduction Script to reproduce: ``` from transformers import AutoProcessor, Gemma3ForConditionalGeneration model_dir = "path/to/gemma-3-4b-it/" model = Gemma3ForConditionalGeneration.from_pretrained(model_dir) processor = AutoProcessor.from_pretrained(model_dir) messages = [ { "role": "system", "content": [ {"type": "text", "text": "You are a helpful assistant."} ] }, { "role": "user", "content": [ {"type": "image", "url": "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/bee.jpg"}, {"type": "text", "text": "Describe this image in detail."}, ] }, ] inputs = processor.apply_chat_template(messages, tokenize=True, return_dict=True, return_tensors="pt", add_generation_prompt=True) generate_ids = model.generate(**inputs, do_sample=False, num_beams=1, top_p=None, top_k=None, max_new_tokens=2) outputs = processor.batch_decode(generate_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False)[0] print(outputs) ``` ### Expected behavior The mask should be bidirectional for images as shown in this PR description: Reference 1: https://github.com/huggingface/transformers/issues/38053 Reference 2: https://github.com/huggingface/transformers/pull/38295 Request to also validate the multi-image case.
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39389/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/huggingface/transformers/issues/39389/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39388
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39388/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39388/comments
https://api.github.com/repos/huggingface/transformers/issues/39388/events
https://github.com/huggingface/transformers/issues/39388
3,226,729,044
I_kwDOCUB6oc7AU_pU
39,388
Using resnet-18 in flax
{ "login": "aagha6", "id": 39412753, "node_id": "MDQ6VXNlcjM5NDEyNzUz", "avatar_url": "https://avatars.githubusercontent.com/u/39412753?v=4", "gravatar_id": "", "url": "https://api.github.com/users/aagha6", "html_url": "https://github.com/aagha6", "followers_url": "https://api.github.com/users/aagha6/followers", "following_url": "https://api.github.com/users/aagha6/following{/other_user}", "gists_url": "https://api.github.com/users/aagha6/gists{/gist_id}", "starred_url": "https://api.github.com/users/aagha6/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/aagha6/subscriptions", "organizations_url": "https://api.github.com/users/aagha6/orgs", "repos_url": "https://api.github.com/users/aagha6/repos", "events_url": "https://api.github.com/users/aagha6/events{/privacy}", "received_events_url": "https://api.github.com/users/aagha6/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-07-13T18:37:22
2025-07-14T17:56:27
2025-07-14T16:00:27
NONE
null
null
null
null
### System Info - `transformers` version: 4.52.4 - Platform: macOS-15.5-arm64-arm-64bit - Python version: 3.9.21 - Huggingface_hub version: 0.32.4 - Safetensors version: 0.5.3 - Accelerate version: not installed - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (GPU?): 2.6.0 (False) - Tensorflow version (GPU?): 2.18.0 (False) - Flax version (CPU?/GPU?/TPU?): 0.8.5 (cpu) - Jax version: 0.4.30 - JaxLib version: 0.4.30 - Using distributed or parallel set-up in script?: No ### Who can help? @amyeroberts @Rocketknight1 ### Information - [x] The official example scripts - [ ] My own modified scripts ### Tasks - [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction I'm trying to use resnet-18 in flax; for some reason, it seems like the normalization layers can't be loaded. I load the model with the command `model = FlaxResNetModel.from_pretrained("microsoft/resnet-18", dtype=jnp.float32)`. I get this error. It's also performing much worse as a backbone for my task in comparison with `resnet-26`. 
`Some weights of FlaxResNetModel were not initialized from the model checkpoint at microsoft/resnet-18 and are newly initialized: {('batch_stats', 'encoder', 'stages', '2', 'layers', '0', 'layer', 'layer_0', 'normalization', 'var'), ('batch_stats', 'encoder', 'stages', '3', 'layers', '1', 'layer', 'layer_1', 'normalization', 'mean'), ('params', 'encoder', 'stages', '0', 'layers', '1', 'layer', 'layer_1', 'normalization', 'bias'), ('batch_stats', 'encoder', 'stages', '1', 'layers', '0', 'layer', 'layer_1', 'normalization', 'mean'), ('batch_stats', 'encoder', 'stages', '0', 'layers', '0', 'layer', 'layer_1', 'normalization', 'var'), ('params', 'encoder', 'stages', '2', 'layers', '1', 'layer', 'layer_1', 'normalization', 'scale'), ('params', 'encoder', 'stages', '2', 'layers', '0', 'layer', 'layer_1', 'convolution', 'kernel'), ('params', 'encoder', 'stages', '1', 'layers', '0', 'layer', 'layer_0', 'normalization', 'bias'), ('batch_stats', 'encoder', 'stages', '3', 'layers', '0', 'layer', 'layer_0', 'normalization', 'var'), ('batch_stats', 'encoder', 'stages', '2', 'layers', '1', 'layer', 'layer_0', 'normalization', 'var'), ('params', 'encoder', 'stages', '3', 'layers', '0', 'layer', 'layer_1', 'normalization', 'scale'), ('params', 'encoder', 'stages', '1', 'layers', '1', 'layer', 'layer_1', 'convolution', 'kernel'), ('batch_stats', 'encoder', 'stages', '2', 'layers', '0', 'layer', 'layer_1', 'normalization', 'mean'), ('batch_stats', 'encoder', 'stages', '0', 'layers', '1', 'layer', 'layer_1', 'normalization', 'mean'), ('params', 'encoder', 'stages', '2', 'layers', '0', 'layer', 'layer_0', 'normalization', 'bias'), ('batch_stats', 'encoder', 'stages', '1', 'layers', '1', 'layer', 'layer_0', 'normalization', 'mean'), ('batch_stats', 'encoder', 'stages', '2', 'layers', '1', 'layer', 'layer_1', 'normalization', 'var'), ('batch_stats', 'encoder', 'stages', '0', 'layers', '0', 'layer', 'layer_0', 'normalization', 'var'), ('params', 'encoder', 'stages', '1', 'layers', '0', 
'layer', 'layer_1', 'normalization', 'bias'), ('params', 'encoder', 'stages', '1', 'layers', '1', 'layer', 'layer_0', 'normalization', 'bias'), ('params', 'encoder', 'stages', '2', 'layers', '1', 'layer', 'layer_0', 'normalization', 'bias'), ('params', 'encoder', 'stages', '3', 'layers', '0', 'layer', 'layer_0', 'normalization', 'bias'), ('params', 'encoder', 'stages', '2', 'layers', '0', 'layer', 'layer_1', 'normalization', 'bias'), ('batch_stats', 'encoder', 'stages', '1', 'layers', '0', 'layer', 'layer_0', 'normalization', 'mean'), ('params', 'encoder', 'stages', '0', 'layers', '0', 'layer', 'layer_0', 'normalization', 'bias'), ('batch_stats', 'encoder', 'stages', '3', 'layers', '0', 'layer', 'layer_1', 'normalization', 'mean'), ('params', 'encoder', 'stages', '2', 'layers', '1', 'layer', 'layer_1', 'normalization', 'bias'), ('params', 'encoder', 'stages', '1', 'layers', '1', 'layer', 'layer_1', 'normalization', 'scale'), ('params', 'encoder', 'stages', '1', 'layers', '0', 'layer', 'layer_0', 'convolution', 'kernel'), ('params', 'encoder', 'stages', '3', 'layers', '0', 'layer', 'layer_1', 'normalization', 'bias'), ('params', 'encoder', 'stages', '3', 'layers', '1', 'layer', 'layer_0', 'normalization', 'bias'), ('batch_stats', 'encoder', 'stages', '2', 'layers', '0', 'layer', 'layer_0', 'normalization', 'mean'), ('params', 'encoder', 'stages', '0', 'layers', '0', 'layer', 'layer_1', 'normalization', 'bias'), ('batch_stats', 'encoder', 'stages', '0', 'layers', '0', 'layer', 'layer_1', 'normalization', 'mean'), ('batch_stats', 'encoder', 'stages', '2', 'layers', '1', 'layer', 'layer_0', 'normalization', 'mean'), ('batch_stats', 'encoder', 'stages', '3', 'layers', '0', 'layer', 'layer_0', 'normalization', 'mean'), ('params', 'encoder', 'stages', '2', 'layers', '0', 'layer', 'layer_0', 'convolution', 'kernel'), ('batch_stats', 'encoder', 'stages', '0', 'layers', '1', 'layer', 'layer_0', 'normalization', 'var'), ('params', 'encoder', 'stages', '0', 'layers', '1', 
'layer', 'layer_0', 'convolution', 'kernel'), ('params', 'encoder', 'stages', '0', 'layers', '0', 'layer', 'layer_1', 'convolution', 'kernel'), ('params', 'encoder', 'stages', '3', 'layers', '1', 'layer', 'layer_1', 'normalization', 'scale'), ('params', 'encoder', 'stages', '2', 'layers', '1', 'layer', 'layer_0', 'convolution', 'kernel'), ('params', 'encoder', 'stages', '3', 'layers', '0', 'layer', 'layer_0', 'convolution', 'kernel'), ('batch_stats', 'encoder', 'stages', '2', 'layers', '1', 'layer', 'layer_1', 'normalization', 'mean'), ('batch_stats', 'encoder', 'stages', '0', 'layers', '0', 'layer', 'layer_0', 'normalization', 'mean'), ('batch_stats', 'encoder', 'stages', '1', 'layers', '1', 'layer', 'layer_1', 'normalization', 'mean'), ('batch_stats', 'encoder', 'stages', '0', 'layers', '1', 'layer', 'layer_1', 'normalization', 'var'), ('params', 'encoder', 'stages', '0', 'layers', '0', 'layer', 'layer_0', 'convolution', 'kernel'), ('params', 'encoder', 'stages', '1', 'layers', '1', 'layer', 'layer_1', 'normalization', 'bias'), ('params', 'encoder', 'stages', '0', 'layers', '1', 'layer', 'layer_1', 'convolution', 'kernel'), ('params', 'encoder', 'stages', '2', 'layers', '1', 'layer', 'layer_1', 'convolution', 'kernel'), ('batch_stats', 'encoder', 'stages', '3', 'layers', '1', 'layer', 'layer_0', 'normalization', 'var'), ('params', 'encoder', 'stages', '0', 'layers', '1', 'layer', 'layer_0', 'normalization', 'scale'), ('batch_stats', 'encoder', 'stages', '1', 'layers', '0', 'layer', 'layer_1', 'normalization', 'var'), ('batch_stats', 'encoder', 'stages', '3', 'layers', '1', 'layer', 'layer_1', 'normalization', 'var'), ('params', 'encoder', 'stages', '3', 'layers', '1', 'layer', 'layer_1', 'normalization', 'bias'), ('params', 'encoder', 'stages', '0', 'layers', '0', 'layer', 'layer_0', 'normalization', 'scale'), ('batch_stats', 'encoder', 'stages', '2', 'layers', '0', 'layer', 'layer_1', 'normalization', 'var'), ('batch_stats', 'encoder', 'stages', '0', 'layers', 
'1', 'layer', 'layer_0', 'normalization', 'mean'), ('params', 'encoder', 'stages', '0', 'layers', '1', 'layer', 'layer_1', 'normalization', 'scale'), ('batch_stats', 'encoder', 'stages', '1', 'layers', '1', 'layer', 'layer_0', 'normalization', 'var'), ('params', 'encoder', 'stages', '1', 'layers', '0', 'layer', 'layer_0', 'normalization', 'scale'), ('params', 'encoder', 'stages', '3', 'layers', '1', 'layer', 'layer_0', 'normalization', 'scale'), ('params', 'encoder', 'stages', '0', 'layers', '0', 'layer', 'layer_1', 'normalization', 'scale'), ('params', 'encoder', 'stages', '0', 'layers', '1', 'layer', 'layer_0', 'normalization', 'bias'), ('params', 'encoder', 'stages', '2', 'layers', '0', 'layer', 'layer_0', 'normalization', 'scale'), ('params', 'encoder', 'stages', '3', 'layers', '0', 'layer', 'layer_1', 'convolution', 'kernel'), ('batch_stats', 'encoder', 'stages', '1', 'layers', '1', 'layer', 'layer_1', 'normalization', 'var'), ('params', 'encoder', 'stages', '1', 'layers', '0', 'layer', 'layer_1', 'normalization', 'scale'), ('params', 'encoder', 'stages', '3', 'layers', '1', 'layer', 'layer_0', 'convolution', 'kernel'), ('batch_stats', 'encoder', 'stages', '1', 'layers', '0', 'layer', 'layer_0', 'normalization', 'var'), ('params', 'encoder', 'stages', '1', 'layers', '1', 'layer', 'layer_0', 'normalization', 'scale'), ('batch_stats', 'encoder', 'stages', '3', 'layers', '1', 'layer', 'layer_0', 'normalization', 'mean'), ('batch_stats', 'encoder', 'stages', '3', 'layers', '0', 'layer', 'layer_1', 'normalization', 'var'), ('params', 'encoder', 'stages', '2', 'layers', '1', 'layer', 'layer_0', 'normalization', 'scale'), ('params', 'encoder', 'stages', '1', 'layers', '0', 'layer', 'layer_1', 'convolution', 'kernel'), ('params', 'encoder', 'stages', '3', 'layers', '0', 'layer', 'layer_0', 'normalization', 'scale'), ('params', 'encoder', 'stages', '2', 'layers', '0', 'layer', 'layer_1', 'normalization', 'scale'), ('params', 'encoder', 'stages', '1', 'layers', '1', 
'layer', 'layer_0', 'convolution', 'kernel'), ('params', 'encoder', 'stages', '3', 'layers', '1', 'layer', 'layer_1', 'convolution', 'kernel')}`

### Expected behavior

I'm expecting to get a message similar to the one I get when loading resnet-26: `Some weights of the model checkpoint at microsoft/resnet-26 were not used when initializing FlaxResNetModel: {('params', 'classifier', '1', 'kernel'), ('params', 'classifier', '1', 'bias')}`
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39388/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39388/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39387
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39387/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39387/comments
https://api.github.com/repos/huggingface/transformers/issues/39387/events
https://github.com/huggingface/transformers/pull/39387
3,226,676,824
PR_kwDOCUB6oc6esWls
39,387
fix(image_processing_utils_fast.py): resolve KeyError in center_crop method
{ "login": "mshsheikh", "id": 160324527, "node_id": "U_kgDOCY5brw", "avatar_url": "https://avatars.githubusercontent.com/u/160324527?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mshsheikh", "html_url": "https://github.com/mshsheikh", "followers_url": "https://api.github.com/users/mshsheikh/followers", "following_url": "https://api.github.com/users/mshsheikh/following{/other_user}", "gists_url": "https://api.github.com/users/mshsheikh/gists{/gist_id}", "starred_url": "https://api.github.com/users/mshsheikh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mshsheikh/subscriptions", "organizations_url": "https://api.github.com/users/mshsheikh/orgs", "repos_url": "https://api.github.com/users/mshsheikh/repos", "events_url": "https://api.github.com/users/mshsheikh/events{/privacy}", "received_events_url": "https://api.github.com/users/mshsheikh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-13T17:26:15
2025-07-13T20:15:20
2025-07-13T20:15:20
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39387", "html_url": "https://github.com/huggingface/transformers/pull/39387", "diff_url": "https://github.com/huggingface/transformers/pull/39387.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39387.patch", "merged_at": null }
null
{ "login": "mshsheikh", "id": 160324527, "node_id": "U_kgDOCY5brw", "avatar_url": "https://avatars.githubusercontent.com/u/160324527?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mshsheikh", "html_url": "https://github.com/mshsheikh", "followers_url": "https://api.github.com/users/mshsheikh/followers", "following_url": "https://api.github.com/users/mshsheikh/following{/other_user}", "gists_url": "https://api.github.com/users/mshsheikh/gists{/gist_id}", "starred_url": "https://api.github.com/users/mshsheikh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mshsheikh/subscriptions", "organizations_url": "https://api.github.com/users/mshsheikh/orgs", "repos_url": "https://api.github.com/users/mshsheikh/repos", "events_url": "https://api.github.com/users/mshsheikh/events{/privacy}", "received_events_url": "https://api.github.com/users/mshsheikh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39387/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39387/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39386
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39386/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39386/comments
https://api.github.com/repos/huggingface/transformers/issues/39386/events
https://github.com/huggingface/transformers/pull/39386
3,226,575,677
PR_kwDOCUB6oc6esCV1
39,386
Update SAM/SAM HQ attention implementation + fix Cuda sync issues
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 6886428489, "node_id": "LA_kwDOCUB6oc8AAAABmnaPSQ", "url": "https://api.github.com/repos/huggingface/transformers/labels/run-slow", "name": "run-slow", "color": "E1D519", "default": false, "description": "" } ]
closed
false
null
[]
null
[]
2025-07-13T15:09:37
2025-07-18T22:46:28
2025-07-18T22:46:28
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39386", "html_url": "https://github.com/huggingface/transformers/pull/39386", "diff_url": "https://github.com/huggingface/transformers/pull/39386.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39386.patch", "merged_at": "2025-07-18T22:46:28" }
# What does this PR do?

Update SAM and SAM HQ attention implementation, and fix two lines that were causing unnecessary CUDA syncs when profiling with torch profiler. Mostly split from https://github.com/huggingface/transformers/pull/32317 as that PR is becoming huge.
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39386/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/huggingface/transformers/issues/39386/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39385
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39385/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39385/comments
https://api.github.com/repos/huggingface/transformers/issues/39385/events
https://github.com/huggingface/transformers/pull/39385
3,226,560,387
PR_kwDOCUB6oc6er_UW
39,385
Add fast image processor SAM
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-13T14:47:05
2025-07-18T17:27:16
2025-07-18T17:27:16
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39385", "html_url": "https://github.com/huggingface/transformers/pull/39385", "diff_url": "https://github.com/huggingface/transformers/pull/39385.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39385.patch", "merged_at": "2025-07-18T17:27:16" }
# What does this PR do?

Split from https://github.com/huggingface/transformers/pull/32317 as this PR is becoming huge
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39385/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39385/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39384
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39384/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39384/comments
https://api.github.com/repos/huggingface/transformers/issues/39384/events
https://github.com/huggingface/transformers/pull/39384
3,226,383,535
PR_kwDOCUB6oc6ercZf
39,384
Fix invalid property
{ "login": "cyyever", "id": 17618148, "node_id": "MDQ6VXNlcjE3NjE4MTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cyyever", "html_url": "https://github.com/cyyever", "followers_url": "https://api.github.com/users/cyyever/followers", "following_url": "https://api.github.com/users/cyyever/following{/other_user}", "gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}", "starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cyyever/subscriptions", "organizations_url": "https://api.github.com/users/cyyever/orgs", "repos_url": "https://api.github.com/users/cyyever/repos", "events_url": "https://api.github.com/users/cyyever/events{/privacy}", "received_events_url": "https://api.github.com/users/cyyever/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-13T10:31:59
2025-07-15T13:18:32
2025-07-15T12:11:38
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39384", "html_url": "https://github.com/huggingface/transformers/pull/39384", "diff_url": "https://github.com/huggingface/transformers/pull/39384.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39384.patch", "merged_at": "2025-07-15T12:11:38" }
# What does this PR do?

ruff detects some invalid property definitions. This PR fixes them.
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39384/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39384/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39383
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39383/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39383/comments
https://api.github.com/repos/huggingface/transformers/issues/39383/events
https://github.com/huggingface/transformers/pull/39383
3,226,336,065
PR_kwDOCUB6oc6erSGK
39,383
Enable some ruff checks for performance and readability
{ "login": "cyyever", "id": 17618148, "node_id": "MDQ6VXNlcjE3NjE4MTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cyyever", "html_url": "https://github.com/cyyever", "followers_url": "https://api.github.com/users/cyyever/followers", "following_url": "https://api.github.com/users/cyyever/following{/other_user}", "gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}", "starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cyyever/subscriptions", "organizations_url": "https://api.github.com/users/cyyever/orgs", "repos_url": "https://api.github.com/users/cyyever/repos", "events_url": "https://api.github.com/users/cyyever/events{/privacy}", "received_events_url": "https://api.github.com/users/cyyever/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-13T09:54:40
2025-07-17T13:41:05
2025-07-17T13:22:00
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39383", "html_url": "https://github.com/huggingface/transformers/pull/39383", "diff_url": "https://github.com/huggingface/transformers/pull/39383.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39383.patch", "merged_at": "2025-07-17T13:21:59" }
# What does this PR do?

Enables ruff checks and fixes some code. These checks are:

```
PERF102: Checks for uses of dict.items() that discard either the key or the value when iterating over the dictionary.
PLC1802: Checks for len calls on sequences in a boolean test context.
PLC0208: Checks for iteration over a set literal where each element in the set is itself a literal value.
```
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39383/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39383/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39382
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39382/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39382/comments
https://api.github.com/repos/huggingface/transformers/issues/39382/events
https://github.com/huggingface/transformers/issues/39382
3,225,941,119
I_kwDOCUB6oc7AR_R_
39,382
Mask2FormerImageProcessor yields inconsistent results between single and batch inference
{ "login": "roboserg", "id": 4758917, "node_id": "MDQ6VXNlcjQ3NTg5MTc=", "avatar_url": "https://avatars.githubusercontent.com/u/4758917?v=4", "gravatar_id": "", "url": "https://api.github.com/users/roboserg", "html_url": "https://github.com/roboserg", "followers_url": "https://api.github.com/users/roboserg/followers", "following_url": "https://api.github.com/users/roboserg/following{/other_user}", "gists_url": "https://api.github.com/users/roboserg/gists{/gist_id}", "starred_url": "https://api.github.com/users/roboserg/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/roboserg/subscriptions", "organizations_url": "https://api.github.com/users/roboserg/orgs", "repos_url": "https://api.github.com/users/roboserg/repos", "events_url": "https://api.github.com/users/roboserg/events{/privacy}", "received_events_url": "https://api.github.com/users/roboserg/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-07-12T23:51:58
2025-08-20T08:02:55
2025-08-20T08:02:55
NONE
null
null
null
null
### The Bug

Short version - processing an image alone versus in a batch with other images of different aspect ratios yields different prediction scores, even though the resolution of the image in the batch does not change (only padded pixels are added, which should be ignored with `pixel_mask`).

When using `Mask2FormerImageProcessor` and `Mask2FormerForUniversalSegmentation`, processing a single image can yield different prediction scores compared to processing the same image within a batch of multiple images (especially those with varying aspect ratios).

This inconsistency appears to be caused by the padding strategy during batch processing. When multiple images of different sizes are batched, the processor pads them to match the largest dimensions in the batch. This added padding seems to influence the model's predictions for the original, unpadded areas of an image. This is problematic because an image's prediction should be independent of other images in the batch.

### Who can help?

@amyeroberts, @qubvel, @NielsRogge

### Information

- [x] The official example scripts
- [x] My own modified scripts

### Tasks

- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)

### Reproduction

The following script demonstrates the issue. It processes a single image, then the same image in a batch with another, and compares the prediction scores. The discrepancy is evident when the image processor uses padding in a batch.

```python
import torch
import transformers
from transformers import Mask2FormerForUniversalSegmentation, Mask2FormerImageProcessor
from transformers.image_utils import load_image


def run_inference(inputs, model, processor):
    with torch.no_grad():
        outputs = model(**inputs)
    result = processor.post_process_instance_segmentation(outputs, threshold=0.1)
    print("Input pixel_values shape:", inputs["pixel_values"].shape)
    print("number of predictions first image:", len(result[0]["segments_info"]))
    return result


def main():
    image1 = load_image("http://images.cocodataset.org/val2017/000000039769.jpg")
    image2 = load_image(
        "http://farm9.staticflickr.com/8520/8524521781_facbf765e1_z.jpg"
    )
    print(f"Image 1: {image1.size}, Image 2: {image2.size}")

    model = Mask2FormerForUniversalSegmentation.from_pretrained(
        "facebook/mask2former-swin-tiny-coco-instance"
    )
    model.to("cuda").eval()

    sizes = [(512, 512), 512, {"longest_edge": 512, "shortest_edge": 500}]
    for size in sizes:
        print("=" * 80)
        print(f"Using Mask2FormerImageProcessor with size: {size}")
        processor = Mask2FormerImageProcessor(size=size)
        inputs1 = processor(images=[image1], return_tensors="pt").to("cuda")
        inputs2 = processor(images=[image1, image2], return_tensors="pt").to("cuda")
        res1 = run_inference(inputs1, model, processor)
        res2 = run_inference(inputs2, model, processor)
        res1_label_id_15_scores = [
            seg["score"] for seg in res1[0]["segments_info"] if seg["label_id"] == 15
        ]
        res2_label_id_15_scores = [
            seg["score"] for seg in res2[0]["segments_info"] if seg["label_id"] == 15
        ]
        print("Scores for label_id 15 (single image):", res1_label_id_15_scores)
        print("Scores for label_id 15 (first image in batch):", res2_label_id_15_scores)


if __name__ == "__main__":
    print("Transformers version:", transformers.__version__)
    print("PyTorch version:", torch.__version__)
    print()
    main()
```

**Observed Output**

Here is the output from running the script. Notice how the scores differ for the second and third test cases.

```
Transformers version: 4.53.2
PyTorch version: 2.3.0+cu121

Image 1: (640, 480), Image 2: (480, 640)
================================================================================
Using Mask2FormerImageProcessor with size: (512, 512)
Input pixel_values shape: torch.Size([1, 3, 512, 512])
number of predictions first image: 10
Input pixel_values shape: torch.Size([2, 3, 512, 512])
number of predictions first image: 10
Scores for label_id 15 (single image): [0.101966, 0.963214, 0.968419]
Scores for label_id 15 (first image in batch): [0.101966, 0.963214, 0.968419]
================================================================================
Using Mask2FormerImageProcessor with size: 512
Input pixel_values shape: torch.Size([1, 3, 512, 704])
number of predictions first image: 11
Input pixel_values shape: torch.Size([2, 3, 704, 704])
number of predictions first image: 11
Scores for label_id 15 (single image): [0.973011, 0.967476]
Scores for label_id 15 (first image in batch): [0.975895, 0.97153]
================================================================================
Using Mask2FormerImageProcessor with size: {'longest_edge': 512, 'shortest_edge': 500}
Input pixel_values shape: torch.Size([1, 3, 384, 512])
number of predictions first image: 10
Input pixel_values shape: torch.Size([2, 3, 512, 512])
number of predictions first image: 9
Scores for label_id 15 (single image): [0.104092, 0.965268, 0.96922]
Scores for label_id 15 (first image in batch): [0.976103, 0.971285]
```

### Expected behavior

The prediction scores for an image should be identical regardless of whether it is processed individually or as part of a batch. The padding applied during batching should not affect the model's output for the non-padded areas of the image.

### System Info

transformers version 4.53.2
PyTorch version: 2.6.0+cu124
Platform: Linux (WSL2)
Python 3.9.18
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers", "following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}", "gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}", "starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions", "organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs", "repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos", "events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}", "received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events", "type": "Bot", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39382/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39382/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39381
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39381/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39381/comments
https://api.github.com/repos/huggingface/transformers/issues/39381/events
https://github.com/huggingface/transformers/pull/39381
3,225,826,812
PR_kwDOCUB6oc6epoMZ
39,381
Add Apertus
{ "login": "EduardDurech", "id": 39579228, "node_id": "MDQ6VXNlcjM5NTc5MjI4", "avatar_url": "https://avatars.githubusercontent.com/u/39579228?v=4", "gravatar_id": "", "url": "https://api.github.com/users/EduardDurech", "html_url": "https://github.com/EduardDurech", "followers_url": "https://api.github.com/users/EduardDurech/followers", "following_url": "https://api.github.com/users/EduardDurech/following{/other_user}", "gists_url": "https://api.github.com/users/EduardDurech/gists{/gist_id}", "starred_url": "https://api.github.com/users/EduardDurech/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/EduardDurech/subscriptions", "organizations_url": "https://api.github.com/users/EduardDurech/orgs", "repos_url": "https://api.github.com/users/EduardDurech/repos", "events_url": "https://api.github.com/users/EduardDurech/events{/privacy}", "received_events_url": "https://api.github.com/users/EduardDurech/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 1843244711, "node_id": "MDU6TGFiZWwxODQzMjQ0NzEx", "url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model", "name": "New model", "color": "fbca04", "default": false, "description": "" } ]
closed
false
null
[]
null
[]
2025-07-12T21:58:47
2025-09-21T17:03:18
2025-08-28T09:55:43
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39381", "html_url": "https://github.com/huggingface/transformers/pull/39381", "diff_url": "https://github.com/huggingface/transformers/pull/39381.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39381.patch", "merged_at": "2025-08-28T09:55:43" }
Pre-release of Apertus from the Swiss AI Initiative. Main modifications from Llama: xIELU activation, QK-norm. @ArthurZucker
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39381/reactions", "total_count": 5, "+1": 5, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39381/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39380
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39380/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39380/comments
https://api.github.com/repos/huggingface/transformers/issues/39380/events
https://github.com/huggingface/transformers/pull/39380
3,225,786,755
PR_kwDOCUB6oc6epffe
39,380
update docker file to use latest `timm` (for `perception_lm`)
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-12T21:18:57
2025-07-12T21:31:59
2025-07-12T21:19:37
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39380", "html_url": "https://github.com/huggingface/transformers/pull/39380", "diff_url": "https://github.com/huggingface/transformers/pull/39380.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39380.patch", "merged_at": "2025-07-12T21:19:37" }
# What does this PR do? Fixes the whole `perception_lm` test suite, which is failing. Merging without waiting to make CI better.
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39380/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39380/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39379
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39379/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39379/comments
https://api.github.com/repos/huggingface/transformers/issues/39379/events
https://github.com/huggingface/transformers/issues/39379
3,225,463,136
I_kwDOCUB6oc7AQKlg
39,379
Handling of full_text_row_masked_out_mask in mllama is incorrect.
{ "login": "nshepperd", "id": 406019, "node_id": "MDQ6VXNlcjQwNjAxOQ==", "avatar_url": "https://avatars.githubusercontent.com/u/406019?v=4", "gravatar_id": "", "url": "https://api.github.com/users/nshepperd", "html_url": "https://github.com/nshepperd", "followers_url": "https://api.github.com/users/nshepperd/followers", "following_url": "https://api.github.com/users/nshepperd/following{/other_user}", "gists_url": "https://api.github.com/users/nshepperd/gists{/gist_id}", "starred_url": "https://api.github.com/users/nshepperd/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nshepperd/subscriptions", "organizations_url": "https://api.github.com/users/nshepperd/orgs", "repos_url": "https://api.github.com/users/nshepperd/repos", "events_url": "https://api.github.com/users/nshepperd/events{/privacy}", "received_events_url": "https://api.github.com/users/nshepperd/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2796628563, "node_id": "MDU6TGFiZWwyNzk2NjI4NTYz", "url": "https://api.github.com/repos/huggingface/transformers/labels/WIP", "name": "WIP", "color": "234C99", "default": false, "description": "Label your PR/Issue with WIP for some long outstanding Issues/PRs that are work in progress" }, { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
open
false
null
[]
null
[]
2025-07-12T15:07:43
2025-08-29T12:01:21
null
NONE
null
null
null
null
### System Info N/A ### Who can help? @ArthurZucker ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction In MllamaCrossAttentionDecoderLayer: ```py hidden_states, attn_weights = self.cross_attn( hidden_states=hidden_states, attention_mask=cross_attention_mask, cross_attention_states=cross_attention_states, past_key_value=past_key_value, output_attentions=output_attentions, cache_position=cache_position, **kwargs, ) hidden_states = residual + self.cross_attn_attn_gate.tanh() * hidden_states residual = hidden_states hidden_states = self.post_attention_layernorm(hidden_states) hidden_states = self.mlp(hidden_states) if full_text_row_masked_out_mask is not None: hidden_states = full_text_row_masked_out_mask[:, 0] * hidden_states # type: ignore hidden_states = residual + self.cross_attn_mlp_gate.tanh() * hidden_states ``` the mask is not applied until after the hidden_states (which contains the result of the cross attention) has been copied to the residual. This is at odds with how it's done in the reference implementation [here](https://github.com/meta-llama/llama-models/blob/01dc8ce46fecf06b639598f715efbb4ab981fb4c/models/llama3/multimodal/model.py#L912): ```py def forward( self, x: torch.Tensor, xattn_mask: torch.Tensor, full_text_row_masked_out_mask: torch.Tensor, xattn_cache: torch.Tensor, ) -> torch.Tensor: xq = F.linear(x, self.wq.weight) bsz, seqlen, _ = x.shape xq = xq.view(bsz, seqlen, self.n_local_heads, self.head_dim) xq = self.q_norm(xq) xq = xq.transpose(1, 2) xk, xv = xattn_cache output = F.scaled_dot_product_attention(xq, xk, xv, attn_mask=xattn_mask, dropout_p=0.0) output = output * full_text_row_masked_out_mask output = output.transpose(1, 2).contiguous().reshape(bsz, seqlen, -1) out = F.linear(output, self.wo.weight) out = reduce_from_tensor_model_parallel_region(out) return out ``` The implementation in MllamaCrossAttentionDecoderLayer is clearly incorrect, as it allows text tokens that should be unable to see any image token (due to preceding them) to observe all image tokens via the residual. This may affect any usage of llama vision models with images where an image isn't the very first token (finetuning or inference) in transformers. ### Expected behavior N/A
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39379/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39379/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/39417
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39417/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39417/comments
https://api.github.com/repos/huggingface/transformers/issues/39417/events
https://github.com/huggingface/transformers/issues/39417
3,231,637,194
I_kwDOCUB6oc7Ant7K
39,417
Option to tokenize messages one after the other
{ "login": "VivienCabannes", "id": 17603148, "node_id": "MDQ6VXNlcjE3NjAzMTQ4", "avatar_url": "https://avatars.githubusercontent.com/u/17603148?v=4", "gravatar_id": "", "url": "https://api.github.com/users/VivienCabannes", "html_url": "https://github.com/VivienCabannes", "followers_url": "https://api.github.com/users/VivienCabannes/followers", "following_url": "https://api.github.com/users/VivienCabannes/following{/other_user}", "gists_url": "https://api.github.com/users/VivienCabannes/gists{/gist_id}", "starred_url": "https://api.github.com/users/VivienCabannes/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/VivienCabannes/subscriptions", "organizations_url": "https://api.github.com/users/VivienCabannes/orgs", "repos_url": "https://api.github.com/users/VivienCabannes/repos", "events_url": "https://api.github.com/users/VivienCabannes/events{/privacy}", "received_events_url": "https://api.github.com/users/VivienCabannes/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 6900377379, "node_id": "LA_kwDOCUB6oc8AAAABm0tnIw", "url": "https://api.github.com/repos/huggingface/transformers/labels/Chat%20Template", "name": "Chat Template", "color": "EDDB5D", "default": false, "description": "" } ]
closed
false
null
[]
null
[]
2025-07-12T10:06:51
2025-08-06T16:47:34
2025-07-31T08:55:46
NONE
null
null
null
null
In many settings, we want to perform token-to-token interaction with an assistant, meaning that it would be beneficial to tokenize messages one after the other rather than all at once. Assume I have: ```python conversation = [ {"role": "system", "content": "You are a helpful assistant"}, {"role": "user", "content": "Hey there, how are you?"}, {"role": "assistant", "content": "Thank you for asking, I am doing well"}, {"role": "user", "content": "What's the weather like today?"}, {"role": "assistant", "content": "Today the weather is nice"}, ] ``` I would like to have a function `tokenizer.encode_message` so that executing: ```python tokens = [] for message in conversation: tokens += tokenizer.encode_message(message) ``` would result in the same behavior as executing: ```python tokens = tokenizer.apply_chat_template(conversation, tokenize=True) ``` At the moment, this does not seem possible (but maybe I am missing the right function to do so). So far, here is my workaround: ```python class MyDialogTokenizer: def __init__(self, tokenizer): self.tokenizer = tokenizer self._user_stub = {"role": "user", "content": "hello"} self._assistant_stub = {"role": "assistant", "content": "world"} self._len_prefix_assistant = len(self._encode_dialog([self._user_stub])) self._len_prefix_tool = len( self._encode_dialog([self._user_stub, self._assistant_stub]) ) def _encode_dialog( self, dialog: list[dict[str, str]], eom: bool = True ) -> list[int]: return self.tokenizer.apply_chat_template( dialog, tokenize=True, continue_final_message=not eom ) def encode_message(self, source: str, body: str, eom: bool = True) -> list[int]: msg_dict = {"role": source, "content": body} match source: case "system" | "user": return self._encode_dialog([msg_dict], eom=eom) case "assistant": return self._encode_dialog( [self._user_stub, msg_dict], eom=eom, )[self._len_prefix_assistant :] case _: return self._encode_dialog( [self._user_stub, self._assistant_stub, msg_dict], eom=eom, )[self._len_prefix_tool :] ``` Thank you in advance
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39417/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39417/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39378
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39378/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39378/comments
https://api.github.com/repos/huggingface/transformers/issues/39378/events
https://github.com/huggingface/transformers/pull/39378
3,225,006,520
PR_kwDOCUB6oc6em8iQ
39,378
Fix/siglip2 pooling comment
{ "login": "sameerajashyam", "id": 79054143, "node_id": "MDQ6VXNlcjc5MDU0MTQz", "avatar_url": "https://avatars.githubusercontent.com/u/79054143?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sameerajashyam", "html_url": "https://github.com/sameerajashyam", "followers_url": "https://api.github.com/users/sameerajashyam/followers", "following_url": "https://api.github.com/users/sameerajashyam/following{/other_user}", "gists_url": "https://api.github.com/users/sameerajashyam/gists{/gist_id}", "starred_url": "https://api.github.com/users/sameerajashyam/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sameerajashyam/subscriptions", "organizations_url": "https://api.github.com/users/sameerajashyam/orgs", "repos_url": "https://api.github.com/users/sameerajashyam/repos", "events_url": "https://api.github.com/users/sameerajashyam/events{/privacy}", "received_events_url": "https://api.github.com/users/sameerajashyam/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-12T07:45:30
2025-07-14T17:47:49
2025-07-14T17:47:20
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39378", "html_url": "https://github.com/huggingface/transformers/pull/39378", "diff_url": "https://github.com/huggingface/transformers/pull/39378.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39378.patch", "merged_at": "2025-07-14T17:47:20" }
# What does this PR do? This PR updates the Siglip2TextModel to simplify and clarify the logic for generating the pooled output. Specifically, it removes the ambiguous conditional logic and comment regarding the use of eos_token_id or padding tokens, and instead uses the last token's hidden state as the pooled representation. Additional changes: Adds a dedicated unit test in test_text_model.py to validate the forward pass behavior. Cleans up minor formatting issues in the test file (ruff compliant). <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface, @SunMarc and @qgallouedec - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "qubvel", "id": 31920396, "node_id": "MDQ6VXNlcjMxOTIwMzk2", "avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4", "gravatar_id": "", "url": "https://api.github.com/users/qubvel", "html_url": "https://github.com/qubvel", "followers_url": "https://api.github.com/users/qubvel/followers", "following_url": "https://api.github.com/users/qubvel/following{/other_user}", "gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}", "starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/qubvel/subscriptions", "organizations_url": "https://api.github.com/users/qubvel/orgs", "repos_url": "https://api.github.com/users/qubvel/repos", "events_url": "https://api.github.com/users/qubvel/events{/privacy}", "received_events_url": "https://api.github.com/users/qubvel/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39378/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39378/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39377
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39377/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39377/comments
https://api.github.com/repos/huggingface/transformers/issues/39377/events
https://github.com/huggingface/transformers/issues/39377
3,224,721,244
I_kwDOCUB6oc7ANVdc
39,377
FlashAttention2 support for GSAI-ML / LLaDA-8B-Instruct?
{ "login": "lbertge", "id": 4899332, "node_id": "MDQ6VXNlcjQ4OTkzMzI=", "avatar_url": "https://avatars.githubusercontent.com/u/4899332?v=4", "gravatar_id": "", "url": "https://api.github.com/users/lbertge", "html_url": "https://github.com/lbertge", "followers_url": "https://api.github.com/users/lbertge/followers", "following_url": "https://api.github.com/users/lbertge/following{/other_user}", "gists_url": "https://api.github.com/users/lbertge/gists{/gist_id}", "starred_url": "https://api.github.com/users/lbertge/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/lbertge/subscriptions", "organizations_url": "https://api.github.com/users/lbertge/orgs", "repos_url": "https://api.github.com/users/lbertge/repos", "events_url": "https://api.github.com/users/lbertge/events{/privacy}", "received_events_url": "https://api.github.com/users/lbertge/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-12T02:48:36
2025-08-19T08:03:26
2025-08-19T08:03:26
NONE
null
null
null
null
Hi there, I attempted to use Flash Attention 2 with this model, but it seems it isn't supported, based on this error: ``` ValueError: LLaDAModelLM does not support Flash Attention 2.0 yet. Please request to add support where the model is hosted, on its model hub page: https://huggingface.co/GSAI-ML/LLaDA-8B-Instruct/discussions/new or in the Transformers GitHub repo: https://github.com/huggingface/transformers/issues/new ``` Would it be possible to add support for this kind of model? Thank you for your time!
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers", "following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}", "gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}", "starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions", "organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs", "repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos", "events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}", "received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events", "type": "Bot", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39377/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39377/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39376
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39376/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39376/comments
https://api.github.com/repos/huggingface/transformers/issues/39376/events
https://github.com/huggingface/transformers/pull/39376
3,224,378,115
PR_kwDOCUB6oc6ek2fX
39,376
Set do_convert_rgb default in processing_gemma3.py
{ "login": "MohitIntel", "id": 49886570, "node_id": "MDQ6VXNlcjQ5ODg2NTcw", "avatar_url": "https://avatars.githubusercontent.com/u/49886570?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MohitIntel", "html_url": "https://github.com/MohitIntel", "followers_url": "https://api.github.com/users/MohitIntel/followers", "following_url": "https://api.github.com/users/MohitIntel/following{/other_user}", "gists_url": "https://api.github.com/users/MohitIntel/gists{/gist_id}", "starred_url": "https://api.github.com/users/MohitIntel/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MohitIntel/subscriptions", "organizations_url": "https://api.github.com/users/MohitIntel/orgs", "repos_url": "https://api.github.com/users/MohitIntel/repos", "events_url": "https://api.github.com/users/MohitIntel/events{/privacy}", "received_events_url": "https://api.github.com/users/MohitIntel/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T22:30:26
2025-07-16T23:46:45
2025-07-16T23:46:45
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39376", "html_url": "https://github.com/huggingface/transformers/pull/39376", "diff_url": "https://github.com/huggingface/transformers/pull/39376.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39376.patch", "merged_at": null }
# What does this PR do? Ensures the correct default value for the `do_convert_rgb` parameter is set; otherwise, lm-eval on this model results in "ValueError: Unable to infer channel dimension format". This patch is needed in conjunction with https://github.com/huggingface/transformers/pull/39375 for lm-eval to work on this model with the following command: ``` lm_eval --model vllm-vlm --model_args '{"pretrained":"google/gemma-3-27b-pt","tensor_parallel_size":1,"distributed_executor_backend":"mp","trust_remote_code":true,"max_model_len":4096,"use_v2_block_manager":true,"dtype":"bfloat16","max_num_seqs":128,"limit_mm_per_prompt":{"image":10}}' --tasks mmmu_val_economics --batch_size 128 --num_fewshot 8 ``` <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. <!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface, @SunMarc and @qgallouedec - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "MohitIntel", "id": 49886570, "node_id": "MDQ6VXNlcjQ5ODg2NTcw", "avatar_url": "https://avatars.githubusercontent.com/u/49886570?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MohitIntel", "html_url": "https://github.com/MohitIntel", "followers_url": "https://api.github.com/users/MohitIntel/followers", "following_url": "https://api.github.com/users/MohitIntel/following{/other_user}", "gists_url": "https://api.github.com/users/MohitIntel/gists{/gist_id}", "starred_url": "https://api.github.com/users/MohitIntel/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MohitIntel/subscriptions", "organizations_url": "https://api.github.com/users/MohitIntel/orgs", "repos_url": "https://api.github.com/users/MohitIntel/repos", "events_url": "https://api.github.com/users/MohitIntel/events{/privacy}", "received_events_url": "https://api.github.com/users/MohitIntel/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39376/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39376/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39375
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39375/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39375/comments
https://api.github.com/repos/huggingface/transformers/issues/39375/events
https://github.com/huggingface/transformers/pull/39375
3,224,372,681
PR_kwDOCUB6oc6ek1TF
39,375
Fix do_convert_rgb default value
{ "login": "MohitIntel", "id": 49886570, "node_id": "MDQ6VXNlcjQ5ODg2NTcw", "avatar_url": "https://avatars.githubusercontent.com/u/49886570?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MohitIntel", "html_url": "https://github.com/MohitIntel", "followers_url": "https://api.github.com/users/MohitIntel/followers", "following_url": "https://api.github.com/users/MohitIntel/following{/other_user}", "gists_url": "https://api.github.com/users/MohitIntel/gists{/gist_id}", "starred_url": "https://api.github.com/users/MohitIntel/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MohitIntel/subscriptions", "organizations_url": "https://api.github.com/users/MohitIntel/orgs", "repos_url": "https://api.github.com/users/MohitIntel/repos", "events_url": "https://api.github.com/users/MohitIntel/events{/privacy}", "received_events_url": "https://api.github.com/users/MohitIntel/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T22:26:39
2025-07-16T23:47:22
2025-07-16T23:47:21
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39375", "html_url": "https://github.com/huggingface/transformers/pull/39375", "diff_url": "https://github.com/huggingface/transformers/pull/39375.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39375.patch", "merged_at": null }
# What does this PR do? Sets the default value of the optional parameter `do_convert_rgb` to `True`, as described in the class definition at Line 81. Otherwise, lm-eval on this model results in "ValueError: Unable to infer channel dimension format." ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests?
{ "login": "MohitIntel", "id": 49886570, "node_id": "MDQ6VXNlcjQ5ODg2NTcw", "avatar_url": "https://avatars.githubusercontent.com/u/49886570?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MohitIntel", "html_url": "https://github.com/MohitIntel", "followers_url": "https://api.github.com/users/MohitIntel/followers", "following_url": "https://api.github.com/users/MohitIntel/following{/other_user}", "gists_url": "https://api.github.com/users/MohitIntel/gists{/gist_id}", "starred_url": "https://api.github.com/users/MohitIntel/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/MohitIntel/subscriptions", "organizations_url": "https://api.github.com/users/MohitIntel/orgs", "repos_url": "https://api.github.com/users/MohitIntel/repos", "events_url": "https://api.github.com/users/MohitIntel/events{/privacy}", "received_events_url": "https://api.github.com/users/MohitIntel/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39375/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39375/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39374
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39374/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39374/comments
https://api.github.com/repos/huggingface/transformers/issues/39374/events
https://github.com/huggingface/transformers/pull/39374
3,224,133,022
PR_kwDOCUB6oc6ekBdu
39,374
fix: ImageTextToTextPipeline handles user-defined generation_config
{ "login": "peteryschneider", "id": 4386038, "node_id": "MDQ6VXNlcjQzODYwMzg=", "avatar_url": "https://avatars.githubusercontent.com/u/4386038?v=4", "gravatar_id": "", "url": "https://api.github.com/users/peteryschneider", "html_url": "https://github.com/peteryschneider", "followers_url": "https://api.github.com/users/peteryschneider/followers", "following_url": "https://api.github.com/users/peteryschneider/following{/other_user}", "gists_url": "https://api.github.com/users/peteryschneider/gists{/gist_id}", "starred_url": "https://api.github.com/users/peteryschneider/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/peteryschneider/subscriptions", "organizations_url": "https://api.github.com/users/peteryschneider/orgs", "repos_url": "https://api.github.com/users/peteryschneider/repos", "events_url": "https://api.github.com/users/peteryschneider/events{/privacy}", "received_events_url": "https://api.github.com/users/peteryschneider/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T20:28:00
2025-07-17T13:23:30
2025-07-17T13:23:30
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39374", "html_url": "https://github.com/huggingface/transformers/pull/39374", "diff_url": "https://github.com/huggingface/transformers/pull/39374.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39374.patch", "merged_at": "2025-07-17T13:23:30" }
# What does this PR do? This allows an ImageTextToTextPipeline object to handle a user-defined generation_config passed to the pipeline, the same way a TextGenerationPipeline object does. Currently, it ignores these generation_config parameters set on pipeline object initialization, causing unexpected text generation behavior. @Rocketknight1
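For context, an illustrative sketch of the merge semantics being fixed (hypothetical names, not the actual `ImageTextToTextPipeline` internals): generation parameters supplied at pipeline construction should be layered over the defaults rather than discarded.

```python
# Hypothetical sketch: merge a user-supplied generation config over pipeline
# defaults. Names are illustrative, not transformers internals.
def merge_generation_config(defaults, user_config=None):
    merged = dict(defaults)   # start from the pipeline defaults
    if user_config:           # user-provided settings win on conflict
        merged.update(user_config)
    return merged

defaults = {"max_new_tokens": 20, "do_sample": False}
user = {"max_new_tokens": 256, "temperature": 0.7}
print(merge_generation_config(defaults, user))
# {'max_new_tokens': 256, 'do_sample': False, 'temperature': 0.7}
```

In the real pipeline the defaults come from the model's `generation_config`; the sketch only shows why ignoring `user_config` produces unexpected generation behavior.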
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39374/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39374/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39373
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39373/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39373/comments
https://api.github.com/repos/huggingface/transformers/issues/39373/events
https://github.com/huggingface/transformers/pull/39373
3,224,107,991
PR_kwDOCUB6oc6ej8HR
39,373
Remove residual quantization attribute from dequantized models
{ "login": "DWarez", "id": 10366381, "node_id": "MDQ6VXNlcjEwMzY2Mzgx", "avatar_url": "https://avatars.githubusercontent.com/u/10366381?v=4", "gravatar_id": "", "url": "https://api.github.com/users/DWarez", "html_url": "https://github.com/DWarez", "followers_url": "https://api.github.com/users/DWarez/followers", "following_url": "https://api.github.com/users/DWarez/following{/other_user}", "gists_url": "https://api.github.com/users/DWarez/gists{/gist_id}", "starred_url": "https://api.github.com/users/DWarez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/DWarez/subscriptions", "organizations_url": "https://api.github.com/users/DWarez/orgs", "repos_url": "https://api.github.com/users/DWarez/repos", "events_url": "https://api.github.com/users/DWarez/events{/privacy}", "received_events_url": "https://api.github.com/users/DWarez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T20:15:48
2025-07-15T15:16:10
2025-07-15T15:16:10
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39373", "html_url": "https://github.com/huggingface/transformers/pull/39373", "diff_url": "https://github.com/huggingface/transformers/pull/39373.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39373.patch", "merged_at": "2025-07-15T15:16:10" }
# What does this PR do? The `quantization_method` attribute was not being deleted during model dequantization, causing methods like `to()` to fail even when the operation should be valid for dequantized models. Fixes #39295 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. https://github.com/huggingface/transformers/issues/39295 - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. @SunMarc
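A self-contained toy reproduction of the failure mode and the fix (a dummy class, not the actual `PreTrainedModel` code):

```python
# Dummy model illustrating the bug: a leftover quantization attribute makes
# post-dequantization device moves fail. Names are illustrative only.
class DummyModel:
    def quantize(self):
        self.quantization_method = "bitsandbytes"

    def dequantize(self):
        # The fix: drop the residual attribute so the model no longer
        # "looks" quantized to methods such as .to().
        if hasattr(self, "quantization_method"):
            del self.quantization_method

    def to(self, device):
        # Mirrors the guard that rejects device moves on quantized models.
        if getattr(self, "quantization_method", None) is not None:
            raise ValueError("`.to` is not supported for quantized models.")
        return self

m = DummyModel()
m.quantize()
m.dequantize()
m.to("cpu")  # no longer raises once the residual attribute is removed
```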
{ "login": "SunMarc", "id": 57196510, "node_id": "MDQ6VXNlcjU3MTk2NTEw", "avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SunMarc", "html_url": "https://github.com/SunMarc", "followers_url": "https://api.github.com/users/SunMarc/followers", "following_url": "https://api.github.com/users/SunMarc/following{/other_user}", "gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}", "starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions", "organizations_url": "https://api.github.com/users/SunMarc/orgs", "repos_url": "https://api.github.com/users/SunMarc/repos", "events_url": "https://api.github.com/users/SunMarc/events{/privacy}", "received_events_url": "https://api.github.com/users/SunMarc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39373/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39373/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39372
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39372/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39372/comments
https://api.github.com/repos/huggingface/transformers/issues/39372/events
https://github.com/huggingface/transformers/pull/39372
3,223,716,803
PR_kwDOCUB6oc6eiluT
39,372
Change log level from warning to info for scheduled request logging in `ContinuousBatchProcessor`
{ "login": "qgallouedec", "id": 45557362, "node_id": "MDQ6VXNlcjQ1NTU3MzYy", "avatar_url": "https://avatars.githubusercontent.com/u/45557362?v=4", "gravatar_id": "", "url": "https://api.github.com/users/qgallouedec", "html_url": "https://github.com/qgallouedec", "followers_url": "https://api.github.com/users/qgallouedec/followers", "following_url": "https://api.github.com/users/qgallouedec/following{/other_user}", "gists_url": "https://api.github.com/users/qgallouedec/gists{/gist_id}", "starred_url": "https://api.github.com/users/qgallouedec/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/qgallouedec/subscriptions", "organizations_url": "https://api.github.com/users/qgallouedec/orgs", "repos_url": "https://api.github.com/users/qgallouedec/repos", "events_url": "https://api.github.com/users/qgallouedec/events{/privacy}", "received_events_url": "https://api.github.com/users/qgallouedec/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T17:35:15
2025-07-16T09:54:22
2025-07-16T09:54:21
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39372", "html_url": "https://github.com/huggingface/transformers/pull/39372", "diff_url": "https://github.com/huggingface/transformers/pull/39372.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39372.patch", "merged_at": "2025-07-16T09:54:21" }
# What does this PR do? Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39372/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39372/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39371
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39371/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39371/comments
https://api.github.com/repos/huggingface/transformers/issues/39371/events
https://github.com/huggingface/transformers/pull/39371
3,223,671,868
PR_kwDOCUB6oc6eibyU
39,371
fix(siglip2): clarify text pooling logic and remove misleading EOS co…
{ "login": "sameerajashyam", "id": 79054143, "node_id": "MDQ6VXNlcjc5MDU0MTQz", "avatar_url": "https://avatars.githubusercontent.com/u/79054143?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sameerajashyam", "html_url": "https://github.com/sameerajashyam", "followers_url": "https://api.github.com/users/sameerajashyam/followers", "following_url": "https://api.github.com/users/sameerajashyam/following{/other_user}", "gists_url": "https://api.github.com/users/sameerajashyam/gists{/gist_id}", "starred_url": "https://api.github.com/users/sameerajashyam/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sameerajashyam/subscriptions", "organizations_url": "https://api.github.com/users/sameerajashyam/orgs", "repos_url": "https://api.github.com/users/sameerajashyam/repos", "events_url": "https://api.github.com/users/sameerajashyam/events{/privacy}", "received_events_url": "https://api.github.com/users/sameerajashyam/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T17:18:10
2025-08-19T16:14:59
2025-08-19T16:14:59
CONTRIBUTOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39371", "html_url": "https://github.com/huggingface/transformers/pull/39371", "diff_url": "https://github.com/huggingface/transformers/pull/39371.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39371.patch", "merged_at": null }
…mment # What does this PR do? Fixes # (issue) ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
{ "login": "sameerajashyam", "id": 79054143, "node_id": "MDQ6VXNlcjc5MDU0MTQz", "avatar_url": "https://avatars.githubusercontent.com/u/79054143?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sameerajashyam", "html_url": "https://github.com/sameerajashyam", "followers_url": "https://api.github.com/users/sameerajashyam/followers", "following_url": "https://api.github.com/users/sameerajashyam/following{/other_user}", "gists_url": "https://api.github.com/users/sameerajashyam/gists{/gist_id}", "starred_url": "https://api.github.com/users/sameerajashyam/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sameerajashyam/subscriptions", "organizations_url": "https://api.github.com/users/sameerajashyam/orgs", "repos_url": "https://api.github.com/users/sameerajashyam/repos", "events_url": "https://api.github.com/users/sameerajashyam/events{/privacy}", "received_events_url": "https://api.github.com/users/sameerajashyam/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39371/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39371/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39370
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39370/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39370/comments
https://api.github.com/repos/huggingface/transformers/issues/39370/events
https://github.com/huggingface/transformers/pull/39370
3,223,667,136
PR_kwDOCUB6oc6eiatd
39,370
Support FP8 accelerate config
{ "login": "djsaunde", "id": 1245942, "node_id": "MDQ6VXNlcjEyNDU5NDI=", "avatar_url": "https://avatars.githubusercontent.com/u/1245942?v=4", "gravatar_id": "", "url": "https://api.github.com/users/djsaunde", "html_url": "https://github.com/djsaunde", "followers_url": "https://api.github.com/users/djsaunde/followers", "following_url": "https://api.github.com/users/djsaunde/following{/other_user}", "gists_url": "https://api.github.com/users/djsaunde/gists{/gist_id}", "starred_url": "https://api.github.com/users/djsaunde/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/djsaunde/subscriptions", "organizations_url": "https://api.github.com/users/djsaunde/orgs", "repos_url": "https://api.github.com/users/djsaunde/repos", "events_url": "https://api.github.com/users/djsaunde/events{/privacy}", "received_events_url": "https://api.github.com/users/djsaunde/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T17:16:47
2025-08-07T16:48:27
2025-08-07T16:48:27
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39370", "html_url": "https://github.com/huggingface/transformers/pull/39370", "diff_url": "https://github.com/huggingface/transformers/pull/39370.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39370.patch", "merged_at": null }
# What does this PR do? Adds config options for mixed precision and fp8 config, which are supported in accelerate's Accelerator object. Also parses the `config` field for the torchao ("AO") fp8 backend from dictionary values into the required `Float8LinearConfig` object. This also requires a simple gating change to accelerate, which is already covered by an existing PR: https://github.com/huggingface/accelerate/pull/3677/files#diff-2d7515874eaecac2687c7fc1a9c720be53f802bf14b4c3dcebe14ad443d075dcR501-R505. ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? - @S1ro1 - @SunMarc - @winglian ## TODO - [ ] Add docs (?) - [ ] Add tests (?) - [ ] Add quick benchmarks ## Note I've tested this downstream in [`axolotl`](https://github.com/axolotl-ai-cloud/axolotl) and find that it works. Performance benefits can be had for certain models when setting `torch_compile: true`; it's not yet clear to me which models and hyperparameter settings benefit. I'll add some quick numbers here to demonstrate this.
{ "login": "djsaunde", "id": 1245942, "node_id": "MDQ6VXNlcjEyNDU5NDI=", "avatar_url": "https://avatars.githubusercontent.com/u/1245942?v=4", "gravatar_id": "", "url": "https://api.github.com/users/djsaunde", "html_url": "https://github.com/djsaunde", "followers_url": "https://api.github.com/users/djsaunde/followers", "following_url": "https://api.github.com/users/djsaunde/following{/other_user}", "gists_url": "https://api.github.com/users/djsaunde/gists{/gist_id}", "starred_url": "https://api.github.com/users/djsaunde/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/djsaunde/subscriptions", "organizations_url": "https://api.github.com/users/djsaunde/orgs", "repos_url": "https://api.github.com/users/djsaunde/repos", "events_url": "https://api.github.com/users/djsaunde/events{/privacy}", "received_events_url": "https://api.github.com/users/djsaunde/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39370/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39370/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39369
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39369/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39369/comments
https://api.github.com/repos/huggingface/transformers/issues/39369/events
https://github.com/huggingface/transformers/pull/39369
3,223,488,821
PR_kwDOCUB6oc6ehyjJ
39,369
Fix `fix_and_overwrite` mode of `utils/check_docstring.py`
{ "login": "manueldeprada", "id": 6536835, "node_id": "MDQ6VXNlcjY1MzY4MzU=", "avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4", "gravatar_id": "", "url": "https://api.github.com/users/manueldeprada", "html_url": "https://github.com/manueldeprada", "followers_url": "https://api.github.com/users/manueldeprada/followers", "following_url": "https://api.github.com/users/manueldeprada/following{/other_user}", "gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}", "starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions", "organizations_url": "https://api.github.com/users/manueldeprada/orgs", "repos_url": "https://api.github.com/users/manueldeprada/repos", "events_url": "https://api.github.com/users/manueldeprada/events{/privacy}", "received_events_url": "https://api.github.com/users/manueldeprada/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T16:20:16
2025-08-06T17:37:26
2025-08-06T17:37:25
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39369", "html_url": "https://github.com/huggingface/transformers/pull/39369", "diff_url": "https://github.com/huggingface/transformers/pull/39369.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39369.patch", "merged_at": "2025-08-06T17:37:25" }
Fixed it @ydshieh :) The script was failing to apply fixes because of a mismatch in how it reads the docstring for comparison versus how it reads it for replacement. - To detect a mismatch, the script reads the docstring from the `obj.__doc__` attribute. Python automatically un-indents the string provided by `__doc__`. - To apply a fix, the script reads the code from the source file using `inspect.getsourcelines()`, which preserves the original indentation. The `fix_docstring` function would first try to locate the argument block within the raw source code. To ensure it had found the right place, it would compare the text from the source (with indentation) against the text from `__doc__` (without indentation). This comparison would then fail. The check we talked about is probably not needed anymore; should we remove it, or leave it as-is to be extra cautious?
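The mismatch is easy to reproduce with the standard library alone: `inspect.getdoc` normalizes indentation the way the comparison side does, while `inspect.getsource` preserves the file's original indentation:

```python
import inspect


def example(x):
    """Summary line.

    Args:
        x: some argument.
    """


# What the comparison side sees: common leading whitespace stripped.
normalized = inspect.getdoc(example)
# What a source-based fixer sees: original file indentation kept.
source = inspect.getsource(example)

print(repr(normalized.splitlines()[2]))  # 'Args:'
print("    Args:" in source)             # True
```

So a byte-for-byte comparison of the dedented docstring against the raw source text can never match for an indented method docstring.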
{ "login": "manueldeprada", "id": 6536835, "node_id": "MDQ6VXNlcjY1MzY4MzU=", "avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4", "gravatar_id": "", "url": "https://api.github.com/users/manueldeprada", "html_url": "https://github.com/manueldeprada", "followers_url": "https://api.github.com/users/manueldeprada/followers", "following_url": "https://api.github.com/users/manueldeprada/following{/other_user}", "gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}", "starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions", "organizations_url": "https://api.github.com/users/manueldeprada/orgs", "repos_url": "https://api.github.com/users/manueldeprada/repos", "events_url": "https://api.github.com/users/manueldeprada/events{/privacy}", "received_events_url": "https://api.github.com/users/manueldeprada/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39369/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39369/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39368
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39368/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39368/comments
https://api.github.com/repos/huggingface/transformers/issues/39368/events
https://github.com/huggingface/transformers/pull/39368
3,223,431,818
PR_kwDOCUB6oc6ehl44
39,368
Add standardized model card for facebook/data2vec-audio-base-960h
{ "login": "boy397", "id": 154273578, "node_id": "U_kgDOCTIHKg", "avatar_url": "https://avatars.githubusercontent.com/u/154273578?v=4", "gravatar_id": "", "url": "https://api.github.com/users/boy397", "html_url": "https://github.com/boy397", "followers_url": "https://api.github.com/users/boy397/followers", "following_url": "https://api.github.com/users/boy397/following{/other_user}", "gists_url": "https://api.github.com/users/boy397/gists{/gist_id}", "starred_url": "https://api.github.com/users/boy397/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/boy397/subscriptions", "organizations_url": "https://api.github.com/users/boy397/orgs", "repos_url": "https://api.github.com/users/boy397/repos", "events_url": "https://api.github.com/users/boy397/events{/privacy}", "received_events_url": "https://api.github.com/users/boy397/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T16:01:33
2025-09-18T19:36:01
2025-09-18T17:27:18
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39368", "html_url": "https://github.com/huggingface/transformers/pull/39368", "diff_url": "https://github.com/huggingface/transformers/pull/39368.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39368.patch", "merged_at": null }
This PR adds a standardized model card for [`facebook/data2vec-audio-base-960h`](https://huggingface.co/facebook/data2vec-audio-base-960h) as part of issue [#36979](https://github.com/huggingface/transformers/issues/36979). ### What's included: - Friendly, accessible model description - Pipeline usage example for speech recognition - AutoModel + AutoProcessor example - Info about quantization (if applicable) - Uses the latest formatting standards for model cards This model card will improve discoverability and usability for the speech model by aligning with Hugging Face documentation standards.
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39368/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39368/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39367
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39367/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39367/comments
https://api.github.com/repos/huggingface/transformers/issues/39367/events
https://github.com/huggingface/transformers/issues/39367
3,223,378,233
I_kwDOCUB6oc7AINk5
39,367
Adding api key to `transformers serve`
{ "login": "xhluca", "id": 21180505, "node_id": "MDQ6VXNlcjIxMTgwNTA1", "avatar_url": "https://avatars.githubusercontent.com/u/21180505?v=4", "gravatar_id": "", "url": "https://api.github.com/users/xhluca", "html_url": "https://github.com/xhluca", "followers_url": "https://api.github.com/users/xhluca/followers", "following_url": "https://api.github.com/users/xhluca/following{/other_user}", "gists_url": "https://api.github.com/users/xhluca/gists{/gist_id}", "starred_url": "https://api.github.com/users/xhluca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/xhluca/subscriptions", "organizations_url": "https://api.github.com/users/xhluca/orgs", "repos_url": "https://api.github.com/users/xhluca/repos", "events_url": "https://api.github.com/users/xhluca/events{/privacy}", "received_events_url": "https://api.github.com/users/xhluca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
[]
2025-07-11T15:45:30
2025-08-17T18:42:16
null
CONTRIBUTOR
null
null
null
null
### Feature request Enable setting an API key when serving, in a way that's compatible with the OpenAI API. ### Motivation This would be a great feature for making our endpoint accessible over the internet. ### Your contribution I'm not sure where to start to make a PR for this. Any pointers?
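One possible shape for such a check, sketched as a plain function rather than actual `transformers serve` code: OpenAI-compatible clients send the key as an `Authorization: Bearer <key>` header, so the server only needs to compare that header against the configured key:

```python
import hmac


def check_bearer_token(authorization_header, expected_key):
    """Validate an `Authorization: Bearer <key>` header the way
    OpenAI-compatible clients send it. Uses a constant-time comparison
    to avoid leaking the key through timing differences."""
    if expected_key is None:
        return True  # no key configured: server stays open
    if not authorization_header or not authorization_header.startswith("Bearer "):
        return False
    supplied = authorization_header[len("Bearer "):]
    return hmac.compare_digest(supplied, expected_key)


print(check_bearer_token("Bearer secret123", "secret123"))  # True
print(check_bearer_token("Bearer wrong", "secret123"))      # False
```

In a FastAPI-style server this would typically run as a request dependency that returns 401 when the check fails.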
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39367/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39367/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/39366
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39366/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39366/comments
https://api.github.com/repos/huggingface/transformers/issues/39366/events
https://github.com/huggingface/transformers/issues/39366
3,223,265,823
I_kwDOCUB6oc7AHyIf
39,366
RuntimeError when loading llmcompressor W8A8 quantized model: int8 dtype in weight initialization
{ "login": "AdelineXinyi", "id": 89888667, "node_id": "MDQ6VXNlcjg5ODg4NjY3", "avatar_url": "https://avatars.githubusercontent.com/u/89888667?v=4", "gravatar_id": "", "url": "https://api.github.com/users/AdelineXinyi", "html_url": "https://github.com/AdelineXinyi", "followers_url": "https://api.github.com/users/AdelineXinyi/followers", "following_url": "https://api.github.com/users/AdelineXinyi/following{/other_user}", "gists_url": "https://api.github.com/users/AdelineXinyi/gists{/gist_id}", "starred_url": "https://api.github.com/users/AdelineXinyi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/AdelineXinyi/subscriptions", "organizations_url": "https://api.github.com/users/AdelineXinyi/orgs", "repos_url": "https://api.github.com/users/AdelineXinyi/repos", "events_url": "https://api.github.com/users/AdelineXinyi/events{/privacy}", "received_events_url": "https://api.github.com/users/AdelineXinyi/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 1990918270, "node_id": "MDU6TGFiZWwxOTkwOTE4Mjcw", "url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20First%20Issue", "name": "Good First Issue", "color": "bbf794", "default": false, "description": "" } ]
open
false
null
[]
null
[]
2025-07-11T15:15:09
2025-10-18T00:04:57
null
NONE
null
null
null
null
I'm trying to load the quantized model `RedHatAI/Qwen2.5-VL-7B-Instruct-quantized.w8a8` but encountering a dtype compatibility issue during model initialization. The model appears to be quantized using `llmcompressor` with W8A8 quantization scheme. **Note**: I need to load this model without vLLM because I may need to add custom hooks for my research, so I'm looking for a direct loading method using transformers/llmcompressor. ## Error Message ```python RuntimeError: expected a floating-point or complex dtype, but got dtype=torch.int8 ``` **Full Stack Trace:** ```python File "/transformers/models/qwen2_5_vl/modeling_qwen2_5_vl.py", line 366, in _init_weights module.weight.data.normal_(mean=0.0, std=std) File "/torch/_refs/__init__.py", line 6214, in normal_ return normal(mean, std, self.shape, out=self, generator=generator) ... RuntimeError: expected a floating-point or complex dtype, but got dtype=torch.int8 ``` ## Traceback The error occurs during model weight initialization where transformers tries to call `normal_()` on int8 tensors. The `normal_()` function in PyTorch only works with floating-point tensors, but the quantized model contains int8 weights. **Specific failure point:** - File: `modeling_qwen2_5_vl.py`, line 366 - Function: `_init_weights()` - Operation: `module.weight.data.normal_(mean=0.0, std=std)` - Issue: Trying to apply normal distribution to int8 tensors ## Model Information Based on the model's `config.json`: - **Quantization method**: `compressed-tensors` - **Format**: `int-quantized` - **Scheme**: W8A8 (8-bit weights and activations) - **Base model**: `Qwen/Qwen2.5-VL-7B-Instruct` - **Compression ratio**: ~1.2x - **Ignored layers**: All visual layers (`visual.blocks.*`, `visual.merger.*`, `lm_head`) ## What I've Tried ### 1. 
llmcompressor methods: ```python # Method 1: TraceableQwen2_5_VLForConditionalGeneration from llmcompressor.transformers.tracing import TraceableQwen2_5_VLForConditionalGeneration model = TraceableQwen2_5_VLForConditionalGeneration.from_pretrained( model_path, device_map="auto", torch_dtype="auto", trust_remote_code=True ) # Method 2: SparseAutoModelForCausalLM from llmcompressor.transformers import SparseAutoModelForCausalLM model = SparseAutoModelForCausalLM.from_pretrained( model_path, device_map="auto", torch_dtype="auto", trust_remote_code=True ) ``` ### 2. Standard transformers methods: ```python # Method 3: Various dtype configurations model = Qwen2_5_VLForConditionalGeneration.from_pretrained( model_path, torch_dtype=torch.bfloat16, # Also tried: torch.float16, "auto", None trust_remote_code=True, device_map="auto" ) # Method 4: AutoModelForCausalLM model = AutoModelForCausalLM.from_pretrained( model_path, trust_remote_code=True, torch_dtype="auto" ) ``` **All methods fail at the same weight initialization step, so I wonder whether the model should be loaded with `_fast_init=False` or other special parameters.** ## Additional Observations 1. **Warning about ignored layers**: The loader warns about missing visual layers, but this seems expected since they were ignored during quantization 2. **Model files exist**: The quantized model directory contains the expected `.safetensors` files and configuration 3. 
**Original model works**: The base `Qwen/Qwen2.5-VL-7B-Instruct` loads and works perfectly ## Environment - **Python**: 3.10 - **PyTorch**: 2.7.0+cu126 - **Transformers**: 4.52.4 - **LLMCompressor**: 0.6.0 - **Compressed-tensors**: 0.10.2 This model was likely created using llmcompressor's oneshot quantization: ```python from llmcompressor.modifiers.quantization import GPTQModifier from llmcompressor.transformers import oneshot recipe = [ GPTQModifier( targets="Linear", scheme="W8A8", sequential_targets=["Qwen2_5_VLDecoderLayer"], ignore=["lm_head", "re:visual.*"], ), ] ``` If this is more of an llmcompressor-specific model loading issue rather than a transformers compatibility issue, please let me know and I'll file this issue in the llmcompressor repository instead.
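One plausible fix, sketched here with NumPy for self-containedness (in `transformers` the equivalent guard would wrap the `normal_()` call inside `_init_weights`): skip random initialization for weights whose dtype is not floating-point, since quantized int8 weights must come from the checkpoint untouched:

```python
import numpy as np


def init_weight_(weight, std=0.02):
    """Sketch of a dtype-guarded init: only floating-point weights get
    re-initialized; quantized (e.g. int8) weights are left untouched,
    which is what a loader should do for W8A8 checkpoints."""
    if not np.issubdtype(weight.dtype, np.floating):
        return weight  # skip: drawing from a normal dist needs a float dtype
    weight[...] = np.random.normal(0.0, std, size=weight.shape)
    return weight


fp_w = np.zeros((2, 2), dtype=np.float32)
int8_w = np.arange(4, dtype=np.int8).reshape(2, 2)

init_weight_(fp_w)    # re-initialized in place
init_weight_(int8_w)  # left as-is

print(int8_w.tolist())  # [[0, 1], [2, 3]]
```

Whether the proper fix lives in `_init_weights` or in the compressed-tensors loading path is exactly the transformers-vs-llmcompressor question the issue raises.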
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39366/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39366/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/39365
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39365/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39365/comments
https://api.github.com/repos/huggingface/transformers/issues/39365/events
https://github.com/huggingface/transformers/issues/39365
3,223,215,409
I_kwDOCUB6oc7AHl0x
39,365
Bug in modeling_bart.eager_attention_forward
{ "login": "xadupre", "id": 22452781, "node_id": "MDQ6VXNlcjIyNDUyNzgx", "avatar_url": "https://avatars.githubusercontent.com/u/22452781?v=4", "gravatar_id": "", "url": "https://api.github.com/users/xadupre", "html_url": "https://github.com/xadupre", "followers_url": "https://api.github.com/users/xadupre/followers", "following_url": "https://api.github.com/users/xadupre/following{/other_user}", "gists_url": "https://api.github.com/users/xadupre/gists{/gist_id}", "starred_url": "https://api.github.com/users/xadupre/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/xadupre/subscriptions", "organizations_url": "https://api.github.com/users/xadupre/orgs", "repos_url": "https://api.github.com/users/xadupre/repos", "events_url": "https://api.github.com/users/xadupre/events{/privacy}", "received_events_url": "https://api.github.com/users/xadupre/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 2796628563, "node_id": "MDU6TGFiZWwyNzk2NjI4NTYz", "url": "https://api.github.com/repos/huggingface/transformers/labels/WIP", "name": "WIP", "color": "234C99", "default": false, "description": "Label your PR/Issue with WIP for some long outstanding Issues/PRs that are work in progress" }, { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
open
false
{ "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/followers", "following_url": "https://api.github.com/users/vasqu/following{/other_user}", "gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}", "starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vasqu/subscriptions", "organizations_url": "https://api.github.com/users/vasqu/orgs", "repos_url": "https://api.github.com/users/vasqu/repos", "events_url": "https://api.github.com/users/vasqu/events{/privacy}", "received_events_url": "https://api.github.com/users/vasqu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "vasqu", "id": 73884904, "node_id": "MDQ6VXNlcjczODg0OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vasqu", "html_url": "https://github.com/vasqu", "followers_url": "https://api.github.com/users/vasqu/followers", "following_url": "https://api.github.com/users/vasqu/following{/other_user}", "gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}", "starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vasqu/subscriptions", "organizations_url": "https://api.github.com/users/vasqu/orgs", "repos_url": "https://api.github.com/users/vasqu/repos", "events_url": "https://api.github.com/users/vasqu/events{/privacy}", "received_events_url": "https://api.github.com/users/vasqu/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
[]
2025-07-11T14:59:42
2025-09-05T09:27:23
null
CONTRIBUTOR
null
null
null
null
### System Info In [sdpa_attention_forward](https://github.com/huggingface/transformers/blob/main/src/transformers/integrations/sdpa_attention.py#L44), we have this: ```python if attention_mask is not None and attention_mask.ndim == 4: attention_mask = attention_mask[:, :, :, : key.shape[-2]] ``` In [eager_attention_forward](https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/modeling_bart.py#L133), these two lines are not there and it may fail there: ```python if attention_mask is not None: attn_weights = attn_weights + attention_mask ``` I was wondering if that was intended. sdpa seems to be the default implementation. ### Who can help? @ArthurZucker ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction Switch the default `attn_implementation` to eager. ### Expected behavior Both implementations should work.
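The shape logic behind the missing lines can be sketched with NumPy (the real code operates on torch tensors): without the slice, a cached 4-D mask whose last dimension is longer than the current key length cannot be added to the attention weights:

```python
import numpy as np


def add_attention_mask(attn_weights, attention_mask):
    """Sketch of the guard the SDPA path applies and the eager BART path
    lacks: slice the 4-D mask to the current key length before adding."""
    if attention_mask is not None and attention_mask.ndim == 4:
        attention_mask = attention_mask[:, :, :, : attn_weights.shape[-1]]
    return attn_weights + attention_mask


# attn_weights: (batch, heads, q_len, k_len); mask cached past k_len = 5
weights = np.zeros((1, 2, 3, 5))
mask = np.zeros((1, 1, 3, 8))  # longer than the current key length

out = add_attention_mask(weights, mask)
print(out.shape)  # (1, 2, 3, 5)
```

Adding the unsliced `(1, 1, 3, 8)` mask directly to `(1, 2, 3, 5)` weights raises a broadcasting error, which is the failure mode the issue anticipates for the eager path.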
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39365/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39365/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/39364
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39364/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39364/comments
https://api.github.com/repos/huggingface/transformers/issues/39364/events
https://github.com/huggingface/transformers/pull/39364
3,223,210,173
PR_kwDOCUB6oc6eg0mL
39,364
Fix inconsistency in SeamlessM4T and SeamlessM4Tv2 docs
{ "login": "clinty", "id": 223406, "node_id": "MDQ6VXNlcjIyMzQwNg==", "avatar_url": "https://avatars.githubusercontent.com/u/223406?v=4", "gravatar_id": "", "url": "https://api.github.com/users/clinty", "html_url": "https://github.com/clinty", "followers_url": "https://api.github.com/users/clinty/followers", "following_url": "https://api.github.com/users/clinty/following{/other_user}", "gists_url": "https://api.github.com/users/clinty/gists{/gist_id}", "starred_url": "https://api.github.com/users/clinty/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/clinty/subscriptions", "organizations_url": "https://api.github.com/users/clinty/orgs", "repos_url": "https://api.github.com/users/clinty/repos", "events_url": "https://api.github.com/users/clinty/events{/privacy}", "received_events_url": "https://api.github.com/users/clinty/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T14:57:59
2025-09-08T19:28:54
2025-09-08T17:01:44
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39364", "html_url": "https://github.com/huggingface/transformers/pull/39364", "diff_url": "https://github.com/huggingface/transformers/pull/39364.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39364.patch", "merged_at": "2025-09-08T17:01:44" }
This should make the documentation consistent both with itself and with the code with respect to the interaction between `generate_speech` and `return_intermediate_token_ids`. @stevhliu @thomwolf
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39364/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39364/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39363
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39363/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39363/comments
https://api.github.com/repos/huggingface/transformers/issues/39363/events
https://github.com/huggingface/transformers/pull/39363
3,223,168,858
PR_kwDOCUB6oc6egrkZ
39,363
Fix overriding Fast Image/Video Processor instance attributes affecting other instances
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T14:43:04
2025-07-12T23:39:07
2025-07-12T23:39:06
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39363", "html_url": "https://github.com/huggingface/transformers/pull/39363", "diff_url": "https://github.com/huggingface/transformers/pull/39363.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39363.patch", "merged_at": "2025-07-12T23:39:06" }
# What does this PR do? Fix an issue raised here https://github.com/huggingface/transformers/pull/39172#issuecomment-3052873789 After internal discussions with @ydshieh , we found that the issue is not due to class attributes, but to lists/dicts being passed around without being copied. In fact the same issue is also present in slow image processors, but we don't fix them here, as they all have their own attributes assignment logic, and are to be deprecated. To reproduce the issue: ```python >>> o = CLIPImageProcessorFast() >>> o.image_mean[0] 0.48145466 >>> CLIPImageProcessorFast.image_mean [0.48145466, 0.4578275, 0.40821073] >>> o.image_mean[0] = 1 >>> CLIPImageProcessorFast.image_mean [1, 0.4578275, 0.40821073] >>> ```
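The underlying Python pitfall, and the copy-on-init fix the PR applies, in miniature (the class names here are illustrative stand-ins):

```python
import copy


class BuggyProcessor:
    # Class-level mutable default shared by all instances (mirrors how
    # fast image processors declare `image_mean` on the class).
    image_mean = [0.48145466, 0.4578275, 0.40821073]


class FixedProcessor:
    image_mean = [0.48145466, 0.4578275, 0.40821073]

    def __init__(self):
        # Copy mutable defaults onto the instance so in-place edits
        # cannot leak back into the class (and thus other instances).
        self.image_mean = copy.deepcopy(type(self).image_mean)


buggy = BuggyProcessor()
buggy.image_mean[0] = 1           # mutates the shared class-level list!
print(BuggyProcessor.image_mean[0])   # 1

fixed = FixedProcessor()
fixed.image_mean[0] = 1               # only this instance changes
print(FixedProcessor.image_mean[0])   # 0.48145466
```

Attribute *assignment* (`o.image_mean = [...]`) would also be safe; the bug only bites on in-place mutation of the shared list or dict, which is why it went unnoticed.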
{ "login": "yonigozlan", "id": 74535834, "node_id": "MDQ6VXNlcjc0NTM1ODM0", "avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yonigozlan", "html_url": "https://github.com/yonigozlan", "followers_url": "https://api.github.com/users/yonigozlan/followers", "following_url": "https://api.github.com/users/yonigozlan/following{/other_user}", "gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}", "starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions", "organizations_url": "https://api.github.com/users/yonigozlan/orgs", "repos_url": "https://api.github.com/users/yonigozlan/repos", "events_url": "https://api.github.com/users/yonigozlan/events{/privacy}", "received_events_url": "https://api.github.com/users/yonigozlan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39363/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39363/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39362
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39362/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39362/comments
https://api.github.com/repos/huggingface/transformers/issues/39362/events
https://github.com/huggingface/transformers/pull/39362
3,223,028,174
PR_kwDOCUB6oc6egMw5
39,362
Update `GemmaIntegrationTest::test_model_2b_bf16_dola`
{ "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/followers", "following_url": "https://api.github.com/users/ydshieh/following{/other_user}", "gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}", "starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions", "organizations_url": "https://api.github.com/users/ydshieh/orgs", "repos_url": "https://api.github.com/users/ydshieh/repos", "events_url": "https://api.github.com/users/ydshieh/events{/privacy}", "received_events_url": "https://api.github.com/users/ydshieh/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T13:55:47
2025-07-17T13:06:26
2025-07-17T13:06:24
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39362", "html_url": "https://github.com/huggingface/transformers/pull/39362", "diff_url": "https://github.com/huggingface/transformers/pull/39362.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39362.patch", "merged_at": "2025-07-17T13:06:24" }
# What does this PR do? The value changed after #39120 but remains reasonable. I'm not sure of the root cause of the change, though.
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/followers", "following_url": "https://api.github.com/users/gante/following{/other_user}", "gists_url": "https://api.github.com/users/gante/gists{/gist_id}", "starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gante/subscriptions", "organizations_url": "https://api.github.com/users/gante/orgs", "repos_url": "https://api.github.com/users/gante/repos", "events_url": "https://api.github.com/users/gante/events{/privacy}", "received_events_url": "https://api.github.com/users/gante/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39362/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39362/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39361
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39361/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39361/comments
https://api.github.com/repos/huggingface/transformers/issues/39361/events
https://github.com/huggingface/transformers/pull/39361
3,222,990,955
PR_kwDOCUB6oc6egEi7
39,361
update cb TP
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T13:44:19
2025-07-11T14:07:26
2025-07-11T13:54:25
COLLABORATOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39361", "html_url": "https://github.com/huggingface/transformers/pull/39361", "diff_url": "https://github.com/huggingface/transformers/pull/39361.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39361.patch", "merged_at": "2025-07-11T13:54:25" }
# What does this PR do? Following up on #39164: lazy init is not really an option for now, so for CB we need to make sure we init the cache properly.
{ "login": "ArthurZucker", "id": 48595927, "node_id": "MDQ6VXNlcjQ4NTk1OTI3", "avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ArthurZucker", "html_url": "https://github.com/ArthurZucker", "followers_url": "https://api.github.com/users/ArthurZucker/followers", "following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}", "gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}", "starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions", "organizations_url": "https://api.github.com/users/ArthurZucker/orgs", "repos_url": "https://api.github.com/users/ArthurZucker/repos", "events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}", "received_events_url": "https://api.github.com/users/ArthurZucker/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39361/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39361/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39360
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39360/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39360/comments
https://api.github.com/repos/huggingface/transformers/issues/39360/events
https://github.com/huggingface/transformers/pull/39360
3,222,955,104
PR_kwDOCUB6oc6ef8iX
39,360
Fix link for testpypi
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T13:33:25
2025-07-11T13:46:24
2025-07-11T13:34:01
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39360", "html_url": "https://github.com/huggingface/transformers/pull/39360", "diff_url": "https://github.com/huggingface/transformers/pull/39360.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39360.patch", "merged_at": "2025-07-11T13:34:01" }
# What does this PR do? Faced the link error twice -> time to update it on main.
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39360/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39360/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39359
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39359/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39359/comments
https://api.github.com/repos/huggingface/transformers/issues/39359/events
https://github.com/huggingface/transformers/issues/39359
3,222,862,781
I_kwDOCUB6oc7AGPu9
39,359
vLLM v0.9.2: Qwen2.5-Omni-7B-AWQ fails to load with transformers 4.53.1 (requires 4.52.4)
{ "login": "GreenerZ", "id": 30285532, "node_id": "MDQ6VXNlcjMwMjg1NTMy", "avatar_url": "https://avatars.githubusercontent.com/u/30285532?v=4", "gravatar_id": "", "url": "https://api.github.com/users/GreenerZ", "html_url": "https://github.com/GreenerZ", "followers_url": "https://api.github.com/users/GreenerZ/followers", "following_url": "https://api.github.com/users/GreenerZ/following{/other_user}", "gists_url": "https://api.github.com/users/GreenerZ/gists{/gist_id}", "starred_url": "https://api.github.com/users/GreenerZ/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/GreenerZ/subscriptions", "organizations_url": "https://api.github.com/users/GreenerZ/orgs", "repos_url": "https://api.github.com/users/GreenerZ/repos", "events_url": "https://api.github.com/users/GreenerZ/events{/privacy}", "received_events_url": "https://api.github.com/users/GreenerZ/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-07-11T13:06:41
2025-07-11T13:07:57
2025-07-11T13:07:57
NONE
null
null
null
null
### System Info - `transformers` version: 4.53.1 - Platform: Linux-6.8.0-62-generic-x86_64-with-glibc2.35 - Python version: 3.12.11 - Huggingface_hub version: 0.33.2 - Safetensors version: 0.5.3 - Accelerate version: 1.8.1 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (accelerator?): 2.7.0+cu128 (CUDA) - Tensorflow version (GPU?): not installed (NA) - Flax version (CPU?/GPU?/TPU?): not installed (NA) - Jax version: not installed - JaxLib version: not installed - Using distributed or parallel set-up in script?: <fill in> - Using GPU in script?: <fill in> - GPU type: NVIDIA GeForce RTX 4090 ### Who can help? When deploying the Qwen2.5-Omni-7B-AWQ model using vLLM v0.9.2 and Transformers v4.53.1, the service fails to start due to a type mismatch in the image processor class. Error log: `The system expects a Qwen2VLImageProcessor, but receives a Qwen2VLImageProcessorFast instead.` The issue is resolved by downgrading transformers to version 4.52.4. ### Information - [x] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction image: vllm/vllm-openai:v0.9.2 command: > --model=/models/Qwen/Qwen2.5-Omni-7B-AWQ --tokenizer_mode="auto" --uvicorn-log-level=debug --host=0.0.0.0 --port=8000 --served-model-name=qwen2.5-omni --enable-auto-tool-choice --tool-call-parser hermes --dtype=bfloat16 --trust_remote_code --tensor_parallel_size=1 ### Expected behavior Ideally, the model should load and function correctly without requiring a downgrade of the transformers library.
{ "login": "GreenerZ", "id": 30285532, "node_id": "MDQ6VXNlcjMwMjg1NTMy", "avatar_url": "https://avatars.githubusercontent.com/u/30285532?v=4", "gravatar_id": "", "url": "https://api.github.com/users/GreenerZ", "html_url": "https://github.com/GreenerZ", "followers_url": "https://api.github.com/users/GreenerZ/followers", "following_url": "https://api.github.com/users/GreenerZ/following{/other_user}", "gists_url": "https://api.github.com/users/GreenerZ/gists{/gist_id}", "starred_url": "https://api.github.com/users/GreenerZ/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/GreenerZ/subscriptions", "organizations_url": "https://api.github.com/users/GreenerZ/orgs", "repos_url": "https://api.github.com/users/GreenerZ/repos", "events_url": "https://api.github.com/users/GreenerZ/events{/privacy}", "received_events_url": "https://api.github.com/users/GreenerZ/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39359/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39359/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39358
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39358/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39358/comments
https://api.github.com/repos/huggingface/transformers/issues/39358/events
https://github.com/huggingface/transformers/pull/39358
3,222,476,207
PR_kwDOCUB6oc6eeSRy
39,358
fix(siglip2): improve pooled output logic in modular file
{ "login": "sameerajashyam", "id": 79054143, "node_id": "MDQ6VXNlcjc5MDU0MTQz", "avatar_url": "https://avatars.githubusercontent.com/u/79054143?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sameerajashyam", "html_url": "https://github.com/sameerajashyam", "followers_url": "https://api.github.com/users/sameerajashyam/followers", "following_url": "https://api.github.com/users/sameerajashyam/following{/other_user}", "gists_url": "https://api.github.com/users/sameerajashyam/gists{/gist_id}", "starred_url": "https://api.github.com/users/sameerajashyam/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sameerajashyam/subscriptions", "organizations_url": "https://api.github.com/users/sameerajashyam/orgs", "repos_url": "https://api.github.com/users/sameerajashyam/repos", "events_url": "https://api.github.com/users/sameerajashyam/events{/privacy}", "received_events_url": "https://api.github.com/users/sameerajashyam/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T10:57:35
2025-07-11T11:41:30
2025-07-11T11:41:16
CONTRIBUTOR
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39358", "html_url": "https://github.com/huggingface/transformers/pull/39358", "diff_url": "https://github.com/huggingface/transformers/pull/39358.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39358.patch", "merged_at": null }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [x] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [x] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface, @SunMarc and @qgallouedec - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "sameerajashyam", "id": 79054143, "node_id": "MDQ6VXNlcjc5MDU0MTQz", "avatar_url": "https://avatars.githubusercontent.com/u/79054143?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sameerajashyam", "html_url": "https://github.com/sameerajashyam", "followers_url": "https://api.github.com/users/sameerajashyam/followers", "following_url": "https://api.github.com/users/sameerajashyam/following{/other_user}", "gists_url": "https://api.github.com/users/sameerajashyam/gists{/gist_id}", "starred_url": "https://api.github.com/users/sameerajashyam/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sameerajashyam/subscriptions", "organizations_url": "https://api.github.com/users/sameerajashyam/orgs", "repos_url": "https://api.github.com/users/sameerajashyam/repos", "events_url": "https://api.github.com/users/sameerajashyam/events{/privacy}", "received_events_url": "https://api.github.com/users/sameerajashyam/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39358/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39358/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39357
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39357/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39357/comments
https://api.github.com/repos/huggingface/transformers/issues/39357/events
https://github.com/huggingface/transformers/pull/39357
3,222,421,354
PR_kwDOCUB6oc6eeGPX
39,357
Update docstring for glm4v
{ "login": "ddeellttaa", "id": 49911567, "node_id": "MDQ6VXNlcjQ5OTExNTY3", "avatar_url": "https://avatars.githubusercontent.com/u/49911567?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ddeellttaa", "html_url": "https://github.com/ddeellttaa", "followers_url": "https://api.github.com/users/ddeellttaa/followers", "following_url": "https://api.github.com/users/ddeellttaa/following{/other_user}", "gists_url": "https://api.github.com/users/ddeellttaa/gists{/gist_id}", "starred_url": "https://api.github.com/users/ddeellttaa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ddeellttaa/subscriptions", "organizations_url": "https://api.github.com/users/ddeellttaa/orgs", "repos_url": "https://api.github.com/users/ddeellttaa/repos", "events_url": "https://api.github.com/users/ddeellttaa/events{/privacy}", "received_events_url": "https://api.github.com/users/ddeellttaa/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-07-11T10:36:34
2025-07-14T03:04:48
null
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39357", "html_url": "https://github.com/huggingface/transformers/pull/39357", "diff_url": "https://github.com/huggingface/transformers/pull/39357.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39357.patch", "merged_at": null }
What does this PR do? This PR updates the class docstring example for a multimodal model, addressing the following issues: Corrects the input type: Changes the model input from a list to a tensor, aligning with model requirements. Properly handles image input: Adds the image to the model input for correct multimodal inference. Updates to processor usage: Replaces the use of tokenizer with processor for appropriate multimodal preprocessing. These changes ensure that the documentation provides a runnable and accurate example for users working with multimodal models. ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] (not needed) Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] (not needed) Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39357/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39357/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/39356
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39356/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39356/comments
https://api.github.com/repos/huggingface/transformers/issues/39356/events
https://github.com/huggingface/transformers/pull/39356
3,222,387,894
PR_kwDOCUB6oc6ed-1X
39,356
Update modeling_glm4v.py
{ "login": "ddeellttaa", "id": 49911567, "node_id": "MDQ6VXNlcjQ5OTExNTY3", "avatar_url": "https://avatars.githubusercontent.com/u/49911567?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ddeellttaa", "html_url": "https://github.com/ddeellttaa", "followers_url": "https://api.github.com/users/ddeellttaa/followers", "following_url": "https://api.github.com/users/ddeellttaa/following{/other_user}", "gists_url": "https://api.github.com/users/ddeellttaa/gists{/gist_id}", "starred_url": "https://api.github.com/users/ddeellttaa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ddeellttaa/subscriptions", "organizations_url": "https://api.github.com/users/ddeellttaa/orgs", "repos_url": "https://api.github.com/users/ddeellttaa/repos", "events_url": "https://api.github.com/users/ddeellttaa/events{/privacy}", "received_events_url": "https://api.github.com/users/ddeellttaa/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T10:23:33
2025-07-11T10:24:40
2025-07-11T10:24:40
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39356", "html_url": "https://github.com/huggingface/transformers/pull/39356", "diff_url": "https://github.com/huggingface/transformers/pull/39356.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39356.patch", "merged_at": null }
Fix example: resolve input, image, and processor usage issues - Changed input from list to tensor to match model requirements. - Added image to model input to enable correct multimodal inference. - Replaced tokenizer with processor for proper multimodal preprocessing.
{ "login": "ddeellttaa", "id": 49911567, "node_id": "MDQ6VXNlcjQ5OTExNTY3", "avatar_url": "https://avatars.githubusercontent.com/u/49911567?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ddeellttaa", "html_url": "https://github.com/ddeellttaa", "followers_url": "https://api.github.com/users/ddeellttaa/followers", "following_url": "https://api.github.com/users/ddeellttaa/following{/other_user}", "gists_url": "https://api.github.com/users/ddeellttaa/gists{/gist_id}", "starred_url": "https://api.github.com/users/ddeellttaa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ddeellttaa/subscriptions", "organizations_url": "https://api.github.com/users/ddeellttaa/orgs", "repos_url": "https://api.github.com/users/ddeellttaa/repos", "events_url": "https://api.github.com/users/ddeellttaa/events{/privacy}", "received_events_url": "https://api.github.com/users/ddeellttaa/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39356/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39356/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39355
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39355/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39355/comments
https://api.github.com/repos/huggingface/transformers/issues/39355/events
https://github.com/huggingface/transformers/pull/39355
3,222,351,369
PR_kwDOCUB6oc6ed2uw
39,355
Update modular_glm4v.py
{ "login": "ddeellttaa", "id": 49911567, "node_id": "MDQ6VXNlcjQ5OTExNTY3", "avatar_url": "https://avatars.githubusercontent.com/u/49911567?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ddeellttaa", "html_url": "https://github.com/ddeellttaa", "followers_url": "https://api.github.com/users/ddeellttaa/followers", "following_url": "https://api.github.com/users/ddeellttaa/following{/other_user}", "gists_url": "https://api.github.com/users/ddeellttaa/gists{/gist_id}", "starred_url": "https://api.github.com/users/ddeellttaa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ddeellttaa/subscriptions", "organizations_url": "https://api.github.com/users/ddeellttaa/orgs", "repos_url": "https://api.github.com/users/ddeellttaa/repos", "events_url": "https://api.github.com/users/ddeellttaa/events{/privacy}", "received_events_url": "https://api.github.com/users/ddeellttaa/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T10:10:35
2025-07-11T10:16:15
2025-07-11T10:16:15
NONE
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39355", "html_url": "https://github.com/huggingface/transformers/pull/39355", "diff_url": "https://github.com/huggingface/transformers/pull/39355.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39355.patch", "merged_at": null }
Fix example: input should be tensor, and image was missing in model input - Correct input type from list to tensor. - Ensure image is properly passed to the model. # What does this PR do? This PR fixes two issues in the example code provided in the class docstring: Input Type Correction: The example in the docstring previously showed passing a list as input to the model, which is incorrect because the model expects a tensor. This has been updated to use a tensor as input. Image Input Handling: The docstring example did not show how to properly pass the image to the model, which is required for correct multimodal inference. This has now been added. Motivation and Context These changes update the class documentation to provide a correct and functional usage example for multimodal inference, helping users avoid confusion and errors when referring to the docstring. Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface, @SunMarc and @qgallouedec - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "ddeellttaa", "id": 49911567, "node_id": "MDQ6VXNlcjQ5OTExNTY3", "avatar_url": "https://avatars.githubusercontent.com/u/49911567?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ddeellttaa", "html_url": "https://github.com/ddeellttaa", "followers_url": "https://api.github.com/users/ddeellttaa/followers", "following_url": "https://api.github.com/users/ddeellttaa/following{/other_user}", "gists_url": "https://api.github.com/users/ddeellttaa/gists{/gist_id}", "starred_url": "https://api.github.com/users/ddeellttaa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ddeellttaa/subscriptions", "organizations_url": "https://api.github.com/users/ddeellttaa/orgs", "repos_url": "https://api.github.com/users/ddeellttaa/repos", "events_url": "https://api.github.com/users/ddeellttaa/events{/privacy}", "received_events_url": "https://api.github.com/users/ddeellttaa/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39355/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39355/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39354
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39354/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39354/comments
https://api.github.com/repos/huggingface/transformers/issues/39354/events
https://github.com/huggingface/transformers/pull/39354
3,222,261,313
PR_kwDOCUB6oc6edi79
39,354
Create ijepa model card (ref: PR #36979)
{ "login": "dhruvmalik007", "id": 29146213, "node_id": "MDQ6VXNlcjI5MTQ2MjEz", "avatar_url": "https://avatars.githubusercontent.com/u/29146213?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhruvmalik007", "html_url": "https://github.com/dhruvmalik007", "followers_url": "https://api.github.com/users/dhruvmalik007/followers", "following_url": "https://api.github.com/users/dhruvmalik007/following{/other_user}", "gists_url": "https://api.github.com/users/dhruvmalik007/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhruvmalik007/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhruvmalik007/subscriptions", "organizations_url": "https://api.github.com/users/dhruvmalik007/orgs", "repos_url": "https://api.github.com/users/dhruvmalik007/repos", "events_url": "https://api.github.com/users/dhruvmalik007/events{/privacy}", "received_events_url": "https://api.github.com/users/dhruvmalik007/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T09:37:58
2025-07-16T19:40:22
2025-07-16T19:40:22
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39354", "html_url": "https://github.com/huggingface/transformers/pull/39354", "diff_url": "https://github.com/huggingface/transformers/pull/39354.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39354.patch", "merged_at": "2025-07-16T19:40:22" }
# What does this PR do?
<!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable -->
#36979: adds the descriptive model card based on the latest standard.
## Before submitting
- [X] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [X] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@stevhliu
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39354/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39354/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39353
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39353/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39353/comments
https://api.github.com/repos/huggingface/transformers/issues/39353/events
https://github.com/huggingface/transformers/pull/39353
3,222,256,572
PR_kwDOCUB6oc6edh5q
39,353
fix colpali mapping
{ "login": "ManuelFay", "id": 43467008, "node_id": "MDQ6VXNlcjQzNDY3MDA4", "avatar_url": "https://avatars.githubusercontent.com/u/43467008?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ManuelFay", "html_url": "https://github.com/ManuelFay", "followers_url": "https://api.github.com/users/ManuelFay/followers", "following_url": "https://api.github.com/users/ManuelFay/following{/other_user}", "gists_url": "https://api.github.com/users/ManuelFay/gists{/gist_id}", "starred_url": "https://api.github.com/users/ManuelFay/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ManuelFay/subscriptions", "organizations_url": "https://api.github.com/users/ManuelFay/orgs", "repos_url": "https://api.github.com/users/ManuelFay/repos", "events_url": "https://api.github.com/users/ManuelFay/events{/privacy}", "received_events_url": "https://api.github.com/users/ManuelFay/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
null
[]
null
[]
2025-07-11T09:36:06
2025-07-14T06:32:53
null
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39353", "html_url": "https://github.com/huggingface/transformers/pull/39353", "diff_url": "https://github.com/huggingface/transformers/pull/39353.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39353.patch", "merged_at": null }
# What does this PR do?
Fixes ColPali not working since the breaking changes to VLMs in 4.52. Most models were converted, but not the native HF ColPali, making the code in the docs here fail: https://huggingface.co/docs/transformers/model_doc/colpali. @yonigozlan @zucchini-nlp
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39353/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39353/timeline
null
null
null
null
true
false
https://api.github.com/repos/huggingface/transformers/issues/39352
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39352/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39352/comments
https://api.github.com/repos/huggingface/transformers/issues/39352/events
https://github.com/huggingface/transformers/issues/39352
3,222,223,472
I_kwDOCUB6oc7ADzpw
39,352
env.useBrowserCache = true causes JSON parsing error, forced to disable cache making app slower.
{ "login": "piyushh-k", "id": 180021205, "node_id": "U_kgDOCrrn1Q", "avatar_url": "https://avatars.githubusercontent.com/u/180021205?v=4", "gravatar_id": "", "url": "https://api.github.com/users/piyushh-k", "html_url": "https://github.com/piyushh-k", "followers_url": "https://api.github.com/users/piyushh-k/followers", "following_url": "https://api.github.com/users/piyushh-k/following{/other_user}", "gists_url": "https://api.github.com/users/piyushh-k/gists{/gist_id}", "starred_url": "https://api.github.com/users/piyushh-k/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/piyushh-k/subscriptions", "organizations_url": "https://api.github.com/users/piyushh-k/orgs", "repos_url": "https://api.github.com/users/piyushh-k/repos", "events_url": "https://api.github.com/users/piyushh-k/events{/privacy}", "received_events_url": "https://api.github.com/users/piyushh-k/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T09:23:53
2025-09-15T08:03:33
2025-09-15T08:03:33
NONE
null
null
null
null
Hey! 👋 I'm using @xenova/transformers in a browser-based transcription app. I noticed that when I set env.useBrowserCache = true, it introduces multiple bugs, the main one being a JSON parsing error during or after model loading. Because of this, I have to set env.useBrowserCache = false, which fixes the errors but causes a major performance hit, since the model needs to be re-downloaded every time the page reloads or the transcription restarts.
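A workaround that keeps model files persisted without relying on the built-in browser-cache path might look like the following configuration fragment (a minimal sketch, not a confirmed fix: it assumes transformers.js's `env.useCustomCache`/`env.customCache` settings as documented, and the cache name `'transformers-cache'` plus the `Xenova/whisper-tiny.en` model are illustrative choices). A `Cache` object returned by the browser's `caches.open()` already exposes the async `match`/`put` interface a custom cache needs:

```javascript
import { env, pipeline } from '@xenova/transformers';

// Keep the problematic built-in browser-cache path disabled...
env.useBrowserCache = false;
// ...but plug in our own cache so models still persist across page reloads.
// (Assumption: env.useCustomCache / env.customCache as documented for
// transformers.js; the custom cache only needs match(request) and
// put(request, response), which a Cache Storage `Cache` object provides.)
env.useCustomCache = true;
env.customCache = await caches.open('transformers-cache');

// Hypothetical model choice for illustration only.
const transcriber = await pipeline(
  'automatic-speech-recognition',
  'Xenova/whisper-tiny.en'
);
```

This runs only in a browser context (it depends on the Cache Storage API), so it is a sketch of the configuration rather than a standalone script.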
{ "login": "github-actions[bot]", "id": 41898282, "node_id": "MDM6Qm90NDE4OTgyODI=", "avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/github-actions%5Bbot%5D", "html_url": "https://github.com/apps/github-actions", "followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers", "following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}", "gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}", "starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions", "organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs", "repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos", "events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}", "received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events", "type": "Bot", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39352/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39352/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true
https://api.github.com/repos/huggingface/transformers/issues/39351
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39351/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39351/comments
https://api.github.com/repos/huggingface/transformers/issues/39351/events
https://github.com/huggingface/transformers/pull/39351
3,222,199,935
PR_kwDOCUB6oc6edVad
39,351
fix gpt2 usage doc
{ "login": "Xiang-cd", "id": 66510463, "node_id": "MDQ6VXNlcjY2NTEwNDYz", "avatar_url": "https://avatars.githubusercontent.com/u/66510463?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Xiang-cd", "html_url": "https://github.com/Xiang-cd", "followers_url": "https://api.github.com/users/Xiang-cd/followers", "following_url": "https://api.github.com/users/Xiang-cd/following{/other_user}", "gists_url": "https://api.github.com/users/Xiang-cd/gists{/gist_id}", "starred_url": "https://api.github.com/users/Xiang-cd/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Xiang-cd/subscriptions", "organizations_url": "https://api.github.com/users/Xiang-cd/orgs", "repos_url": "https://api.github.com/users/Xiang-cd/repos", "events_url": "https://api.github.com/users/Xiang-cd/events{/privacy}", "received_events_url": "https://api.github.com/users/Xiang-cd/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T09:17:00
2025-07-11T17:59:41
2025-07-11T17:59:41
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39351", "html_url": "https://github.com/huggingface/transformers/pull/39351", "diff_url": "https://github.com/huggingface/transformers/pull/39351.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39351.patch", "merged_at": "2025-07-11T17:59:41" }
# What does this PR do?
The GPT-2 usage example in the docs cannot be executed correctly as written. @stevhliu
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39351/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39351/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39350
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39350/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39350/comments
https://api.github.com/repos/huggingface/transformers/issues/39350/events
https://github.com/huggingface/transformers/issues/39350
3,222,166,666
I_kwDOCUB6oc7ADlyK
39,350
Support for google/medgemma-27b-it
{ "login": "moshilangzi", "id": 22696706, "node_id": "MDQ6VXNlcjIyNjk2NzA2", "avatar_url": "https://avatars.githubusercontent.com/u/22696706?v=4", "gravatar_id": "", "url": "https://api.github.com/users/moshilangzi", "html_url": "https://github.com/moshilangzi", "followers_url": "https://api.github.com/users/moshilangzi/followers", "following_url": "https://api.github.com/users/moshilangzi/following{/other_user}", "gists_url": "https://api.github.com/users/moshilangzi/gists{/gist_id}", "starred_url": "https://api.github.com/users/moshilangzi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/moshilangzi/subscriptions", "organizations_url": "https://api.github.com/users/moshilangzi/orgs", "repos_url": "https://api.github.com/users/moshilangzi/repos", "events_url": "https://api.github.com/users/moshilangzi/events{/privacy}", "received_events_url": "https://api.github.com/users/moshilangzi/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 1843244711, "node_id": "MDU6TGFiZWwxODQzMjQ0NzEx", "url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model", "name": "New model", "color": "fbca04", "default": false, "description": "" } ]
open
false
null
[]
null
[]
2025-07-11T09:06:45
2025-07-11T11:09:02
null
NONE
null
null
null
null
### Model description
Support for google/medgemma-27b-it
### Open source status
- [x] The model implementation is available
- [x] The model weights are available
### Provide useful links for the implementation
Support for google/medgemma-27b-it
null
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39350/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39350/timeline
null
null
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
false
https://api.github.com/repos/huggingface/transformers/issues/39349
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39349/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39349/comments
https://api.github.com/repos/huggingface/transformers/issues/39349/events
https://github.com/huggingface/transformers/pull/39349
3,222,086,300
PR_kwDOCUB6oc6ec8R1
39,349
Fix the issue that the CSM model cannot work in pipeline mode.
{ "login": "yuanwu2017", "id": 34643241, "node_id": "MDQ6VXNlcjM0NjQzMjQx", "avatar_url": "https://avatars.githubusercontent.com/u/34643241?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yuanwu2017", "html_url": "https://github.com/yuanwu2017", "followers_url": "https://api.github.com/users/yuanwu2017/followers", "following_url": "https://api.github.com/users/yuanwu2017/following{/other_user}", "gists_url": "https://api.github.com/users/yuanwu2017/gists{/gist_id}", "starred_url": "https://api.github.com/users/yuanwu2017/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yuanwu2017/subscriptions", "organizations_url": "https://api.github.com/users/yuanwu2017/orgs", "repos_url": "https://api.github.com/users/yuanwu2017/repos", "events_url": "https://api.github.com/users/yuanwu2017/events{/privacy}", "received_events_url": "https://api.github.com/users/yuanwu2017/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T08:40:20
2025-09-10T16:17:36
2025-09-10T16:17:35
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39349", "html_url": "https://github.com/huggingface/transformers/pull/39349", "diff_url": "https://github.com/huggingface/transformers/pull/39349.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39349.patch", "merged_at": "2025-09-10T16:17:35" }
# What does this PR do?
Fixes the CSM model (text-to-audio) not working in pipeline mode. Example:
```
import torch
from transformers import pipeline

device = "hpu"
if torch.cuda.is_available():
    device = "cuda"
print(f"device: {device}")

tts_pipeline = pipeline(
    "text-to-audio",
    model="sesame/csm-1b",
    device=device
)
generate_kwargs = {
    "output_audio": True,
}
text = "[0]Hello from Sesame."
audio_output = tts_pipeline(text, generate_kwargs=generate_kwargs)
print(f"audio_output: {audio_output}")
audio = audio_output["audio"] if isinstance(audio_output["audio"], torch.Tensor) and len(audio_output["audio"].shape) == 1 else audio_output["audio"][0]

import soundfile as sf
sf.write("example_with_pipeline.wav", audio, samplerate=audio_output["sampling_rate"])
```
Before patch: <img width="980" height="194" alt="image" src="https://github.com/user-attachments/assets/fdfcebfe-1c9f-4d16-8a7e-d983d17e3b24" /> After patch: <img width="1036" height="73" alt="image" src="https://github.com/user-attachments/assets/525c7bc1-fc2a-4168-8fb5-2fda3fe1d875" /> Batch inference with the original example:
```
import torch
from transformers import CsmForConditionalGeneration, AutoProcessor

model_id = "sesame/csm-1b"
device = "hpu"
if torch.cuda.is_available():
    device = "cuda"

# load the model and the processor
processor = AutoProcessor.from_pretrained(model_id)
model = CsmForConditionalGeneration.from_pretrained(model_id, device_map=device)

# prepare the inputs
text = ["[0]Hello from Sesame.", "[1]Open sesame."]  # `[0]` for speaker id 0
inputs = processor(text, add_special_tokens=True).to(device)

audio = model.generate(**inputs, output_audio=True)
print(f"audio:{audio}")
processor.save_audio(audio, ["1.wav", "2.wav"])
```
Result: <img width="760" height="59" alt="image" src="https://github.com/user-attachments/assets/b2cca836-ff96-4a0e-9e81-209a83bb6b24" />
{ "login": "eustlb", "id": 94853470, "node_id": "U_kgDOBadZXg", "avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4", "gravatar_id": "", "url": "https://api.github.com/users/eustlb", "html_url": "https://github.com/eustlb", "followers_url": "https://api.github.com/users/eustlb/followers", "following_url": "https://api.github.com/users/eustlb/following{/other_user}", "gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}", "starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/eustlb/subscriptions", "organizations_url": "https://api.github.com/users/eustlb/orgs", "repos_url": "https://api.github.com/users/eustlb/repos", "events_url": "https://api.github.com/users/eustlb/events{/privacy}", "received_events_url": "https://api.github.com/users/eustlb/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39349/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39349/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39348
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39348/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39348/comments
https://api.github.com/repos/huggingface/transformers/issues/39348/events
https://github.com/huggingface/transformers/pull/39348
3,221,671,336
PR_kwDOCUB6oc6ebjwR
39,348
[shieldgemma] fix checkpoint loading
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T06:05:12
2025-07-14T06:34:58
2025-07-14T06:34:58
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39348", "html_url": "https://github.com/huggingface/transformers/pull/39348", "diff_url": "https://github.com/huggingface/transformers/pull/39348.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39348.patch", "merged_at": "2025-07-14T06:34:58" }
# What does this PR do? As per title, reported by @ydshieh internally (how many more models we have that load a VLM internally 🥲 )
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39348/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39348/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39347
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39347/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39347/comments
https://api.github.com/repos/huggingface/transformers/issues/39347/events
https://github.com/huggingface/transformers/pull/39347
3,221,575,433
PR_kwDOCUB6oc6ebOiw
39,347
[Models][LFM2] LFM2 name remap
{ "login": "paulpak58", "id": 52512091, "node_id": "MDQ6VXNlcjUyNTEyMDkx", "avatar_url": "https://avatars.githubusercontent.com/u/52512091?v=4", "gravatar_id": "", "url": "https://api.github.com/users/paulpak58", "html_url": "https://github.com/paulpak58", "followers_url": "https://api.github.com/users/paulpak58/followers", "following_url": "https://api.github.com/users/paulpak58/following{/other_user}", "gists_url": "https://api.github.com/users/paulpak58/gists{/gist_id}", "starred_url": "https://api.github.com/users/paulpak58/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/paulpak58/subscriptions", "organizations_url": "https://api.github.com/users/paulpak58/orgs", "repos_url": "https://api.github.com/users/paulpak58/repos", "events_url": "https://api.github.com/users/paulpak58/events{/privacy}", "received_events_url": "https://api.github.com/users/paulpak58/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T05:25:39
2025-07-11T15:28:46
2025-07-11T15:28:46
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39347", "html_url": "https://github.com/huggingface/transformers/pull/39347", "diff_url": "https://github.com/huggingface/transformers/pull/39347.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39347.patch", "merged_at": null }
# What does this PR do?
Fixes # (issue)
{ "login": "paulpak58", "id": 52512091, "node_id": "MDQ6VXNlcjUyNTEyMDkx", "avatar_url": "https://avatars.githubusercontent.com/u/52512091?v=4", "gravatar_id": "", "url": "https://api.github.com/users/paulpak58", "html_url": "https://github.com/paulpak58", "followers_url": "https://api.github.com/users/paulpak58/followers", "following_url": "https://api.github.com/users/paulpak58/following{/other_user}", "gists_url": "https://api.github.com/users/paulpak58/gists{/gist_id}", "starred_url": "https://api.github.com/users/paulpak58/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/paulpak58/subscriptions", "organizations_url": "https://api.github.com/users/paulpak58/orgs", "repos_url": "https://api.github.com/users/paulpak58/repos", "events_url": "https://api.github.com/users/paulpak58/events{/privacy}", "received_events_url": "https://api.github.com/users/paulpak58/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39347/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39347/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39346
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39346/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39346/comments
https://api.github.com/repos/huggingface/transformers/issues/39346/events
https://github.com/huggingface/transformers/pull/39346
3,221,431,884
PR_kwDOCUB6oc6eavJr
39,346
Use np.pad instead of np.lib.pad.
{ "login": "rasmi", "id": 2267370, "node_id": "MDQ6VXNlcjIyNjczNzA=", "avatar_url": "https://avatars.githubusercontent.com/u/2267370?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rasmi", "html_url": "https://github.com/rasmi", "followers_url": "https://api.github.com/users/rasmi/followers", "following_url": "https://api.github.com/users/rasmi/following{/other_user}", "gists_url": "https://api.github.com/users/rasmi/gists{/gist_id}", "starred_url": "https://api.github.com/users/rasmi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rasmi/subscriptions", "organizations_url": "https://api.github.com/users/rasmi/orgs", "repos_url": "https://api.github.com/users/rasmi/repos", "events_url": "https://api.github.com/users/rasmi/events{/privacy}", "received_events_url": "https://api.github.com/users/rasmi/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-11T04:03:42
2025-07-14T18:47:24
2025-07-14T16:05:28
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39346", "html_url": "https://github.com/huggingface/transformers/pull/39346", "diff_url": "https://github.com/huggingface/transformers/pull/39346.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39346.patch", "merged_at": "2025-07-14T16:05:28" }
# What does this PR do? `audio_utils.py` still uses `np.lib.pad` from numpy 1.X rather than `np.pad`. This fixes the single remaining call in `transformers` to use `np.pad`. ## Who can review? cc: @ArthurZucker, @Rocketknight1
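The change in this PR is a one-call rename. As a quick illustration (array values here are hypothetical, not taken from the PR diff): `np.lib.pad` was a legacy alias for `np.pad`, and the public `np.pad` spelling behaves identically.

```python
import numpy as np

# np.lib.pad was a legacy alias for np.pad; the public np.pad call is the
# supported spelling and produces the same result.
audio = np.array([0.1, 0.2, 0.3])

# Pad 2 zeros before and 1 zero after the signal.
padded = np.pad(audio, (2, 1), mode="constant", constant_values=0.0)
# padded is [0.0, 0.0, 0.1, 0.2, 0.3, 0.0]
```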
{ "login": "Rocketknight1", "id": 12866554, "node_id": "MDQ6VXNlcjEyODY2NTU0", "avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Rocketknight1", "html_url": "https://github.com/Rocketknight1", "followers_url": "https://api.github.com/users/Rocketknight1/followers", "following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}", "gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}", "starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions", "organizations_url": "https://api.github.com/users/Rocketknight1/orgs", "repos_url": "https://api.github.com/users/Rocketknight1/repos", "events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}", "received_events_url": "https://api.github.com/users/Rocketknight1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39346/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39346/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39345
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39345/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39345/comments
https://api.github.com/repos/huggingface/transformers/issues/39345/events
https://github.com/huggingface/transformers/pull/39345
3,220,592,314
PR_kwDOCUB6oc6eX49I
39,345
Trainer docs typo
{ "login": "jannisborn", "id": 15703818, "node_id": "MDQ6VXNlcjE1NzAzODE4", "avatar_url": "https://avatars.githubusercontent.com/u/15703818?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jannisborn", "html_url": "https://github.com/jannisborn", "followers_url": "https://api.github.com/users/jannisborn/followers", "following_url": "https://api.github.com/users/jannisborn/following{/other_user}", "gists_url": "https://api.github.com/users/jannisborn/gists{/gist_id}", "starred_url": "https://api.github.com/users/jannisborn/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jannisborn/subscriptions", "organizations_url": "https://api.github.com/users/jannisborn/orgs", "repos_url": "https://api.github.com/users/jannisborn/repos", "events_url": "https://api.github.com/users/jannisborn/events{/privacy}", "received_events_url": "https://api.github.com/users/jannisborn/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-10T20:23:01
2025-07-23T18:58:09
2025-07-23T18:58:09
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39345", "html_url": "https://github.com/huggingface/transformers/pull/39345", "diff_url": "https://github.com/huggingface/transformers/pull/39345.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39345.patch", "merged_at": null }
# What does this PR do? Fixes a typo in the docs <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Introduced in #38728, see https://github.com/huggingface/transformers/pull/38728/files ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
{ "login": "jannisborn", "id": 15703818, "node_id": "MDQ6VXNlcjE1NzAzODE4", "avatar_url": "https://avatars.githubusercontent.com/u/15703818?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jannisborn", "html_url": "https://github.com/jannisborn", "followers_url": "https://api.github.com/users/jannisborn/followers", "following_url": "https://api.github.com/users/jannisborn/following{/other_user}", "gists_url": "https://api.github.com/users/jannisborn/gists{/gist_id}", "starred_url": "https://api.github.com/users/jannisborn/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jannisborn/subscriptions", "organizations_url": "https://api.github.com/users/jannisborn/orgs", "repos_url": "https://api.github.com/users/jannisborn/repos", "events_url": "https://api.github.com/users/jannisborn/events{/privacy}", "received_events_url": "https://api.github.com/users/jannisborn/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39345/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39345/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39344
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39344/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39344/comments
https://api.github.com/repos/huggingface/transformers/issues/39344/events
https://github.com/huggingface/transformers/pull/39344
3,219,927,673
PR_kwDOCUB6oc6eVoyA
39,344
Update OLMoE model card
{ "login": "nlhmnlhmnlhm", "id": 61977480, "node_id": "MDQ6VXNlcjYxOTc3NDgw", "avatar_url": "https://avatars.githubusercontent.com/u/61977480?v=4", "gravatar_id": "", "url": "https://api.github.com/users/nlhmnlhmnlhm", "html_url": "https://github.com/nlhmnlhmnlhm", "followers_url": "https://api.github.com/users/nlhmnlhmnlhm/followers", "following_url": "https://api.github.com/users/nlhmnlhmnlhm/following{/other_user}", "gists_url": "https://api.github.com/users/nlhmnlhmnlhm/gists{/gist_id}", "starred_url": "https://api.github.com/users/nlhmnlhmnlhm/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nlhmnlhmnlhm/subscriptions", "organizations_url": "https://api.github.com/users/nlhmnlhmnlhm/orgs", "repos_url": "https://api.github.com/users/nlhmnlhmnlhm/repos", "events_url": "https://api.github.com/users/nlhmnlhmnlhm/events{/privacy}", "received_events_url": "https://api.github.com/users/nlhmnlhmnlhm/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-10T16:09:41
2025-07-26T07:27:40
2025-07-21T23:41:01
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39344", "html_url": "https://github.com/huggingface/transformers/pull/39344", "diff_url": "https://github.com/huggingface/transformers/pull/39344.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39344.patch", "merged_at": "2025-07-21T23:41:01" }
# What does this PR do? Update OLMoE card. Unfortunately my computer does not have enough power to run the code, and neither does Colab, so this stays a draft until I try a smaller OLMoE. <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case : https://github.com/huggingface/transformers/issues/36979. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. 
Feel free to tag members/contributors who may be interested in your PR.
{ "login": "stevhliu", "id": 59462357, "node_id": "MDQ6VXNlcjU5NDYyMzU3", "avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stevhliu", "html_url": "https://github.com/stevhliu", "followers_url": "https://api.github.com/users/stevhliu/followers", "following_url": "https://api.github.com/users/stevhliu/following{/other_user}", "gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}", "starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions", "organizations_url": "https://api.github.com/users/stevhliu/orgs", "repos_url": "https://api.github.com/users/stevhliu/repos", "events_url": "https://api.github.com/users/stevhliu/events{/privacy}", "received_events_url": "https://api.github.com/users/stevhliu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39344/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39344/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39343
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39343/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39343/comments
https://api.github.com/repos/huggingface/transformers/issues/39343/events
https://github.com/huggingface/transformers/pull/39343
3,219,798,098
PR_kwDOCUB6oc6eVMRh
39,343
[tests] tag serve tests as slow
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/followers", "following_url": "https://api.github.com/users/gante/following{/other_user}", "gists_url": "https://api.github.com/users/gante/gists{/gist_id}", "starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gante/subscriptions", "organizations_url": "https://api.github.com/users/gante/orgs", "repos_url": "https://api.github.com/users/gante/repos", "events_url": "https://api.github.com/users/gante/events{/privacy}", "received_events_url": "https://api.github.com/users/gante/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-10T15:27:18
2025-07-10T15:44:21
2025-07-10T15:40:08
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39343", "html_url": "https://github.com/huggingface/transformers/pull/39343", "diff_url": "https://github.com/huggingface/transformers/pull/39343.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39343.patch", "merged_at": "2025-07-10T15:40:08" }
# What does this PR do? These tests are not slow and don't require GPUs, but they're being flaky on the CI setup. This PR tags them as slow to unblock CI.
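For context, this is roughly how a `slow` marker like the one in `transformers.testing_utils` gates tests — a minimal re-implementation sketch, not the library's exact code: the decorated test only runs when `RUN_SLOW=1` is exported, which keeps flaky or heavy tests out of the default CI job.

```python
import os
import unittest

# Sketch of an @slow gate: skip the test unless RUN_SLOW=1 is set.
def slow(test_case):
    return unittest.skipUnless(
        os.environ.get("RUN_SLOW", "0") == "1", "test is slow"
    )(test_case)

class ServeTests(unittest.TestCase):
    @slow  # tagged slow: skipped on the default CI run, executed in slow runs
    def test_serve_endpoint(self):
        self.assertTrue(True)
```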
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/followers", "following_url": "https://api.github.com/users/gante/following{/other_user}", "gists_url": "https://api.github.com/users/gante/gists{/gist_id}", "starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gante/subscriptions", "organizations_url": "https://api.github.com/users/gante/orgs", "repos_url": "https://api.github.com/users/gante/repos", "events_url": "https://api.github.com/users/gante/events{/privacy}", "received_events_url": "https://api.github.com/users/gante/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39343/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39343/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39342
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39342/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39342/comments
https://api.github.com/repos/huggingface/transformers/issues/39342/events
https://github.com/huggingface/transformers/pull/39342
3,219,627,241
PR_kwDOCUB6oc6eUmYX
39,342
[modeling][lfm2] LFM2: Remove deprecated seen_tokens
{ "login": "paulpak58", "id": 52512091, "node_id": "MDQ6VXNlcjUyNTEyMDkx", "avatar_url": "https://avatars.githubusercontent.com/u/52512091?v=4", "gravatar_id": "", "url": "https://api.github.com/users/paulpak58", "html_url": "https://github.com/paulpak58", "followers_url": "https://api.github.com/users/paulpak58/followers", "following_url": "https://api.github.com/users/paulpak58/following{/other_user}", "gists_url": "https://api.github.com/users/paulpak58/gists{/gist_id}", "starred_url": "https://api.github.com/users/paulpak58/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/paulpak58/subscriptions", "organizations_url": "https://api.github.com/users/paulpak58/orgs", "repos_url": "https://api.github.com/users/paulpak58/repos", "events_url": "https://api.github.com/users/paulpak58/events{/privacy}", "received_events_url": "https://api.github.com/users/paulpak58/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-10T14:39:00
2025-07-10T15:27:56
2025-07-10T15:27:56
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39342", "html_url": "https://github.com/huggingface/transformers/pull/39342", "diff_url": "https://github.com/huggingface/transformers/pull/39342.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39342.patch", "merged_at": "2025-07-10T15:27:56" }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39342/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39342/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39341
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39341/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39341/comments
https://api.github.com/repos/huggingface/transformers/issues/39341/events
https://github.com/huggingface/transformers/pull/39341
3,219,499,665
PR_kwDOCUB6oc6eUKIb
39,341
fix max_length calculation using cu_seq_lens
{ "login": "KKZ20", "id": 46569774, "node_id": "MDQ6VXNlcjQ2NTY5Nzc0", "avatar_url": "https://avatars.githubusercontent.com/u/46569774?v=4", "gravatar_id": "", "url": "https://api.github.com/users/KKZ20", "html_url": "https://github.com/KKZ20", "followers_url": "https://api.github.com/users/KKZ20/followers", "following_url": "https://api.github.com/users/KKZ20/following{/other_user}", "gists_url": "https://api.github.com/users/KKZ20/gists{/gist_id}", "starred_url": "https://api.github.com/users/KKZ20/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/KKZ20/subscriptions", "organizations_url": "https://api.github.com/users/KKZ20/orgs", "repos_url": "https://api.github.com/users/KKZ20/repos", "events_url": "https://api.github.com/users/KKZ20/events{/privacy}", "received_events_url": "https://api.github.com/users/KKZ20/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-10T14:03:06
2025-07-17T08:59:51
2025-07-17T08:54:23
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39341", "html_url": "https://github.com/huggingface/transformers/pull/39341", "diff_url": "https://github.com/huggingface/transformers/pull/39341.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39341.patch", "merged_at": "2025-07-17T08:54:23" }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
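The title's fix concerns deriving `max_length` from cumulative sequence lengths. A small stdlib sketch of that derivation (example values are hypothetical, not from the PR): for a packed batch, `cu_seq_lens` stores cumulative token offsets, so per-sequence lengths are the deltas between consecutive entries, and the maximum delta — not the final offset, which is the total token count — is the correct `max_length` to hand to an attention kernel.

```python
# Packed batch of three sequences with lengths 3, 5 and 2 (hypothetical values).
cu_seq_lens = [0, 3, 8, 10]

# Per-sequence lengths are the deltas between consecutive cumulative offsets.
lengths = [end - start for start, end in zip(cu_seq_lens, cu_seq_lens[1:])]

# Take the max over the deltas, not cu_seq_lens[-1] (the total token count).
max_length = max(lengths)
```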
{ "login": "zucchini-nlp", "id": 100715397, "node_id": "U_kgDOBgDLhQ", "avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zucchini-nlp", "html_url": "https://github.com/zucchini-nlp", "followers_url": "https://api.github.com/users/zucchini-nlp/followers", "following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}", "gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}", "starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions", "organizations_url": "https://api.github.com/users/zucchini-nlp/orgs", "repos_url": "https://api.github.com/users/zucchini-nlp/repos", "events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}", "received_events_url": "https://api.github.com/users/zucchini-nlp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39341/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39341/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39340
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39340/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39340/comments
https://api.github.com/repos/huggingface/transformers/issues/39340/events
https://github.com/huggingface/transformers/pull/39340
3,219,418,017
PR_kwDOCUB6oc6eT4FT
39,340
LFM2
{ "login": "paulpak58", "id": 52512091, "node_id": "MDQ6VXNlcjUyNTEyMDkx", "avatar_url": "https://avatars.githubusercontent.com/u/52512091?v=4", "gravatar_id": "", "url": "https://api.github.com/users/paulpak58", "html_url": "https://github.com/paulpak58", "followers_url": "https://api.github.com/users/paulpak58/followers", "following_url": "https://api.github.com/users/paulpak58/following{/other_user}", "gists_url": "https://api.github.com/users/paulpak58/gists{/gist_id}", "starred_url": "https://api.github.com/users/paulpak58/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/paulpak58/subscriptions", "organizations_url": "https://api.github.com/users/paulpak58/orgs", "repos_url": "https://api.github.com/users/paulpak58/repos", "events_url": "https://api.github.com/users/paulpak58/events{/privacy}", "received_events_url": "https://api.github.com/users/paulpak58/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-10T13:41:10
2025-07-11T15:14:09
2025-07-10T14:07:34
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39340", "html_url": "https://github.com/huggingface/transformers/pull/39340", "diff_url": "https://github.com/huggingface/transformers/pull/39340.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39340.patch", "merged_at": "2025-07-10T14:07:33" }
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change. Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost. --> <!-- Remove if not applicable --> Fixes # (issue) ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section? - [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case. - [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). - [ ] Did you write any new necessary tests? ## Who can review? Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR. 
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @ If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**. Please tag fewer than 3 people. Models: - text models: @ArthurZucker - vision models: @amyeroberts, @qubvel - speech models: @eustlb - graph models: @clefourrier Library: - flax: @gante and @Rocketknight1 - generate: @zucchini-nlp (visual-language models) or @gante (all others) - pipelines: @Rocketknight1 - tensorflow: @gante and @Rocketknight1 - tokenizers: @ArthurZucker - trainer: @zach-huggingface, @SunMarc and @qgallouedec - chat templates: @Rocketknight1 Integrations: - deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface - ray/raytune: @richardliaw, @amogkam - Big Model Inference: @SunMarc - quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber Documentation: @stevhliu HF projects: - accelerate: [different repo](https://github.com/huggingface/accelerate) - datasets: [different repo](https://github.com/huggingface/datasets) - diffusers: [different repo](https://github.com/huggingface/diffusers) - rust tokenizers: [different repo](https://github.com/huggingface/tokenizers) Maintained examples (not research project or legacy): - Flax: @Rocketknight1 - PyTorch: See Models above and tag the person corresponding to the modality of the example. - TensorFlow: @Rocketknight1 -->
{ "login": "Cyrilvallez", "id": 71554963, "node_id": "MDQ6VXNlcjcxNTU0OTYz", "avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cyrilvallez", "html_url": "https://github.com/Cyrilvallez", "followers_url": "https://api.github.com/users/Cyrilvallez/followers", "following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}", "gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}", "starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions", "organizations_url": "https://api.github.com/users/Cyrilvallez/orgs", "repos_url": "https://api.github.com/users/Cyrilvallez/repos", "events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}", "received_events_url": "https://api.github.com/users/Cyrilvallez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39340/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 2, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39340/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39339
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39339/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39339/comments
https://api.github.com/repos/huggingface/transformers/issues/39339/events
https://github.com/huggingface/transformers/pull/39339
3,219,348,555
PR_kwDOCUB6oc6eTo8M
39,339
Refactor embedding input/output getter/setter
{ "login": "molbap", "id": 39954772, "node_id": "MDQ6VXNlcjM5OTU0Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/molbap", "html_url": "https://github.com/molbap", "followers_url": "https://api.github.com/users/molbap/followers", "following_url": "https://api.github.com/users/molbap/following{/other_user}", "gists_url": "https://api.github.com/users/molbap/gists{/gist_id}", "starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/molbap/subscriptions", "organizations_url": "https://api.github.com/users/molbap/orgs", "repos_url": "https://api.github.com/users/molbap/repos", "events_url": "https://api.github.com/users/molbap/events{/privacy}", "received_events_url": "https://api.github.com/users/molbap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 1834056761, "node_id": "MDU6TGFiZWwxODM0MDU2NzYx", "url": "https://api.github.com/repos/huggingface/transformers/labels/Core:%20Modeling", "name": "Core: Modeling", "color": "FF8446", "default": false, "description": "Internals of the library; Models." } ]
closed
false
null
[]
null
[]
2025-07-10T13:19:22
2025-07-21T16:18:16
2025-07-21T16:18:14
CONTRIBUTOR
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39339", "html_url": "https://github.com/huggingface/transformers/pull/39339", "diff_url": "https://github.com/huggingface/transformers/pull/39339.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39339.patch", "merged_at": "2025-07-21T16:18:14" }
# What does this PR do? **TL;DR** `PreTrainedModel` now mixes in `EmbeddingAccessMixin`, providing default `get_input_embeddings` / `set_input_embeddings` / `get_output_embeddings` / `set_output_embeddings` for all models. These methods should be removed from the codebase unless exceptionally weird. ## Details Uses a new attribute `__input_embed_layer = "embed_tokens"` by default; change it to set which layer the embeddings are read from and written to. Then, assuming `embed_tokens` is that layer, the resolution order is - self.model.embed_tokens - self.embed_tokens - delegate once via base_model_prefix - else raise (model must override) `get_output_embeddings` now auto-returns `lm_head` only if the input embeddings resolve (so pure audio/vision backbones still return None). What you usually have to do: **nothing**. Override only if: - embeddings live under a different attribute - there are multiple embedding tables - you need to hide an lm_head (override get_output_embeddings returning None) - a helper head has no token embeddings but still wants tying (override and return the head) Potential breakages: - Exotic layouts with custom embedding attr names (must override) cc @vasqu @zucchini-nlp, minor but to remember for composite models too
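The resolution order described above can be sketched as follows. This is a minimal illustrative sketch, not the actual transformers implementation: the class layout, the `TinyModel` example, and the string stand-in for an `nn.Embedding` are all hypothetical.

```python
class EmbeddingAccessMixin:
    # Default attribute name the mixin looks for; assumption for illustration.
    _input_embed_layer = "embed_tokens"
    base_model_prefix = "model"

    def get_input_embeddings(self):
        name = self._input_embed_layer
        # 1) look under self.model
        model = getattr(self, "model", None)
        if model is not None and hasattr(model, name):
            return getattr(model, name)
        # 2) look directly on self
        if hasattr(self, name):
            return getattr(self, name)
        # 3) delegate once via base_model_prefix
        base = getattr(self, self.base_model_prefix, None)
        if base is not None and base is not self and hasattr(base, name):
            return getattr(base, name)
        # 4) otherwise the model must override
        raise NotImplementedError("override get_input_embeddings")


class TinyModel(EmbeddingAccessMixin):
    def __init__(self):
        # Stand-in for an nn.Embedding module.
        self.embed_tokens = "embedding-layer"


print(TinyModel().get_input_embeddings())  # -> embedding-layer
```

A model whose embeddings live under a different attribute would just override `get_input_embeddings`, matching the "override only if" cases listed above.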
{ "login": "molbap", "id": 39954772, "node_id": "MDQ6VXNlcjM5OTU0Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/molbap", "html_url": "https://github.com/molbap", "followers_url": "https://api.github.com/users/molbap/followers", "following_url": "https://api.github.com/users/molbap/following{/other_user}", "gists_url": "https://api.github.com/users/molbap/gists{/gist_id}", "starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/molbap/subscriptions", "organizations_url": "https://api.github.com/users/molbap/orgs", "repos_url": "https://api.github.com/users/molbap/repos", "events_url": "https://api.github.com/users/molbap/events{/privacy}", "received_events_url": "https://api.github.com/users/molbap/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39339/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 2, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39339/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39338
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39338/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39338/comments
https://api.github.com/repos/huggingface/transformers/issues/39338/events
https://github.com/huggingface/transformers/pull/39338
3,219,134,571
PR_kwDOCUB6oc6eS6CQ
39,338
Responses API (to be merged into #39155)
{ "login": "gante", "id": 12240844, "node_id": "MDQ6VXNlcjEyMjQwODQ0", "avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gante", "html_url": "https://github.com/gante", "followers_url": "https://api.github.com/users/gante/followers", "following_url": "https://api.github.com/users/gante/following{/other_user}", "gists_url": "https://api.github.com/users/gante/gists{/gist_id}", "starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gante/subscriptions", "organizations_url": "https://api.github.com/users/gante/orgs", "repos_url": "https://api.github.com/users/gante/repos", "events_url": "https://api.github.com/users/gante/events{/privacy}", "received_events_url": "https://api.github.com/users/gante/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
[]
2025-07-10T12:19:45
2025-07-16T07:26:06
2025-07-16T07:26:06
MEMBER
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/39338", "html_url": "https://github.com/huggingface/transformers/pull/39338", "diff_url": "https://github.com/huggingface/transformers/pull/39338.diff", "patch_url": "https://github.com/huggingface/transformers/pull/39338.patch", "merged_at": "2025-07-16T07:26:06" }
# What does this PR do? To be merged into #39155 ### Testing Works with the [OpenAI responses starter app](https://github.com/openai/openai-responses-starter-app/): 1. launch server as `transformers serve --force-model HuggingFaceTB/SmolLM3-3B` 2. launch app as `OPENAI_API_KEY="<KEY>" OPENAI_BASE_URL=http://localhost:8000/v1 npm run dev` Test script: ```py # Launch `transformers serve` in another terminal first from openai import OpenAI client = OpenAI(base_url="http://localhost:8000/v1", api_key="<KEY>") response = client.responses.create( model="Qwen/Qwen2.5-0.5B-Instruct", instructions="You are a helpful assistant.", input="Hello!", stream=True, metadata={"foo": "bar"}, ) for event in response: print(event) ```
{ "login": "LysandreJik", "id": 30755778, "node_id": "MDQ6VXNlcjMwNzU1Nzc4", "avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4", "gravatar_id": "", "url": "https://api.github.com/users/LysandreJik", "html_url": "https://github.com/LysandreJik", "followers_url": "https://api.github.com/users/LysandreJik/followers", "following_url": "https://api.github.com/users/LysandreJik/following{/other_user}", "gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}", "starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions", "organizations_url": "https://api.github.com/users/LysandreJik/orgs", "repos_url": "https://api.github.com/users/LysandreJik/repos", "events_url": "https://api.github.com/users/LysandreJik/events{/privacy}", "received_events_url": "https://api.github.com/users/LysandreJik/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39338/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39338/timeline
null
null
null
null
true
true
https://api.github.com/repos/huggingface/transformers/issues/39337
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/39337/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/39337/comments
https://api.github.com/repos/huggingface/transformers/issues/39337/events
https://github.com/huggingface/transformers/issues/39337
3,219,106,744
I_kwDOCUB6oc6_36u4
39,337
Latest Transformers release causes CUDA out-of-memory errors during VisionLLM fine-tuning
{ "login": "ruiite", "id": 36436093, "node_id": "MDQ6VXNlcjM2NDM2MDkz", "avatar_url": "https://avatars.githubusercontent.com/u/36436093?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ruiite", "html_url": "https://github.com/ruiite", "followers_url": "https://api.github.com/users/ruiite/followers", "following_url": "https://api.github.com/users/ruiite/following{/other_user}", "gists_url": "https://api.github.com/users/ruiite/gists{/gist_id}", "starred_url": "https://api.github.com/users/ruiite/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ruiite/subscriptions", "organizations_url": "https://api.github.com/users/ruiite/orgs", "repos_url": "https://api.github.com/users/ruiite/repos", "events_url": "https://api.github.com/users/ruiite/events{/privacy}", "received_events_url": "https://api.github.com/users/ruiite/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 3817266200, "node_id": "MDU6TGFiZWwzODE3MjY2MjAw", "url": "https://api.github.com/repos/huggingface/transformers/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": null } ]
closed
false
null
[]
null
[]
2025-07-10T12:11:44
2025-07-14T17:47:36
2025-07-14T17:47:36
NONE
null
null
null
null
### System Info With transformers == 4.53.0 Script: https://github.com/huggingface/smollm/blob/main/vision/finetuning/SmolVLM2_Video_FT.ipynb This script for fine-tuning SmolVLM2 on video understanding causes a CUDA out-of-memory error, while the same script on the same hardware works perfectly with transformers == 4.51.3. The vision model was frozen during training. Not using LoRA or QLoRA. ### Who can help? _No response_ ### Information - [x] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduction Steps to reproduce: 1. Run the script with transformers == 4.51.3 on a GPU with 24 GB of memory, with LORA = False, QLORA = False, SMOL = True. 2. Run the same script with transformers == 4.53.0. ### Expected behavior With transformers 4.53.0 you get CUDA out of memory; with 4.51.3 you do not.
{ "login": "ruiite", "id": 36436093, "node_id": "MDQ6VXNlcjM2NDM2MDkz", "avatar_url": "https://avatars.githubusercontent.com/u/36436093?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ruiite", "html_url": "https://github.com/ruiite", "followers_url": "https://api.github.com/users/ruiite/followers", "following_url": "https://api.github.com/users/ruiite/following{/other_user}", "gists_url": "https://api.github.com/users/ruiite/gists{/gist_id}", "starred_url": "https://api.github.com/users/ruiite/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ruiite/subscriptions", "organizations_url": "https://api.github.com/users/ruiite/orgs", "repos_url": "https://api.github.com/users/ruiite/repos", "events_url": "https://api.github.com/users/ruiite/events{/privacy}", "received_events_url": "https://api.github.com/users/ruiite/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/39337/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/39337/timeline
null
completed
{ "total": 0, "completed": 0, "percent_completed": 0 }
{ "blocked_by": 0, "total_blocked_by": 0, "blocking": 0, "total_blocking": 0 }
false
true