| url (string) | repository_url (string) | labels_url (string) | comments_url (string) | events_url (string) | html_url (string) | id (int64) | node_id (string) | number (int64) | title (string) | user (dict) | labels (list) | state (string) | locked (bool) | assignee (dict) | assignees (list) | milestone (null) | comments (list) | created_at (timestamp[ms]) | updated_at (timestamp[ms]) | closed_at (timestamp[ms]) | author_association (string) | type (dict) | active_lock_reason (null) | draft (bool) | pull_request (dict) | body (string) | closed_by (dict) | reactions (dict) | timeline_url (string) | performed_via_github_app (null) | state_reason (string) | sub_issues_summary (dict) | issue_dependencies_summary (dict) | is_pull_request (bool) | is_closed (bool) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/huggingface/transformers/issues/37111
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37111/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37111/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37111/events
|
https://github.com/huggingface/transformers/pull/37111
| 2,958,673,707
|
PR_kwDOCUB6oc6QrxX1
| 37,111
|
Add Fast Image Processor for MobileNetV1
|
{
"login": "dmdaksh",
"id": 27098678,
"node_id": "MDQ6VXNlcjI3MDk4Njc4",
"avatar_url": "https://avatars.githubusercontent.com/u/27098678?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dmdaksh",
"html_url": "https://github.com/dmdaksh",
"followers_url": "https://api.github.com/users/dmdaksh/followers",
"following_url": "https://api.github.com/users/dmdaksh/following{/other_user}",
"gists_url": "https://api.github.com/users/dmdaksh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dmdaksh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dmdaksh/subscriptions",
"organizations_url": "https://api.github.com/users/dmdaksh/orgs",
"repos_url": "https://api.github.com/users/dmdaksh/repos",
"events_url": "https://api.github.com/users/dmdaksh/events{/privacy}",
"received_events_url": "https://api.github.com/users/dmdaksh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-30T05:24:21
| 2025-04-23T19:55:41
| 2025-04-23T19:55:41
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37111",
"html_url": "https://github.com/huggingface/transformers/pull/37111",
"diff_url": "https://github.com/huggingface/transformers/pull/37111.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37111.patch",
"merged_at": "2025-04-23T19:55:41"
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Related to #36978
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37111/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37111/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37110
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37110/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37110/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37110/events
|
https://github.com/huggingface/transformers/issues/37110
| 2,958,655,770
|
I_kwDOCUB6oc6wWYEa
| 37,110
|
The loss and gradient explosion caused by the trainer
|
{
"login": "rangehow",
"id": 88258534,
"node_id": "MDQ6VXNlcjg4MjU4NTM0",
"avatar_url": "https://avatars.githubusercontent.com/u/88258534?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rangehow",
"html_url": "https://github.com/rangehow",
"followers_url": "https://api.github.com/users/rangehow/followers",
"following_url": "https://api.github.com/users/rangehow/following{/other_user}",
"gists_url": "https://api.github.com/users/rangehow/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rangehow/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rangehow/subscriptions",
"organizations_url": "https://api.github.com/users/rangehow/orgs",
"repos_url": "https://api.github.com/users/rangehow/repos",
"events_url": "https://api.github.com/users/rangehow/events{/privacy}",
"received_events_url": "https://api.github.com/users/rangehow/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-30T04:26:50
| 2025-04-01T09:54:45
| 2025-03-31T13:15:14
|
CONTRIBUTOR
| null | null | null | null |
- `transformers` version: 4.49.0
- Platform: Linux-4.18.0-147.mt20200626.413.el8_1.x86_64-x86_64-with-glibc2.17
- Python version: 3.12.9
- Huggingface_hub version: 0.26.3
- Safetensors version: 0.4.5
- Accelerate version: 1.4.0
- Accelerate config: not found
- DeepSpeed version: 0.15.4
- PyTorch version (GPU?): 2.5.1+cu121 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: yes
- Using GPU in script?: yes
- GPU type: NVIDIA Graphics Device
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
To observe the loss for each device in DDP, I overrode the trainer's compute_loss as follows:
```python
from typing import Any, Dict, Tuple, Union

import torch
from torch.utils.data import SequentialSampler
from transformers import Trainer
from transformers.loss.loss_utils import ForCausalLMLoss


class NTPTrainer(Trainer):
    def compute_loss(
        self,
        model: torch.nn.Module,
        inputs: Dict[str, Union[torch.Tensor, Any]],
        return_outputs: bool = False,
        num_items_in_batch=None,
    ) -> Union[torch.Tensor, Tuple[torch.Tensor, Dict]]:
        labels = inputs.pop("labels", None)
        outputs = model(**inputs)
        logits = outputs.logits
        loss_kwargs = {
            "logits": logits,
            "labels": labels,
            "vocab_size": self.model.config.vocab_size,
        }
        loss = ForCausalLMLoss(**loss_kwargs)
        print(loss)
        return (loss, outputs) if return_outputs else loss

    def _get_train_sampler(self):
        return SequentialSampler(self.train_dataset)
```
I compared the loss from my overridden trainer (left) against the default trainer (right). With my override, the loss and gradients are both dozens of times larger, even though my implementation matches the loss calculation in the official modeling file.
<img width="822" alt="Image" src="https://github.com/user-attachments/assets/d0426809-0b24-4c11-89c7-0b6c9117a203" />
I am using 8 GPUs with gradient accumulation set to 16. This prints the per-device loss output by the trainer.
<img width="449" alt="Image" src="https://github.com/user-attachments/assets/f666a056-3a02-493d-aa6d-9807a38bb7c5" />
The loss on each device is in the single digits, so I suspect that gradient accumulation or DDP is not averaging correctly.
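One plausible source of the mismatch, sketched below with made-up numbers (this is plain arithmetic, not the actual Trainer code): when `compute_loss` accepts `num_items_in_batch` but ignores it, each microbatch contributes its own per-token *mean* and nothing divides by the accumulation steps, so the accumulated loss is inflated by roughly the accumulation factor (16 in the setup above):

```python
# Illustrative arithmetic only; token counts and loss values are invented.
accum_steps = 16
tokens = 512           # tokens per microbatch
per_token_loss = 2.0   # pretend every token contributes the same loss
num_items_in_batch = accum_steps * tokens

# Default path: each microbatch returns sum(per-token losses) / num_items_in_batch,
# so the accumulated total is total_sum / num_items_in_batch.
default_accumulated = sum(
    (tokens * per_token_loss) / num_items_in_batch for _ in range(accum_steps)
)

# Override above: each microbatch returns its own mean, and backward passes
# accumulate those means without ever dividing by accum_steps.
override_accumulated = sum(per_token_loss for _ in range(accum_steps))

print(default_accumulated, override_accumulated)  # 2.0 32.0 -> 16x larger
```

The ratio between the two equals the number of accumulation steps, which is consistent with the "dozens of times" difference reported above.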
### Expected behavior
The loss should match that of the standard trainer.
|
{
"login": "rangehow",
"id": 88258534,
"node_id": "MDQ6VXNlcjg4MjU4NTM0",
"avatar_url": "https://avatars.githubusercontent.com/u/88258534?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rangehow",
"html_url": "https://github.com/rangehow",
"followers_url": "https://api.github.com/users/rangehow/followers",
"following_url": "https://api.github.com/users/rangehow/following{/other_user}",
"gists_url": "https://api.github.com/users/rangehow/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rangehow/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rangehow/subscriptions",
"organizations_url": "https://api.github.com/users/rangehow/orgs",
"repos_url": "https://api.github.com/users/rangehow/repos",
"events_url": "https://api.github.com/users/rangehow/events{/privacy}",
"received_events_url": "https://api.github.com/users/rangehow/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37110/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37110/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37109
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37109/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37109/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37109/events
|
https://github.com/huggingface/transformers/pull/37109
| 2,958,651,638
|
PR_kwDOCUB6oc6QrtHO
| 37,109
|
Fix Gemma3 embedding scaling
|
{
"login": "gau-nernst",
"id": 26946864,
"node_id": "MDQ6VXNlcjI2OTQ2ODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/26946864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gau-nernst",
"html_url": "https://github.com/gau-nernst",
"followers_url": "https://api.github.com/users/gau-nernst/followers",
"following_url": "https://api.github.com/users/gau-nernst/following{/other_user}",
"gists_url": "https://api.github.com/users/gau-nernst/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gau-nernst/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gau-nernst/subscriptions",
"organizations_url": "https://api.github.com/users/gau-nernst/orgs",
"repos_url": "https://api.github.com/users/gau-nernst/repos",
"events_url": "https://api.github.com/users/gau-nernst/events{/privacy}",
"received_events_url": "https://api.github.com/users/gau-nernst/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-30T04:13:53
| 2025-03-31T09:04:42
| 2025-03-31T09:04:02
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37109",
"html_url": "https://github.com/huggingface/transformers/pull/37109",
"diff_url": "https://github.com/huggingface/transformers/pull/37109.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37109.patch",
"merged_at": "2025-03-31T09:04:02"
}
|
# What does this PR do?
- Correctly apply embedding scaling per the [official Gemma 3 impl](https://github.com/google-deepmind/gemma/blob/ebfa2c2bddcab1d7cc38b1dca0fe6d98e6e7df71/gemma/modules.py#L93-L96) by casting the scale to BF16. The same fix was applied to Gemma 2 in #29402.
- Use a non-persistent buffer instead of a plain Python float to make the scale more torch.compile-friendly (I have encountered cases where this is necessary, though it would be too verbose to elaborate here)
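A minimal sketch of the second point (a hypothetical module for illustration, not the actual Gemma3 code): the scale is pre-cast to the embedding dtype and stored via `register_buffer(..., persistent=False)`, so `torch.compile` traces a real tensor rather than a Python scalar, and the value stays out of checkpoints:

```python
import torch
from torch import nn


class ScaledEmbedding(nn.Module):
    """Illustrative scaled embedding; names and shapes are examples only."""

    def __init__(self, vocab_size: int, hidden_size: int, dtype=torch.bfloat16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size, dtype=dtype)
        # persistent=False keeps the buffer out of state_dict(), so no
        # checkpoint churn; the cast to `dtype` happens once, up front.
        self.register_buffer(
            "embed_scale",
            torch.tensor(hidden_size**0.5, dtype=dtype),
            persistent=False,
        )

    def forward(self, input_ids: torch.LongTensor) -> torch.Tensor:
        return self.embed(input_ids) * self.embed_scale


m = ScaledEmbedding(vocab_size=32, hidden_size=16)
out = m(torch.tensor([[1, 2, 3]]))
```

Because the buffer is non-persistent, existing checkpoints load unchanged, while compiled graphs see a tensor constant instead of a traced Python float.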
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@ArthurZucker @RyanMullins
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37109/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37109/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37108
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37108/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37108/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37108/events
|
https://github.com/huggingface/transformers/pull/37108
| 2,958,457,694
|
PR_kwDOCUB6oc6QrHit
| 37,108
|
Add Fast Grounding-Dino Processor
|
{
"login": "keetrap",
"id": 103131112,
"node_id": "U_kgDOBiWn6A",
"avatar_url": "https://avatars.githubusercontent.com/u/103131112?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/keetrap",
"html_url": "https://github.com/keetrap",
"followers_url": "https://api.github.com/users/keetrap/followers",
"following_url": "https://api.github.com/users/keetrap/following{/other_user}",
"gists_url": "https://api.github.com/users/keetrap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/keetrap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/keetrap/subscriptions",
"organizations_url": "https://api.github.com/users/keetrap/orgs",
"repos_url": "https://api.github.com/users/keetrap/repos",
"events_url": "https://api.github.com/users/keetrap/events{/privacy}",
"received_events_url": "https://api.github.com/users/keetrap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-29T19:40:09
| 2025-04-16T10:26:08
| 2025-04-16T10:26:08
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37108",
"html_url": "https://github.com/huggingface/transformers/pull/37108",
"diff_url": "https://github.com/huggingface/transformers/pull/37108.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37108.patch",
"merged_at": "2025-04-16T10:26:08"
}
|
Related #36978
cc @yonigozlan
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37108/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37108/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37107
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37107/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37107/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37107/events
|
https://github.com/huggingface/transformers/pull/37107
| 2,958,211,992
|
PR_kwDOCUB6oc6QqTMi
| 37,107
|
fix: Add 'image-text-to-text' to `TASK_MAPPING`
|
{
"login": "saattrupdan",
"id": 47701536,
"node_id": "MDQ6VXNlcjQ3NzAxNTM2",
"avatar_url": "https://avatars.githubusercontent.com/u/47701536?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/saattrupdan",
"html_url": "https://github.com/saattrupdan",
"followers_url": "https://api.github.com/users/saattrupdan/followers",
"following_url": "https://api.github.com/users/saattrupdan/following{/other_user}",
"gists_url": "https://api.github.com/users/saattrupdan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/saattrupdan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saattrupdan/subscriptions",
"organizations_url": "https://api.github.com/users/saattrupdan/orgs",
"repos_url": "https://api.github.com/users/saattrupdan/repos",
"events_url": "https://api.github.com/users/saattrupdan/events{/privacy}",
"received_events_url": "https://api.github.com/users/saattrupdan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-29T16:26:40
| 2025-04-02T13:57:45
| 2025-04-02T12:51:03
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37107",
"html_url": "https://github.com/huggingface/transformers/pull/37107",
"diff_url": "https://github.com/huggingface/transformers/pull/37107.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37107.patch",
"merged_at": "2025-04-02T12:51:03"
}
|
# What does this PR do?
The `TASK_MAPPING` dictionary, which maps pipeline tags to the list of architectures supporting each task, is missing the new [image-text-to-text pipeline tag](https://huggingface.co/models?pipeline_tag=image-text-to-text). This PR adds the missing entry.
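Schematically, the fix is a one-entry addition to the mapping. The dictionary below is an illustrative stand-in, not the library's actual `TASK_MAPPING` contents:

```python
# Stand-in for transformers' TASK_MAPPING; keys and values here are examples.
TASK_MAPPING = {
    "text-generation": ["AutoModelForCausalLM"],
    "image-classification": ["AutoModelForImageClassification"],
}

# The missing pipeline tag added by the PR (architecture name illustrative):
TASK_MAPPING["image-text-to-text"] = ["AutoModelForImageTextToText"]
```

With the entry present, tooling that resolves a pipeline tag to its supported architectures no longer falls through for image-text-to-text models.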
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@ArthurZucker @amyeroberts @qubvel
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37107/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37107/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37106
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37106/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37106/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37106/events
|
https://github.com/huggingface/transformers/pull/37106
| 2,958,056,506
|
PR_kwDOCUB6oc6QqFSd
| 37,106
|
Add Sdpa Support: `[Electra]`
|
{
"login": "nnilayy",
"id": 114939419,
"node_id": "U_kgDOBtnWGw",
"avatar_url": "https://avatars.githubusercontent.com/u/114939419?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nnilayy",
"html_url": "https://github.com/nnilayy",
"followers_url": "https://api.github.com/users/nnilayy/followers",
"following_url": "https://api.github.com/users/nnilayy/following{/other_user}",
"gists_url": "https://api.github.com/users/nnilayy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nnilayy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nnilayy/subscriptions",
"organizations_url": "https://api.github.com/users/nnilayy/orgs",
"repos_url": "https://api.github.com/users/nnilayy/repos",
"events_url": "https://api.github.com/users/nnilayy/events{/privacy}",
"received_events_url": "https://api.github.com/users/nnilayy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-03-29T14:37:26
| 2025-03-31T10:32:38
| null |
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37106",
"html_url": "https://github.com/huggingface/transformers/pull/37106",
"diff_url": "https://github.com/huggingface/transformers/pull/37106.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37106.patch",
"merged_at": null
}
|
# What does this PR do?
Towards #28005 and #37105
Adds SDPA (Scaled Dot-Product Attention) support for Google's Electra 🤗.
Sample benchmarks comparing `sdpa` and `eager` attention for the `google/electra-base-generator` and `google/electra-base-discriminator` models, under both training and inference, are provided below.
Benchmarking scripts (adapted from @fxmarty's SDPA scripts) covering training, inference, and running the Electra model on MaskedLM are also linked below.
Electra SDPA benchmarking scripts:
- [`Electra`: Benchmark Inference Script](https://gist.github.com/nnilayy/84e037104ad2e6697008fcd8a28ce316)
- [`Electra`: Benchmark Training Script](https://gist.github.com/nnilayy/998b237a45cb3cca8acc27b47049b899)
- [`Electra`: Execution Commands](https://gist.github.com/nnilayy/db493e3464c99f485990dd45e22f8a34)
Reference SDPA benchmarking scripts by @fxmarty:
- [Benchmark Inference Script](https://gist.github.com/fxmarty/5113e4304fbdd38c9c3702ce44683f6a)
- [Benchmark Training Script](https://gist.github.com/fxmarty/7e75cc3942d6974e4849093ebea0a331)
**PS:** `Memory Savings %`'s remained consistent across the respective runs, but `Speed %`'s varied. To ensure reliability, each benchmark (for both training and inference on both models) was run **five times**, and the reported results are the **mean across all runs**. The full set of **individual run results** is linked in [this Benchmarking Runs gist](https://gist.github.com/nnilayy/1decc65a96f91d9ad9b6971801a25e81).
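For context, here is a generic sketch (not the Electra code in this PR) of what the eager-to-SDPA swap does: the manual `softmax(QK^T / sqrt(d)) V` path materializes the full score matrix, while `F.scaled_dot_product_attention` computes the same result through a fused kernel, which is where the speed and memory gains in the tables below come from:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
# (batch, num_heads, seq_len, head_dim) -- shapes are illustrative.
q, k, v = (torch.randn(2, 4, 8, 16) for _ in range(3))

# Eager attention: explicit score matrix, softmax, then weighted sum.
scores = q @ k.transpose(-2, -1) / (q.shape[-1] ** 0.5)
eager_out = torch.softmax(scores, dim=-1) @ v

# SDPA: the same computation via the fused kernel.
sdpa_out = F.scaled_dot_product_attention(q, k, v)
```

The two outputs agree up to floating-point tolerance; in a model, the benefit is that the fused path avoids materializing the `seq_len x seq_len` score tensor in eager mode.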
### Benchmarks For `google/electra-base-generator`
#### _Training Benchmark_
| num_training_steps | batch_size | seq_len | is_cuda | is_half | Time per batch (eager - s) | Time per batch (sdpa - s) | Speedup (%) | Eager peak mem (MB) | sdpa peak mem (MB) | Mem saving (%) |
|--------------------|------------|---------|---------|---------|----------------------------|---------------------------|-------------|---------------------|--------------------|----------------|
| 100 | 2 | 32 | True | True | 0.0314 | 0.024 | 30.2028 | 249.684 | 249.684 | 0.0 |
| 100 | 2 | 64 | True | True | 0.031 | 0.0244 | 27.0706 | 253.593 | 253.593 | 0.0 |
| 100 | 2 | 128 | True | True | 0.0308 | 0.0238 | 30.0252 | 261.409 | 261.409 | 0.0 |
| 100 | 2 | 256 | True | True | 0.03 | 0.024 | 26.7504 | 299.593 | 277.243 | 8.061 |
| 100 | 4 | 32 | True | True | 0.0308 | 0.024 | 28.1694 | 253.593 | 253.593 | 0.0 |
| 100 | 4 | 64 | True | True | 0.0306 | 0.0234 | 30.8344 | 261.409 | 261.409 | 0.0 |
| 100 | 4 | 128 | True | True | 0.0302 | 0.0236 | 27.4524 | 283.864 | 277.243 | 2.388 |
| 100 | 4 | 256 | True | True | 0.031 | 0.0252 | 22.8936 | 517.463 | 454.745 | 13.792 |
| 100 | 8 | 32 | True | True | 0.0304 | 0.0232 | 31.0826 | 261.409 | 261.409 | 0.0 |
| 100 | 8 | 64 | True | True | 0.03 | 0.0234 | 27.6406 | 277.243 | 277.243 | 0.0 |
| 100 | 8 | 128 | True | True | 0.031 | 0.0246 | 25.3224 | 486.006 | 454.745 | 6.874 |
| 100 | 8 | 256 | True | True | 0.04 | 0.0394 | 2.3676 | 947.451 | 821.915 | 15.274 |
| 100 | 16 | 32 | True | True | 0.03 | 0.0238 | 26.6688 | 277.446 | 277.446 | 0.0 |
| 100 | 16 | 64 | True | True | 0.0312 | 0.0242 | 28.5026 | 470.277 | 454.745 | 3.416 |
| 100 | 16 | 128 | True | True | 0.0368 | 0.0378 | -2.7766 | 884.536 | 821.915 | 7.619 |
| 100 | 16 | 256 | True | True | 0.0774 | 0.0712 | 8.6964 | 1804.67 | 1553.798 | 16.146 |
#### _Inference Benchmark_
| num_batches | batch_size | seq_len | is_cuda | is_half | use_mask | Per token latency (eager - ms) | Per token latency (sdpa - ms) | Speedup (%) | Mem eager (MB) | Mem sdpa (MB) | Mem saved (%) |
|-------------|------------|---------|---------|---------|----------|-------------------------------|------------------------------|-------------|----------------|---------------|---------------|
| 50 | 2 | 32 | True | True | True | 0.1566 | 0.1176 | 33.217 | 85.004 | 85.004 | 0.0 |
| 50 | 2 | 64 | True | True | True | 0.0778 | 0.0584 | 33.349 | 92.949 | 92.949 | 0.0 |
| 50 | 2 | 128 | True | True | True | 0.0424 | 0.0288 | 46.1208 | 108.84 | 109.043 | -0.186 |
| 50 | 2 | 256 | True | True | True | 0.021 | 0.0164 | 27.676 | 140.825 | 140.825 | 0.0 |
| 50 | 4 | 32 | True | True | True | 0.0764 | 0.0604 | 26.8224 | 92.949 | 92.949 | 0.0 |
| 50 | 4 | 64 | True | True | True | 0.0386 | 0.0294 | 30.849 | 109.043 | 109.043 | 0.0 |
| 50 | 4 | 128 | True | True | True | 0.0198 | 0.0158 | 25.8548 | 140.825 | 140.825 | 0.0 |
| 50 | 4 | 256 | True | True | True | 0.0112 | 0.009 | 25.464 | 204.997 | 204.997 | 0.0 |
| 50 | 8 | 32 | True | True | True | 0.0372 | 0.0282 | 31.764 | 109.043 | 109.043 | 0.0 |
| 50 | 8 | 64 | True | True | True | 0.0194 | 0.0156 | 24.4376 | 140.825 | 140.825 | 0.0 |
| 50 | 8 | 128 | True | True | True | 0.0108 | 0.0092 | 21.5298 | 204.997 | 204.997 | 0.0 |
| 50 | 8 | 256 | True | True | True | 0.007 | 0.006 | 18.5566 | 332.935 | 332.935 | 0.0 |
| 50 | 16 | 32 | True | True | True | 0.02 | 0.0152 | 32.5968 | 140.825 | 140.825 | 0.0 |
| 50 | 16 | 64 | True | True | True | 0.0112 | 0.0092 | 23.3076 | 204.997 | 204.997 | 0.0 |
| 50 | 16 | 128 | True | True | True | 0.007 | 0.006 | 17.745 | 332.935 | 332.935 | 0.0 |
| 50 | 16 | 256 | True | True | True | 0.006 | 0.005 | 21.8322 | 585.568 | 585.568 | 0.0 |
### Benchmarks For `google/electra-base-discriminator`
#### _Training Benchmark_
| num_training_steps | batch_size | seq_len | is_cuda | is_half | Time per batch (eager - s) | Time per batch (sdpa - s) | Speedup (%) | Eager peak mem (MB) | sdpa peak mem (MB) | Mem saving (%) |
|--------------------|------------|---------|---------|---------|----------------------------|---------------------------|-------------|---------------------|--------------------|----------------|
| 100 | 2 | 32 | True | True | 0.0312 | 0.0242 | 29.8518 | 568.647 | 566.126 | 0.445 |
| 100 | 2 | 64 | True | True | 0.0308 | 0.024 | 29.3112 | 572.187 | 570.846 | 0.235 |
| 100 | 2 | 128 | True | True | 0.0312 | 0.0248 | 25.9492 | 588.412 | 587.232 | 0.201 |
| 100 | 2 | 256 | True | True | 0.031 | 0.0298 | 5.5548 | 631.001 | 602.644 | 4.705 |
| 100 | 4 | 32 | True | True | 0.0312 | 0.0238 | 30.2574 | 572.187 | 570.846 | 0.235 |
| 100 | 4 | 64 | True | True | 0.0312 | 0.0242 | 28.247 | 588.412 | 587.232 | 0.201 |
| 100 | 4 | 128 | True | True | 0.0314 | 0.0286 | 10.2442 | 604.379 | 602.644 | 0.288 |
| 100 | 4 | 256 | True | True | 0.053 | 0.0498 | 6.4654 | 1022.623 | 831.197 | 23.0304 |
| 100 | 8 | 32 | True | True | 0.031 | 0.0242 | 28.8872 | 587.263 | 586.813 | 0.077 |
| 100 | 8 | 64 | True | True | 0.0308 | 0.0278 | 10.7408 | 602.151 | 601.596 | 0.0924 |
| 100 | 8 | 128 | True | True | 0.0492 | 0.0474 | 3.8086 | 927.308 | 831.197 | 11.5626 |
| 100 | 8 | 256 | True | True | 0.105 | 0.0976 | 7.4594 | 1798.541 | 1418.122 | 26.8258 |
| 100 | 16 | 32 | True | True | 0.0314 | 0.0286 | 10.8604 | 598.278 | 595.934 | 0.3934 |
| 100 | 16 | 64 | True | True | 0.048 | 0.0464 | 3.7276 | 877.185 | 828.890 | 5.8264 |
| 100 | 16 | 128 | True | True | 0.0986 | 0.0944 | 4.3308 | 1605.603 | 1415.605 | 13.4216 |
| 100 | 16 | 256 | True | True | 0.2188 | 0.2036 | 7.4636 | 3318.753 | 2564.827 | 29.395 |
#### _Inference Benchmark_
| num_batches | batch_size | seq_len | is_cuda | is_half | use_mask | Per token latency (eager - ms) | Per token latency (sdpa - ms) | Speedup (%) | Mem eager (MB) | Mem sdpa (MB) | Mem saved (%) |
|-------------|------------|---------|---------|---------|----------|-------------------------------|------------------------------|-------------|----------------|---------------|---------------|
| 50 | 2 | 32 | True | True | True | 0.1586 | 0.115 | 37.9428 | 244.13 | 243.868 | 0.107 |
| 50 | 2 | 64 | True | True | True | 0.0782 | 0.0568 | 37.5862 | 252.141 | 251.878 | 0.104 |
| 50 | 2 | 128 | True | True | True | 0.0424 | 0.03 | 40.611 | 268.163 | 268.103 | 0.022 |
| 50 | 2 | 256 | True | True | True | 0.0204 | 0.0166 | 24.129 | 300.409 | 300.147 | 0.087 |
| 50 | 4 | 32 | True | True | True | 0.0738 | 0.061 | 22.644 | 252.141 | 251.878 | 0.104 |
| 50 | 4 | 64 | True | True | True | 0.0382 | 0.0296 | 28.352 | 268.365 | 268.103 | 0.098 |
| 50 | 4 | 128 | True | True | True | 0.0206 | 0.016 | 27.1508 | 300.409 | 300.147 | 0.087 |
| 50 | 4 | 256 | True | True | True | 0.017 | 0.014 | 22.3272 | 365.106 | 364.844 | 0.072 |
| 50 | 8 | 32 | True | True | True | 0.0378 | 0.0292 | 29.0196 | 268.365 | 268.103 | 0.098 |
| 50 | 8 | 64 | True | True | True | 0.0204 | 0.016 | 27.1664 | 300.409 | 300.147 | 0.087 |
| 50 | 8 | 128 | True | True | True | 0.0158 | 0.0138 | 14.3932 | 365.106 | 364.844 | 0.072 |
| 50 | 8 | 256 | True | True | True | 0.017 | 0.014 | 21.5304 | 494.093 | 493.962 | 0.027 |
| 50 | 16 | 32 | True | True | True | 0.02 | 0.0164 | 20.9608 | 300.409 | 300.147 | 0.087 |
| 50 | 16 | 64 | True | True | True | 0.015 | 0.0138 | 10.5592 | 365.106 | 364.844 | 0.072 |
| 50 | 16 | 128 | True | True | True | 0.0156 | 0.0138 | 14.2278 | 494.093 | 493.962 | 0.027 |
| 50 | 16 | 256 | True | True | True | 0.017 | 0.0142 | 18.969 | 748.823 | 748.561 | 0.035 |
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@fxmarty @ArthurZucker @amyeroberts @LysandreJik
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37106/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37106/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37105
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37105/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37105/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37105/events
|
https://github.com/huggingface/transformers/issues/37105
| 2,958,056,071
|
I_kwDOCUB6oc6wUFqH
| 37,105
|
Add Sdpa Support for `Electra`
|
{
"login": "nnilayy",
"id": 114939419,
"node_id": "U_kgDOBtnWGw",
"avatar_url": "https://avatars.githubusercontent.com/u/114939419?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nnilayy",
"html_url": "https://github.com/nnilayy",
"followers_url": "https://api.github.com/users/nnilayy/followers",
"following_url": "https://api.github.com/users/nnilayy/following{/other_user}",
"gists_url": "https://api.github.com/users/nnilayy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nnilayy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nnilayy/subscriptions",
"organizations_url": "https://api.github.com/users/nnilayy/orgs",
"repos_url": "https://api.github.com/users/nnilayy/repos",
"events_url": "https://api.github.com/users/nnilayy/events{/privacy}",
"received_events_url": "https://api.github.com/users/nnilayy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] |
open
| false
| null |
[] | null |
[] | 2025-03-29T14:36:26
| 2025-03-29T14:36:26
| null |
CONTRIBUTOR
| null | null | null | null |
### Feature request
Add SDPA (scaled dot-product attention) support to `transformers` for Google's Electra.
### Motivation
Towards [#28005](https://github.com/huggingface/transformers/issues/28005):
extending support for PyTorch's scaled dot-product attention to the model architectures in `transformers`, including Electra, enables faster inference and improved training performance. It also helps unify attention implementations and brings older models in line with newer performance standards.
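As a usage sketch (illustrative only, not the PR's code): with SDPA support in place, the backend would be selectable through the standard `attn_implementation` argument to `from_pretrained`, e.g. `"eager"` or `"sdpa"`:

```python
# Illustrative sketch: attn_implementation is the standard from_pretrained
# argument for selecting the attention backend ("eager" or "sdpa").
ELECTRA_CHECKPOINTS = [
    "google/electra-base-generator",
    "google/electra-base-discriminator",
]

def sdpa_load_kwargs(checkpoint):
    """Arguments one would pass to AutoModel.from_pretrained to use SDPA."""
    return {
        "pretrained_model_name_or_path": checkpoint,
        "attn_implementation": "sdpa",
    }

# e.g. model = AutoModel.from_pretrained(**sdpa_load_kwargs(ELECTRA_CHECKPOINTS[1]))
print(sdpa_load_kwargs(ELECTRA_CHECKPOINTS[0])["attn_implementation"])
```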
### Your contribution
Added SDPA support in the linked PR below 🤗.
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37105/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37105/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37104
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37104/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37104/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37104/events
|
https://github.com/huggingface/transformers/pull/37104
| 2,958,015,966
|
PR_kwDOCUB6oc6Qp9Vg
| 37,104
|
Update model-card for DINOv2
|
{
"login": "shubham0204",
"id": 41076823,
"node_id": "MDQ6VXNlcjQxMDc2ODIz",
"avatar_url": "https://avatars.githubusercontent.com/u/41076823?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/shubham0204",
"html_url": "https://github.com/shubham0204",
"followers_url": "https://api.github.com/users/shubham0204/followers",
"following_url": "https://api.github.com/users/shubham0204/following{/other_user}",
"gists_url": "https://api.github.com/users/shubham0204/gists{/gist_id}",
"starred_url": "https://api.github.com/users/shubham0204/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shubham0204/subscriptions",
"organizations_url": "https://api.github.com/users/shubham0204/orgs",
"repos_url": "https://api.github.com/users/shubham0204/repos",
"events_url": "https://api.github.com/users/shubham0204/events{/privacy}",
"received_events_url": "https://api.github.com/users/shubham0204/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-29T13:23:13
| 2025-04-08T09:24:39
| 2025-04-07T17:11:09
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37104",
"html_url": "https://github.com/huggingface/transformers/pull/37104",
"diff_url": "https://github.com/huggingface/transformers/pull/37104.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37104.patch",
"merged_at": "2025-04-07T17:11:09"
}
|
# What does this PR do?
This PR updates the model-card for the `dinov2` model, as described in #36979, in an attempt to standardize all model-cards.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@stevhliu
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37104/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37104/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37103
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37103/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37103/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37103/events
|
https://github.com/huggingface/transformers/issues/37103
| 2,958,009,962
|
I_kwDOCUB6oc6wT6Zq
| 37,103
|
BLIP-2 float16 example does not work
|
{
"login": "nhatkhtn",
"id": 61368343,
"node_id": "MDQ6VXNlcjYxMzY4MzQz",
"avatar_url": "https://avatars.githubusercontent.com/u/61368343?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nhatkhtn",
"html_url": "https://github.com/nhatkhtn",
"followers_url": "https://api.github.com/users/nhatkhtn/followers",
"following_url": "https://api.github.com/users/nhatkhtn/following{/other_user}",
"gists_url": "https://api.github.com/users/nhatkhtn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nhatkhtn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nhatkhtn/subscriptions",
"organizations_url": "https://api.github.com/users/nhatkhtn/orgs",
"repos_url": "https://api.github.com/users/nhatkhtn/repos",
"events_url": "https://api.github.com/users/nhatkhtn/events{/privacy}",
"received_events_url": "https://api.github.com/users/nhatkhtn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-29T13:11:21
| 2025-04-16T02:26:30
| 2025-04-16T02:26:29
|
NONE
| null | null | null | null |
### System Info
- `transformers` version: 4.50.0
- Platform: Linux-4.18.0-553.40.1.el8_10.x86_64-x86_64-with-glibc2.28
- Python version: 3.12.8
- Huggingface_hub version: 0.28.0
- Safetensors version: 0.5.2
- Accelerate version: 1.5.2
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.5.1 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: No
- Using GPU in script?: Yes
- GPU type: NVIDIA A100-SXM4-80GB
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
The following code, taken from [the official documentation](https://huggingface.co/docs/transformers/main/en/model_doc/blip-2#transformers.Blip2Model), does not work.
```python
from PIL import Image
import requests
from transformers import Blip2Processor, Blip2Model
import torch
device = "cuda" if torch.cuda.is_available() else "cpu"
processor = Blip2Processor.from_pretrained("Salesforce/blip2-opt-2.7b")
model = Blip2Model.from_pretrained("Salesforce/blip2-opt-2.7b", torch_dtype=torch.float16)
model.to(device)
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
prompt = "Question: how many cats are there? Answer:"
inputs = processor(images=image, text=prompt, return_tensors="pt").to(device, torch.float16)
outputs = model(**inputs)
```
Running the above code gives the error
```python
RuntimeError: expected scalar type Float but found Half
```
### Expected behavior
There should be no errors.
|
{
"login": "nhatkhtn",
"id": 61368343,
"node_id": "MDQ6VXNlcjYxMzY4MzQz",
"avatar_url": "https://avatars.githubusercontent.com/u/61368343?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nhatkhtn",
"html_url": "https://github.com/nhatkhtn",
"followers_url": "https://api.github.com/users/nhatkhtn/followers",
"following_url": "https://api.github.com/users/nhatkhtn/following{/other_user}",
"gists_url": "https://api.github.com/users/nhatkhtn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nhatkhtn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nhatkhtn/subscriptions",
"organizations_url": "https://api.github.com/users/nhatkhtn/orgs",
"repos_url": "https://api.github.com/users/nhatkhtn/repos",
"events_url": "https://api.github.com/users/nhatkhtn/events{/privacy}",
"received_events_url": "https://api.github.com/users/nhatkhtn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37103/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37103/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37102
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37102/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37102/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37102/events
|
https://github.com/huggingface/transformers/pull/37102
| 2,958,002,088
|
PR_kwDOCUB6oc6Qp6hU
| 37,102
|
Fix some code annotation typos.
|
{
"login": "zhanluxianshen",
"id": 161462588,
"node_id": "U_kgDOCZ-5PA",
"avatar_url": "https://avatars.githubusercontent.com/u/161462588?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zhanluxianshen",
"html_url": "https://github.com/zhanluxianshen",
"followers_url": "https://api.github.com/users/zhanluxianshen/followers",
"following_url": "https://api.github.com/users/zhanluxianshen/following{/other_user}",
"gists_url": "https://api.github.com/users/zhanluxianshen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zhanluxianshen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhanluxianshen/subscriptions",
"organizations_url": "https://api.github.com/users/zhanluxianshen/orgs",
"repos_url": "https://api.github.com/users/zhanluxianshen/repos",
"events_url": "https://api.github.com/users/zhanluxianshen/events{/privacy}",
"received_events_url": "https://api.github.com/users/zhanluxianshen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-29T12:53:40
| 2025-04-03T00:45:40
| 2025-04-02T13:00:42
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37102",
"html_url": "https://github.com/huggingface/transformers/pull/37102",
"diff_url": "https://github.com/huggingface/transformers/pull/37102.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37102.patch",
"merged_at": "2025-04-02T13:00:42"
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37102/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37102/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37101
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37101/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37101/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37101/events
|
https://github.com/huggingface/transformers/pull/37101
| 2,957,936,361
|
PR_kwDOCUB6oc6Qps7J
| 37,101
|
Update Model card for GPT2
|
{
"login": "ash-01xor",
"id": 67604126,
"node_id": "MDQ6VXNlcjY3NjA0MTI2",
"avatar_url": "https://avatars.githubusercontent.com/u/67604126?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ash-01xor",
"html_url": "https://github.com/ash-01xor",
"followers_url": "https://api.github.com/users/ash-01xor/followers",
"following_url": "https://api.github.com/users/ash-01xor/following{/other_user}",
"gists_url": "https://api.github.com/users/ash-01xor/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ash-01xor/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ash-01xor/subscriptions",
"organizations_url": "https://api.github.com/users/ash-01xor/orgs",
"repos_url": "https://api.github.com/users/ash-01xor/repos",
"events_url": "https://api.github.com/users/ash-01xor/events{/privacy}",
"received_events_url": "https://api.github.com/users/ash-01xor/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-29T10:52:48
| 2025-04-07T17:15:28
| 2025-04-07T17:15:28
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37101",
"html_url": "https://github.com/huggingface/transformers/pull/37101",
"diff_url": "https://github.com/huggingface/transformers/pull/37101.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37101.patch",
"merged_at": "2025-04-07T17:15:28"
}
|
# What does this PR do?
This PR updates the model-card for the GPT2 model, as described in https://github.com/huggingface/transformers/issues/36979, in an attempt to standardize all model-cards.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@stevhliu
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37101/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37101/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37100
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37100/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37100/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37100/events
|
https://github.com/huggingface/transformers/pull/37100
| 2,957,915,139
|
PR_kwDOCUB6oc6QpooL
| 37,100
|
Fix std initialization in Idefics variants
|
{
"login": "yaswanth19",
"id": 82788246,
"node_id": "MDQ6VXNlcjgyNzg4MjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/82788246?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yaswanth19",
"html_url": "https://github.com/yaswanth19",
"followers_url": "https://api.github.com/users/yaswanth19/followers",
"following_url": "https://api.github.com/users/yaswanth19/following{/other_user}",
"gists_url": "https://api.github.com/users/yaswanth19/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yaswanth19/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yaswanth19/subscriptions",
"organizations_url": "https://api.github.com/users/yaswanth19/orgs",
"repos_url": "https://api.github.com/users/yaswanth19/repos",
"events_url": "https://api.github.com/users/yaswanth19/events{/privacy}",
"received_events_url": "https://api.github.com/users/yaswanth19/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-29T10:14:37
| 2025-04-01T07:18:54
| 2025-04-01T07:18:54
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37100",
"html_url": "https://github.com/huggingface/transformers/pull/37100",
"diff_url": "https://github.com/huggingface/transformers/pull/37100.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37100.patch",
"merged_at": "2025-04-01T07:18:54"
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@zucchini-nlp A small review here for this nit fix PR
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37100/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37100/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37099
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37099/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37099/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37099/events
|
https://github.com/huggingface/transformers/pull/37099
| 2,957,914,350
|
PR_kwDOCUB6oc6Qpoex
| 37,099
|
feat: updated model card for qwen_2.5_vl
|
{
"login": "arkhamHack",
"id": 72064090,
"node_id": "MDQ6VXNlcjcyMDY0MDkw",
"avatar_url": "https://avatars.githubusercontent.com/u/72064090?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/arkhamHack",
"html_url": "https://github.com/arkhamHack",
"followers_url": "https://api.github.com/users/arkhamHack/followers",
"following_url": "https://api.github.com/users/arkhamHack/following{/other_user}",
"gists_url": "https://api.github.com/users/arkhamHack/gists{/gist_id}",
"starred_url": "https://api.github.com/users/arkhamHack/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/arkhamHack/subscriptions",
"organizations_url": "https://api.github.com/users/arkhamHack/orgs",
"repos_url": "https://api.github.com/users/arkhamHack/repos",
"events_url": "https://api.github.com/users/arkhamHack/events{/privacy}",
"received_events_url": "https://api.github.com/users/arkhamHack/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-29T10:13:01
| 2025-04-03T16:13:26
| 2025-04-03T16:13:26
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37099",
"html_url": "https://github.com/huggingface/transformers/pull/37099",
"diff_url": "https://github.com/huggingface/transformers/pull/37099.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37099.patch",
"merged_at": "2025-04-03T16:13:26"
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
https://github.com/huggingface/transformers/issues/36979
This PR updates the model card for Qwen2.5-VL.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37099/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37099/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37098
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37098/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37098/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37098/events
|
https://github.com/huggingface/transformers/issues/37098
| 2,957,719,129
|
I_kwDOCUB6oc6wSzZZ
| 37,098
|
Feature Request: Support Canary Models
|
{
"login": "fakerybakery",
"id": 76186054,
"node_id": "MDQ6VXNlcjc2MTg2MDU0",
"avatar_url": "https://avatars.githubusercontent.com/u/76186054?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fakerybakery",
"html_url": "https://github.com/fakerybakery",
"followers_url": "https://api.github.com/users/fakerybakery/followers",
"following_url": "https://api.github.com/users/fakerybakery/following{/other_user}",
"gists_url": "https://api.github.com/users/fakerybakery/gists{/gist_id}",
"starred_url": "https://api.github.com/users/fakerybakery/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fakerybakery/subscriptions",
"organizations_url": "https://api.github.com/users/fakerybakery/orgs",
"repos_url": "https://api.github.com/users/fakerybakery/repos",
"events_url": "https://api.github.com/users/fakerybakery/events{/privacy}",
"received_events_url": "https://api.github.com/users/fakerybakery/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] |
open
| false
| null |
[] | null |
[] | 2025-03-29T04:00:46
| 2025-06-17T02:03:02
| null |
NONE
| null | null | null | null |
### Feature request
Add support for Canary Flash (1B/180M) models:
https://huggingface.co/nvidia/canary-1b-flash
https://huggingface.co/nvidia/canary-180m-flash
### Motivation
The 1B model is the second highest on the [Open ASR Leaderboard](https://huggingface.co/spaces/hf-audio/open_asr_leaderboard), outperformed only by Phi 4 Multimodal, which has 14B parameters.
Currently, to run the model, one must use NeMo. It would be helpful to be able to use it with Transformers.
### Your contribution
Training and inference code: https://github.com/NVIDIA/NeMo
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37098/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37098/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37097
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37097/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37097/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37097/events
|
https://github.com/huggingface/transformers/pull/37097
| 2,957,709,881
|
PR_kwDOCUB6oc6QpBVM
| 37,097
|
Merge tensor operations with device transfer operations
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-29T03:38:06
| 2025-04-02T13:39:45
| 2025-04-02T13:15:23
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37097",
"html_url": "https://github.com/huggingface/transformers/pull/37097",
"diff_url": "https://github.com/huggingface/transformers/pull/37097.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37097.patch",
"merged_at": "2025-04-02T13:15:23"
}
|
This PR merges tensor operations with device transfer operations (e.g. constructing tensors directly on the target device) to avoid redundant intermediate copies.
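A minimal sketch of the pattern this kind of change targets (illustrative only; the exact call sites touched by the PR are not shown here):

```python
import torch

# Pick whatever device is available; falls back to CPU so the sketch runs anywhere.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Unmerged: the tensor is first materialized on the default device,
# then copied over — an extra allocation and transfer.
position_ids_slow = torch.arange(8).to(device)

# Merged: construct the tensor directly on the target device in one step.
position_ids_fast = torch.arange(8, device=device)

assert torch.equal(position_ids_slow, position_ids_fast)
```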
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37097/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37097/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37096
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37096/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37096/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37096/events
|
https://github.com/huggingface/transformers/pull/37096
| 2,957,352,732
|
PR_kwDOCUB6oc6QnzdE
| 37,096
|
Add Plain-DETR
|
{
"login": "sushmanthreddy",
"id": 73489688,
"node_id": "MDQ6VXNlcjczNDg5Njg4",
"avatar_url": "https://avatars.githubusercontent.com/u/73489688?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sushmanthreddy",
"html_url": "https://github.com/sushmanthreddy",
"followers_url": "https://api.github.com/users/sushmanthreddy/followers",
"following_url": "https://api.github.com/users/sushmanthreddy/following{/other_user}",
"gists_url": "https://api.github.com/users/sushmanthreddy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sushmanthreddy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sushmanthreddy/subscriptions",
"organizations_url": "https://api.github.com/users/sushmanthreddy/orgs",
"repos_url": "https://api.github.com/users/sushmanthreddy/repos",
"events_url": "https://api.github.com/users/sushmanthreddy/events{/privacy}",
"received_events_url": "https://api.github.com/users/sushmanthreddy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null |
[] | 2025-03-28T22:00:50
| 2025-09-11T07:13:32
| null |
CONTRIBUTOR
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37096",
"html_url": "https://github.com/huggingface/transformers/pull/37096",
"diff_url": "https://github.com/huggingface/transformers/pull/37096.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37096.patch",
"merged_at": null
}
|
Closes #27496
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37096/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37096/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37095
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37095/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37095/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37095/events
|
https://github.com/huggingface/transformers/pull/37095
| 2,957,344,327
|
PR_kwDOCUB6oc6QnxlP
| 37,095
|
handle training summary when creating modelcard but offline mode is set
|
{
"login": "winglian",
"id": 381258,
"node_id": "MDQ6VXNlcjM4MTI1OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/winglian",
"html_url": "https://github.com/winglian",
"followers_url": "https://api.github.com/users/winglian/followers",
"following_url": "https://api.github.com/users/winglian/following{/other_user}",
"gists_url": "https://api.github.com/users/winglian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/winglian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/winglian/subscriptions",
"organizations_url": "https://api.github.com/users/winglian/orgs",
"repos_url": "https://api.github.com/users/winglian/repos",
"events_url": "https://api.github.com/users/winglian/events{/privacy}",
"received_events_url": "https://api.github.com/users/winglian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T21:56:04
| 2025-07-15T15:21:16
| 2025-07-15T15:21:16
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37095",
"html_url": "https://github.com/huggingface/transformers/pull/37095",
"diff_url": "https://github.com/huggingface/transformers/pull/37095.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37095.patch",
"merged_at": "2025-07-15T15:21:15"
}
|
# What does this PR do?
When using the trainer with `HF_HUB_OFFLINE=1`, I get the following stack trace due to an unhandled edge case:
```
/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/transformers/trainer.py:4665: in create_model_card
training_summary = TrainingSummary.from_trainer(
/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/transformers/modelcard.py:611: in from_trainer
return cls(
<string>:17: in __init__
???
/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/transformers/modelcard.py:383: in __post_init__
info = model_info(self.finetuned_from)
/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py:114: in _inner_fn
return fn(*args, **kwargs)
/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/huggingface_hub/hf_api.py:2518: in model_info
r = get_session().get(path, headers=headers, timeout=timeout, params=params)
/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/requests/sessions.py:602: in get
return self.request("GET", url, **kwargs)
/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/requests/sessions.py:589: in request
resp = self.send(prep, **send_kwargs)
/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/requests/sessions.py:703: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <huggingface_hub.utils._http.OfflineAdapter object at 0x2a7cc82cf3d0>
request = <PreparedRequest [GET]>, args = ()
kwargs = {'cert': None, 'proxies': OrderedDict(), 'stream': False, 'timeout': None, ...}
def send(self, request: PreparedRequest, *args, **kwargs) -> Response:
> raise OfflineModeIsEnabled(
f"Cannot reach {request.url}: offline mode is enabled. To disable it, please unset the `HF_HUB_OFFLINE` environment variable."
)
E huggingface_hub.errors.OfflineModeIsEnabled: Cannot reach https://huggingface.co/api/models/axolotl-ai-co/DeepSeek-V3-11M: offline mode is enabled. To disable it, please unset the `HF_HUB_OFFLINE` environment variable.
/root/miniconda3/envs/py3.11/lib/python3.11/site-packages/huggingface_hub/utils/_http.py:107: OfflineModeIsEnabled
```
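The failing call is `model_info(self.finetuned_from)` in `TrainingSummary.__post_init__`. One way to handle the edge case is to skip (or guard) the Hub lookup when offline mode is set. The sketch below is hypothetical — `is_offline_mode` and `safe_model_info` are illustrative names, not the names used in the actual fix:

```python
import os


def is_offline_mode() -> bool:
    # Mirrors the HF_HUB_OFFLINE environment-variable convention
    # referenced in the traceback above.
    return os.environ.get("HF_HUB_OFFLINE", "0").upper() in ("1", "ON", "YES", "TRUE")


def safe_model_info(finetuned_from, model_info_fn):
    # Skip the Hub lookup entirely when offline mode is enabled,
    # instead of letting OfflineModeIsEnabled propagate out of
    # model-card creation.
    if finetuned_from is None or is_offline_mode():
        return None
    try:
        return model_info_fn(finetuned_from)
    except Exception:
        # Network errors or a missing repo should not abort training-summary
        # creation; the model card is simply written without Hub metadata.
        return None
```

With this guard, `create_model_card` degrades gracefully offline rather than raising.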
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@zach-huggingface @SunMarc
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37095/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37095/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37094
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37094/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37094/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37094/events
|
https://github.com/huggingface/transformers/pull/37094
| 2,957,274,294
|
PR_kwDOCUB6oc6QniEt
| 37,094
|
Add ImageProcessorFast to Efficientnet processor
|
{
"login": "Yann-CV",
"id": 54800486,
"node_id": "MDQ6VXNlcjU0ODAwNDg2",
"avatar_url": "https://avatars.githubusercontent.com/u/54800486?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Yann-CV",
"html_url": "https://github.com/Yann-CV",
"followers_url": "https://api.github.com/users/Yann-CV/followers",
"following_url": "https://api.github.com/users/Yann-CV/following{/other_user}",
"gists_url": "https://api.github.com/users/Yann-CV/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Yann-CV/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Yann-CV/subscriptions",
"organizations_url": "https://api.github.com/users/Yann-CV/orgs",
"repos_url": "https://api.github.com/users/Yann-CV/repos",
"events_url": "https://api.github.com/users/Yann-CV/events{/privacy}",
"received_events_url": "https://api.github.com/users/Yann-CV/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-03-28T21:23:02
| 2025-03-29T12:32:31
| null |
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37094",
"html_url": "https://github.com/huggingface/transformers/pull/37094",
"diff_url": "https://github.com/huggingface/transformers/pull/37094.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37094.patch",
"merged_at": null
}
|
# What does this PR do?
Following https://github.com/huggingface/transformers/issues/36978:
This pull request introduces a new fast image processor for EfficientNet models and integrates it into the existing codebase. The changes include updates to documentation, initialization files, and test cases to support the new `EfficientNetImageProcessorFast`.
### Integration of `EfficientNetImageProcessorFast`:
* [`docs/source/en/model_doc/efficientnet.md`](diffhunk://#diff-cee8ef04c44ba9bde1c112df04e193271b52309b393e78cfc6e4e3fdb6392821R46-R50): Added documentation for `EfficientNetImageProcessorFast`.
* [`src/transformers/__init__.py`](diffhunk://#diff-7723156f6b075b1bf1525f769d90a7cc0b5f233becdcbe28707aaa753960d897R1354): Included `EfficientNetImageProcessorFast` in the import structure and import statements. [[1]](diffhunk://#diff-7723156f6b075b1bf1525f769d90a7cc0b5f233becdcbe28707aaa753960d897R1354) [[2]](diffhunk://#diff-7723156f6b075b1bf1525f769d90a7cc0b5f233becdcbe28707aaa753960d897R6624)
* [`src/transformers/models/auto/image_processing_auto.py`](diffhunk://#diff-dc5c050927ed279b77bac41443778a1155c1cd7825aae50412d70b65bb96397fL58-R58): Updated the image processor mapping to include `EfficientNetImageProcessorFast`. [[1]](diffhunk://#diff-dc5c050927ed279b77bac41443778a1155c1cd7825aae50412d70b65bb96397fL58-R58) [[2]](diffhunk://#diff-dc5c050927ed279b77bac41443778a1155c1cd7825aae50412d70b65bb96397fL85-R85)
* [`src/transformers/models/efficientnet/__init__.py`](diffhunk://#diff-4ac3b97f46b61bd463102c9473fa6a202f9175f8f10ea56e1ac1a22e2ad8bb34R23): Added import for `EfficientNetImageProcessorFast`.
### Implementation of `EfficientNetImageProcessorFast`:
* [`src/transformers/models/efficientnet/image_processing_efficientnet_fast.py`](diffhunk://#diff-864fcb168f4bc5f8a11c57fb7d644e31b6af872dd57391121db4cf84190d7feeR1-R198): Added the implementation of the `EfficientNetImageProcessorFast` class, including methods for preprocessing, rescaling, and normalizing images.
### Testing and Dummy Objects:
* [`src/transformers/utils/dummy_torchvision_objects.py`](diffhunk://#diff-628ebb72d86e41f003144eae7255f5f0e9e60bbf5cb742b5b5447a0360cf1968R61-R67): Added a dummy class for `EfficientNetImageProcessorFast` to handle cases where `torchvision` is not available.
* [`tests/models/efficientnet/test_image_processing_efficientnet.py`](diffhunk://#diff-13b3debfbb1f79ee40a45ed62412b233fc6973d4ed3d8731eeab3782dd2db3adL22-R32): Updated test cases to include `EfficientNetImageProcessorFast` and ensure it is tested alongside the standard `EfficientNetImageProcessor`. [[1]](diffhunk://#diff-13b3debfbb1f79ee40a45ed62412b233fc6973d4ed3d8731eeab3782dd2db3adL22-R32) [[2]](diffhunk://#diff-13b3debfbb1f79ee40a45ed62412b233fc6973d4ed3d8731eeab3782dd2db3adR90) [[3]](diffhunk://#diff-13b3debfbb1f79ee40a45ed62412b233fc6973d4ed3d8731eeab3782dd2db3adL97-R123)
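For context, the resize → rescale → normalize pipeline that the preprocessing method implements can be sketched in plain NumPy. This is only an illustrative sketch: the target size and normalization constants below are generic ImageNet-style placeholders, not the actual `EfficientNetImageProcessorFast` defaults, and the real fast processor operates on torch tensors rather than NumPy arrays.

```python
import numpy as np

def preprocess(image, size=(224, 224),
               mean=(0.485, 0.456, 0.406), std=(0.229, 0.224, 0.225)):
    # image: uint8 HWC array. Nearest-neighbor resize via integer index maps.
    h, w, _ = image.shape
    rows = np.arange(size[0]) * h // size[0]
    cols = np.arange(size[1]) * w // size[1]
    resized = image[rows][:, cols]
    # Rescale from [0, 255] to [0, 1], then normalize per channel.
    rescaled = resized.astype(np.float32) / 255.0
    return (rescaled - np.array(mean)) / np.array(std)

out = preprocess(np.random.randint(0, 256, (600, 400, 3), dtype=np.uint8))
```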
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37094/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37094/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37093
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37093/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37093/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37093/events
|
https://github.com/huggingface/transformers/issues/37093
| 2,957,143,222
|
I_kwDOCUB6oc6wQmy2
| 37,093
|
<spam>
|
{
"login": "TarasG0",
"id": 60257917,
"node_id": "MDQ6VXNlcjYwMjU3OTE3",
"avatar_url": "https://avatars.githubusercontent.com/u/60257917?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TarasG0",
"html_url": "https://github.com/TarasG0",
"followers_url": "https://api.github.com/users/TarasG0/followers",
"following_url": "https://api.github.com/users/TarasG0/following{/other_user}",
"gists_url": "https://api.github.com/users/TarasG0/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TarasG0/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TarasG0/subscriptions",
"organizations_url": "https://api.github.com/users/TarasG0/orgs",
"repos_url": "https://api.github.com/users/TarasG0/repos",
"events_url": "https://api.github.com/users/TarasG0/events{/privacy}",
"received_events_url": "https://api.github.com/users/TarasG0/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T20:31:15
| 2025-03-31T12:58:56
| 2025-03-31T12:58:06
|
NONE
| null | null | null | null |
### Model description
taras1
### Open source status
- [x] The model implementation is available
- [x] The model weights are available
### Provide useful links for the implementation
<potentially malicious link removed - matt>
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37093/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37093/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37092
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37092/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37092/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37092/events
|
https://github.com/huggingface/transformers/pull/37092
| 2,957,097,830
|
PR_kwDOCUB6oc6QnAtc
| 37,092
|
Llama Kernel integration
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T20:13:15
| 2025-04-14T14:03:07
| 2025-04-10T15:13:25
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37092",
"html_url": "https://github.com/huggingface/transformers/pull/37092",
"diff_url": "https://github.com/huggingface/transformers/pull/37092.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37092.patch",
"merged_at": "2025-04-10T15:13:25"
}
|
# What does this PR do?
PoC of integrating kernels into the Llama model. The kernels used are for `attn`, `mlp`, and `rmsnorm`.
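As background, the op that an `rmsnorm` kernel replaces can be sketched with a reference implementation. This is a hedged NumPy sketch for illustration only; the actual kernel is a fused GPU implementation, not this code.

```python
import numpy as np

def rms_norm(x, weight, eps=1e-6):
    # RMSNorm: scale activations by their reciprocal root-mean-square;
    # unlike LayerNorm there is no mean subtraction and no bias term.
    rms = np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)
    return (x / rms) * weight

hidden = np.ones((2, 4), dtype=np.float32)
out = rms_norm(hidden, weight=np.ones(4, dtype=np.float32))
```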
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37092/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37092/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37091
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37091/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37091/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37091/events
|
https://github.com/huggingface/transformers/pull/37091
| 2,957,066,596
|
PR_kwDOCUB6oc6Qm54i
| 37,091
|
Kenlm
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T19:55:16
| 2025-03-28T20:42:56
| 2025-03-28T20:42:54
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37091",
"html_url": "https://github.com/huggingface/transformers/pull/37091",
"diff_url": "https://github.com/huggingface/transformers/pull/37091.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37091.patch",
"merged_at": "2025-03-28T20:42:54"
}
|
# What does this PR do?
kenlm has an issue; a PR fixing it is not merged yet:
https://github.com/kpu/kenlm/pull/464
Let's use `kenlm@git+https://github.com/ydshieh/kenlm@master` in the meantime.
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37091/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37091/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37090
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37090/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37090/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37090/events
|
https://github.com/huggingface/transformers/issues/37090
| 2,957,056,542
|
I_kwDOCUB6oc6wQRoe
| 37,090
|
Release Tag Changed, Breaking Checksums, and AUR Package Building
|
{
"login": "daskol",
"id": 9336514,
"node_id": "MDQ6VXNlcjkzMzY1MTQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/9336514?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/daskol",
"html_url": "https://github.com/daskol",
"followers_url": "https://api.github.com/users/daskol/followers",
"following_url": "https://api.github.com/users/daskol/following{/other_user}",
"gists_url": "https://api.github.com/users/daskol/gists{/gist_id}",
"starred_url": "https://api.github.com/users/daskol/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/daskol/subscriptions",
"organizations_url": "https://api.github.com/users/daskol/orgs",
"repos_url": "https://api.github.com/users/daskol/repos",
"events_url": "https://api.github.com/users/daskol/events{/privacy}",
"received_events_url": "https://api.github.com/users/daskol/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T19:50:55
| 2025-06-02T08:03:39
| 2025-06-02T08:03:39
|
CONTRIBUTOR
| null | null | null | null |
The latest v4.50.2 release of this project was re-tagged and re-published, leading to modified checksums. This breaks package building in Arch Linux's AUR (see [discussion][1]) and the unofficial Arch Linux AI repo. It affects users relying on a stable source archive.
- The AUR package downloads the source from the release (release tag).
- The checksum verification fails because the archive has changed.
- Users are unable to install or update the package.
Please clarify why this change occurred and take steps to prevent such issues in the future. Ensuring stable and predictable releases benefits the entire user and maintainer community. Thank you for your attention to this matter!
[1]: https://aur.archlinux.org/packages/python-transformers
### Expected Behavior
Release tags should remain immutable once published to ensure stability and reproducibility for package maintainers and users.
### Actual Behavior
1. The release tag was changed, leading to different source files.
2. The checksum of the archive no longer matches the originally published one.
3. This disrupts automated and manual builds in Arch Linux AUR.
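The failure mode in the steps above can be illustrated with a short sketch. The archive contents here are made up; the point is that moving a tag changes the bytes served behind the same release URL, so the checksum pinned by the PKGBUILD no longer matches.

```python
import hashlib

# Checksum recorded by the package maintainer when v4.50.2 was first published.
original = b"source archive at first publication"
pinned = hashlib.sha256(original).hexdigest()

# After the tag was moved, the same URL serves different bytes.
republished = b"source archive after re-publication"
mismatch = hashlib.sha256(republished).hexdigest() != pinned
print("checksum mismatch, build aborted" if mismatch else "ok")
```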
### Suggested Resolution
Please do not modify or re-upload files under an existing release tag. This causes issues for downstream users and package maintainers.
- If changes are necessary, create a new release with a new tag.
- If a change was unintentional, consider restoring the original archive to maintain consistency.
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37090/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37090/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37089
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37089/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37089/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37089/events
|
https://github.com/huggingface/transformers/pull/37089
| 2,957,021,513
|
PR_kwDOCUB6oc6QmwkL
| 37,089
|
Add model doc for ViTPose with quantization and attention visualization
|
{
"login": "saumanraaj",
"id": 83863464,
"node_id": "MDQ6VXNlcjgzODYzNDY0",
"avatar_url": "https://avatars.githubusercontent.com/u/83863464?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/saumanraaj",
"html_url": "https://github.com/saumanraaj",
"followers_url": "https://api.github.com/users/saumanraaj/followers",
"following_url": "https://api.github.com/users/saumanraaj/following{/other_user}",
"gists_url": "https://api.github.com/users/saumanraaj/gists{/gist_id}",
"starred_url": "https://api.github.com/users/saumanraaj/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saumanraaj/subscriptions",
"organizations_url": "https://api.github.com/users/saumanraaj/orgs",
"repos_url": "https://api.github.com/users/saumanraaj/repos",
"events_url": "https://api.github.com/users/saumanraaj/events{/privacy}",
"received_events_url": "https://api.github.com/users/saumanraaj/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T19:33:05
| 2025-05-10T12:28:00
| 2025-05-10T12:27:52
|
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37089",
"html_url": "https://github.com/huggingface/transformers/pull/37089",
"diff_url": "https://github.com/huggingface/transformers/pull/37089.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37089.patch",
"merged_at": null
}
|
# What does this PR do?
This PR adds a new model documentation file for ViTPose, following the standardized doc format seen in models like BERT.
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
|
{
"login": "saumanraaj",
"id": 83863464,
"node_id": "MDQ6VXNlcjgzODYzNDY0",
"avatar_url": "https://avatars.githubusercontent.com/u/83863464?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/saumanraaj",
"html_url": "https://github.com/saumanraaj",
"followers_url": "https://api.github.com/users/saumanraaj/followers",
"following_url": "https://api.github.com/users/saumanraaj/following{/other_user}",
"gists_url": "https://api.github.com/users/saumanraaj/gists{/gist_id}",
"starred_url": "https://api.github.com/users/saumanraaj/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saumanraaj/subscriptions",
"organizations_url": "https://api.github.com/users/saumanraaj/orgs",
"repos_url": "https://api.github.com/users/saumanraaj/repos",
"events_url": "https://api.github.com/users/saumanraaj/events{/privacy}",
"received_events_url": "https://api.github.com/users/saumanraaj/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37089/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37089/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37088
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37088/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37088/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37088/events
|
https://github.com/huggingface/transformers/pull/37088
| 2,956,972,812
|
PR_kwDOCUB6oc6Qmlkf
| 37,088
|
Add Ovis2 model and processor implementation
|
{
"login": "thisisiron",
"id": 23303033,
"node_id": "MDQ6VXNlcjIzMzAzMDMz",
"avatar_url": "https://avatars.githubusercontent.com/u/23303033?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/thisisiron",
"html_url": "https://github.com/thisisiron",
"followers_url": "https://api.github.com/users/thisisiron/followers",
"following_url": "https://api.github.com/users/thisisiron/following{/other_user}",
"gists_url": "https://api.github.com/users/thisisiron/gists{/gist_id}",
"starred_url": "https://api.github.com/users/thisisiron/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/thisisiron/subscriptions",
"organizations_url": "https://api.github.com/users/thisisiron/orgs",
"repos_url": "https://api.github.com/users/thisisiron/repos",
"events_url": "https://api.github.com/users/thisisiron/events{/privacy}",
"received_events_url": "https://api.github.com/users/thisisiron/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T19:10:17
| 2025-08-18T14:22:34
| 2025-08-18T14:05:50
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37088",
"html_url": "https://github.com/huggingface/transformers/pull/37088",
"diff_url": "https://github.com/huggingface/transformers/pull/37088.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37088.patch",
"merged_at": "2025-08-18T14:05:49"
}
|
# What does this PR do?
#36824
Add [Ovis2](https://github.com/AIDC-AI/Ovis) to Transformers.
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37088/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37088/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37087
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37087/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37087/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37087/events
|
https://github.com/huggingface/transformers/issues/37087
| 2,956,682,174
|
I_kwDOCUB6oc6wO2O-
| 37,087
|
LLaVa_mistral models are unrecognized
|
{
"login": "darshpatel1052",
"id": 147577241,
"node_id": "U_kgDOCMvZmQ",
"avatar_url": "https://avatars.githubusercontent.com/u/147577241?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/darshpatel1052",
"html_url": "https://github.com/darshpatel1052",
"followers_url": "https://api.github.com/users/darshpatel1052/followers",
"following_url": "https://api.github.com/users/darshpatel1052/following{/other_user}",
"gists_url": "https://api.github.com/users/darshpatel1052/gists{/gist_id}",
"starred_url": "https://api.github.com/users/darshpatel1052/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/darshpatel1052/subscriptions",
"organizations_url": "https://api.github.com/users/darshpatel1052/orgs",
"repos_url": "https://api.github.com/users/darshpatel1052/repos",
"events_url": "https://api.github.com/users/darshpatel1052/events{/privacy}",
"received_events_url": "https://api.github.com/users/darshpatel1052/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
open
| false
| null |
[] | null |
[] | 2025-03-28T16:54:33
| 2025-03-31T07:10:10
| null |
NONE
| null | null | null | null |
### System Info
### **Issue Title**: Support for `llava_mistral` Model Architecture
---
### **Environment Information**
- **Transformers Version**: `4.51.0.dev0`
- **Python Version**: `3.10.16`
- **OS**: `Ubuntu 20.04.2`
- **PyTorch Version**: `2.5.1`
- **CUDA Version**: `11.8`
- **GPU**: `A4000`
### **Describe the Bug**
Reference Code:
```python
# Load model directly
from transformers import AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("microsoft/llava-med-v1.5-mistral-7b")
```
I am trying to load the `microsoft/llava-med-v1.5-mistral-7b` model using `AutoModelForCausalLM.from_pretrained`, but I encounter the following error:
```
Traceback (most recent call last):
File "/home/darsh/DC/llava-med-model/train.py", line 3, in <module>
model = AutoModelForCausalLM.from_pretrained("microsoft/llava-med-v1.5-mistral-7b")
File "/home/darsh/anaconda3/envs/llava-med/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 531, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
File "/home/darsh/anaconda3/envs/llava-med/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1115, in from_pretrained
raise ValueError(
ValueError: The checkpoint you are trying to load has model type `llava_mistral` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
You can update Transformers with the command `pip install --upgrade transformers`. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command `pip install git+https://github.com/huggingface/transformers.git`
```
---
### **Expected Behavior**
The model should load successfully using `AutoModelForCausalLM.from_pretrained`.
---
### **Additional Context**
- I have tried upgrading `transformers` to the latest version using:
```bash
pip install --upgrade transformers
```
- I also tried installing the development version of `transformers` from the source:
```bash
pip install git+https://github.com/huggingface/transformers.git
```
However, the issue persists.
- The model type `llava_mistral` seems to be unsupported by the current version of `transformers`. If this architecture is not yet supported, could you provide guidance on when it might be added or how I can manually add support for this model?
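The error above comes from `AutoConfig`'s dispatch on the checkpoint's `model_type` string. A minimal, purely illustrative sketch of that dispatch pattern (hypothetical names, not transformers' actual code) shows why an unregistered type fails fast:

```python
# Illustrative sketch of AutoConfig-style dispatch: a registry maps the
# checkpoint's `model_type` string to a config class, and an unregistered
# type raises a ValueError like the one in the traceback above.
CONFIG_REGISTRY = {"llava": "LlavaConfig", "mistral": "MistralConfig"}

def resolve_config(model_type):
    """Look up the config class registered for `model_type`."""
    try:
        return CONFIG_REGISTRY[model_type]
    except KeyError:
        raise ValueError(
            f"The checkpoint you are trying to load has model type `{model_type}` "
            "but this registry does not recognize the architecture."
        )

print(resolve_config("mistral"))       # a registered type resolves
try:
    resolve_config("llava_mistral")    # an unregistered type raises
except ValueError as e:
    print("unrecognized:", "llava_mistral" in str(e))
```

In real code, transformers exposes registration hooks (e.g. `AutoConfig.register`) so custom architectures can be added to this mapping without modifying the library.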
---
### **Request**
Please add support for the `llava_mistral` model architecture in the `transformers` library or provide instructions on how to proceed with loading this model.
---
### **Links**
- Model: [microsoft/llava-med-v1.5-mistral-7b](https://huggingface.co/microsoft/llava-med-v1.5-mistral-7b)
---
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
1. Install the required libraries:
```bash
pip install transformers
```
2. Run the following Python script:
```python
from transformers import AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("microsoft/llava-med-v1.5-mistral-7b")
```
3. Observe the error.
---
### Expected behavior
The model should load successfully using `AutoModelForCausalLM.from_pretrained`.
---
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37087/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37087/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37086
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37086/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37086/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37086/events
|
https://github.com/huggingface/transformers/pull/37086
| 2,956,660,458
|
PR_kwDOCUB6oc6Qlhdu
| 37,086
|
Fix state_dict map location when quantized
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T16:43:24
| 2025-03-31T16:14:11
| 2025-03-28T16:57:16
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37086",
"html_url": "https://github.com/huggingface/transformers/pull/37086",
"diff_url": "https://github.com/huggingface/transformers/pull/37086.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37086.patch",
"merged_at": "2025-03-28T16:57:16"
}
|
# What does this PR do?
Fix following https://github.com/huggingface/transformers/pull/35926. If we now allow loading on meta with quantization as well, we need to be consistent on both sides.
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37086/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37086/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37085
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37085/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37085/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37085/events
|
https://github.com/huggingface/transformers/pull/37085
| 2,956,639,364
|
PR_kwDOCUB6oc6Qlc3w
| 37,085
|
Purge unused ModelTester code
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T16:33:47
| 2025-04-03T16:48:37
| 2025-04-03T16:48:35
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37085",
"html_url": "https://github.com/huggingface/transformers/pull/37085",
"diff_url": "https://github.com/huggingface/transformers/pull/37085.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37085.patch",
"merged_at": "2025-04-03T16:48:35"
}
|
Delete lots of `ModelTester` methods that are never actually called inside test cases
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37085/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37085/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37084
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37084/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37084/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37084/events
|
https://github.com/huggingface/transformers/pull/37084
| 2,956,630,233
|
PR_kwDOCUB6oc6Qla54
| 37,084
|
Update w/ new account
|
{
"login": "muellerzr",
"id": 7831895,
"node_id": "MDQ6VXNlcjc4MzE4OTU=",
"avatar_url": "https://avatars.githubusercontent.com/u/7831895?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/muellerzr",
"html_url": "https://github.com/muellerzr",
"followers_url": "https://api.github.com/users/muellerzr/followers",
"following_url": "https://api.github.com/users/muellerzr/following{/other_user}",
"gists_url": "https://api.github.com/users/muellerzr/gists{/gist_id}",
"starred_url": "https://api.github.com/users/muellerzr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/muellerzr/subscriptions",
"organizations_url": "https://api.github.com/users/muellerzr/orgs",
"repos_url": "https://api.github.com/users/muellerzr/repos",
"events_url": "https://api.github.com/users/muellerzr/events{/privacy}",
"received_events_url": "https://api.github.com/users/muellerzr/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T16:30:01
| 2025-03-28T16:43:01
| 2025-03-28T16:43:00
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37084",
"html_url": "https://github.com/huggingface/transformers/pull/37084",
"diff_url": "https://github.com/huggingface/transformers/pull/37084.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37084.patch",
"merged_at": "2025-03-28T16:43:00"
}
|
# What does this PR do?
Making a new account to better handle notifications
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@SunMarc
|
{
"login": "muellerzr",
"id": 7831895,
"node_id": "MDQ6VXNlcjc4MzE4OTU=",
"avatar_url": "https://avatars.githubusercontent.com/u/7831895?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/muellerzr",
"html_url": "https://github.com/muellerzr",
"followers_url": "https://api.github.com/users/muellerzr/followers",
"following_url": "https://api.github.com/users/muellerzr/following{/other_user}",
"gists_url": "https://api.github.com/users/muellerzr/gists{/gist_id}",
"starred_url": "https://api.github.com/users/muellerzr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/muellerzr/subscriptions",
"organizations_url": "https://api.github.com/users/muellerzr/orgs",
"repos_url": "https://api.github.com/users/muellerzr/repos",
"events_url": "https://api.github.com/users/muellerzr/events{/privacy}",
"received_events_url": "https://api.github.com/users/muellerzr/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37084/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37084/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37083
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37083/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37083/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37083/events
|
https://github.com/huggingface/transformers/pull/37083
| 2,956,603,145
|
PR_kwDOCUB6oc6QlVFZ
| 37,083
|
Test cleanup
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T16:19:04
| 2025-03-28T16:31:04
| 2025-03-28T16:31:04
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37083",
"html_url": "https://github.com/huggingface/transformers/pull/37083",
"diff_url": "https://github.com/huggingface/transformers/pull/37083.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37083.patch",
"merged_at": null
}
|
As a first step to a general refactor / merging of tests, this PR finds a lot of unused test code and purges it.
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37083/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37083/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37082
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37082/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37082/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37082/events
|
https://github.com/huggingface/transformers/pull/37082
| 2,956,577,227
|
PR_kwDOCUB6oc6QlPmB
| 37,082
|
[draft] random tests order
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-03-28T16:08:18
| 2025-06-02T09:03:54
| null |
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37082",
"html_url": "https://github.com/huggingface/transformers/pull/37082",
"diff_url": "https://github.com/huggingface/transformers/pull/37082.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37082.patch",
"merged_at": null
}
|
# What does this PR do?
hello
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37082/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37082/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37081
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37081/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37081/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37081/events
|
https://github.com/huggingface/transformers/pull/37081
| 2,956,484,266
|
PR_kwDOCUB6oc6Qk7JT
| 37,081
|
Add Fast Image Processor for Donut
|
{
"login": "rootonchair",
"id": 23548268,
"node_id": "MDQ6VXNlcjIzNTQ4MjY4",
"avatar_url": "https://avatars.githubusercontent.com/u/23548268?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rootonchair",
"html_url": "https://github.com/rootonchair",
"followers_url": "https://api.github.com/users/rootonchair/followers",
"following_url": "https://api.github.com/users/rootonchair/following{/other_user}",
"gists_url": "https://api.github.com/users/rootonchair/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rootonchair/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rootonchair/subscriptions",
"organizations_url": "https://api.github.com/users/rootonchair/orgs",
"repos_url": "https://api.github.com/users/rootonchair/repos",
"events_url": "https://api.github.com/users/rootonchair/events{/privacy}",
"received_events_url": "https://api.github.com/users/rootonchair/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T15:26:37
| 2025-04-15T18:01:04
| 2025-04-14T14:24:02
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37081",
"html_url": "https://github.com/huggingface/transformers/pull/37081",
"diff_url": "https://github.com/huggingface/transformers/pull/37081.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37081.patch",
"merged_at": "2025-04-14T14:24:02"
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Related #36978
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @muellerzr and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @muellerzr
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37081/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37081/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37080
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37080/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37080/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37080/events
|
https://github.com/huggingface/transformers/pull/37080
| 2,956,472,028
|
PR_kwDOCUB6oc6Qk4ak
| 37,080
|
[generate] beam search -- fix output cropping
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T15:21:24
| 2025-03-28T18:44:41
| 2025-03-28T17:57:51
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37080",
"html_url": "https://github.com/huggingface/transformers/pull/37080",
"diff_url": "https://github.com/huggingface/transformers/pull/37080.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37080.patch",
"merged_at": "2025-03-28T17:57:51"
}
|
# What does this PR do?
vLLM is seeing some output differences in its CI when beam search is used. The difference can be traced to the beam search refactor (#35802).
Inspecting the outputs, we can see that there are a few additional pad tokens on the right. This is because the output was not being cropped correctly when the selected beam is shorter than the generation length (i.e. when the highest-scoring beam is NOT from the latest decoding iteration, but rather some previously completed beam).
After #35802: output length = input length + number of decoding iterations
Before #35802 and in this PR: output length = length of the longest selected beam
This PR also updates a few beam search tests to check their special tokens, which would have caught this bug.
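The intended cropping rule can be sketched in isolation with plain Python lists (the token ids and `pad_token_id` below are hypothetical, not taken from the actual test cases):

```python
# Sketch of the cropping rule restored by this PR: the padded output length
# should equal the longest *selected* beam, not input_len + decoding iterations.
pad_token_id = 0
num_decoding_iterations = 5
selected_beams = [[5, 7, 2, 4], [5, 9, 2]]  # hypothetical ids; beam 2 finished early

# After #35802 (buggy): everything padded out to the number of decoding iterations.
buggy_len = num_decoding_iterations
# Before #35802 and in this PR: crop to the length of the longest selected beam.
fixed_len = max(len(beam) for beam in selected_beams)

output = [beam + [pad_token_id] * (fixed_len - len(beam)) for beam in selected_beams]
assert buggy_len == 5 and fixed_len == 4
assert output == [[5, 7, 2, 4], [5, 9, 2, 0]]
```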
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37080/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37080/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37079
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37079/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37079/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37079/events
|
https://github.com/huggingface/transformers/issues/37079
| 2,956,451,308
|
I_kwDOCUB6oc6wN93s
| 37,079
|
Need to get hidden features from Siglip but ValueError: You have to specify input_ids
|
{
"login": "thotasu",
"id": 25834620,
"node_id": "MDQ6VXNlcjI1ODM0NjIw",
"avatar_url": "https://avatars.githubusercontent.com/u/25834620?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/thotasu",
"html_url": "https://github.com/thotasu",
"followers_url": "https://api.github.com/users/thotasu/followers",
"following_url": "https://api.github.com/users/thotasu/following{/other_user}",
"gists_url": "https://api.github.com/users/thotasu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/thotasu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/thotasu/subscriptions",
"organizations_url": "https://api.github.com/users/thotasu/orgs",
"repos_url": "https://api.github.com/users/thotasu/repos",
"events_url": "https://api.github.com/users/thotasu/events{/privacy}",
"received_events_url": "https://api.github.com/users/thotasu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T15:12:37
| 2025-03-29T11:04:23
| 2025-03-29T11:04:23
|
NONE
| null | null | null | null |
Hi, I need to get hidden features from SigLIP but I am stuck with `ValueError: You have to specify input_ids`. Please help.
## this is my code
import torch
from transformers import AutoModel, AutoProcessor
from transformers.image_utils import load_image

ckpt = "google/siglip2-base-patch16-224"
processor = AutoProcessor.from_pretrained(ckpt)
model = AutoModel.from_pretrained(ckpt, device_map="cpu",
                                  output_hidden_states=True).eval()
image = load_image("https://huggingface.co/datasets/merve/coco/resolve/main/val2017/000000000285.jpg")
inputs = processor(images=[image], return_tensors="pt").to(model.device)
with torch.no_grad():
model_output = model(**inputs) #### throws error
## my intention is to access hidden states later
print(model_output.hidden_states)
## This is the error:
'''
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
Cell In[16], line 3
1 # run infernece
2 with torch.no_grad():
----> 3 model_output = model(**inputs)
File /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/torch/nn/modules/module.py:1532, in Module._wrapped_call_impl(self, *args, **kwargs)
1530 return self._compiled_call_impl(*args, **kwargs) # type: ignore[misc]
1531 else:
-> 1532 return self._call_impl(*args, **kwargs)
File /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/torch/nn/modules/module.py:1541, in Module._call_impl(self, *args, **kwargs)
1536 # If we don't have any hooks, we want to skip the rest of the logic in
1537 # this function, and just call forward.
1538 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
1539 or _global_backward_pre_hooks or _global_backward_hooks
1540 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1541 return forward_call(*args, **kwargs)
1543 try:
1544 result = None
File /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/transformers/models/siglip2/modeling_siglip2.py:1443, in Siglip2Model.forward(self, input_ids, pixel_values, pixel_attention_mask, spatial_shapes, attention_mask, position_ids, return_loss, output_attentions, output_hidden_states, return_dict)
1432 return_dict = return_dict if return_dict is not None else self.config.use_return_dict
1434 vision_outputs = self.vision_model(
1435 pixel_values=pixel_values,
1436 attention_mask=pixel_attention_mask,
(...)
1440 return_dict=return_dict,
1441 )
-> 1443 text_outputs = self.text_model(
1444 input_ids=input_ids,
1445 attention_mask=attention_mask,
1446 position_ids=position_ids,
1447 output_attentions=output_attentions,
1448 output_hidden_states=output_hidden_states,
1449 return_dict=return_dict,
1450 )
1452 image_embeds = vision_outputs[1]
1453 text_embeds = text_outputs[1]
File /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/torch/nn/modules/module.py:1532, in Module._wrapped_call_impl(self, *args, **kwargs)
1530 return self._compiled_call_impl(*args, **kwargs) # type: ignore[misc]
1531 else:
-> 1532 return self._call_impl(*args, **kwargs)
File /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/torch/nn/modules/module.py:1541, in Module._call_impl(self, *args, **kwargs)
1536 # If we don't have any hooks, we want to skip the rest of the logic in
1537 # this function, and just call forward.
1538 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
1539 or _global_backward_pre_hooks or _global_backward_hooks
1540 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1541 return forward_call(*args, **kwargs)
1543 try:
1544 result = None
File /Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/transformers/models/siglip2/modeling_siglip2.py:928, in Siglip2TextTransformer.forward(self, input_ids, attention_mask, position_ids, output_attentions, output_hidden_states, return_dict)
925 return_dict = return_dict if return_dict is not None else self.config.use_return_dict
927 if input_ids is None:
--> 928 raise ValueError("You have to specify input_ids")
930 input_shape = input_ids.size()
931 input_ids = input_ids.view(-1, input_shape[-1])
ValueError: You have to specify input_ids
'''
|
{
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37079/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37079/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37078
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37078/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37078/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37078/events
|
https://github.com/huggingface/transformers/issues/37078
| 2,956,451,129
|
I_kwDOCUB6oc6wN905
| 37,078
|
Do not update cache when use_cache=False and past_key_values are provided?
|
{
"login": "PheelaV",
"id": 34603115,
"node_id": "MDQ6VXNlcjM0NjAzMTE1",
"avatar_url": "https://avatars.githubusercontent.com/u/34603115?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/PheelaV",
"html_url": "https://github.com/PheelaV",
"followers_url": "https://api.github.com/users/PheelaV/followers",
"following_url": "https://api.github.com/users/PheelaV/following{/other_user}",
"gists_url": "https://api.github.com/users/PheelaV/gists{/gist_id}",
"starred_url": "https://api.github.com/users/PheelaV/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/PheelaV/subscriptions",
"organizations_url": "https://api.github.com/users/PheelaV/orgs",
"repos_url": "https://api.github.com/users/PheelaV/repos",
"events_url": "https://api.github.com/users/PheelaV/events{/privacy}",
"received_events_url": "https://api.github.com/users/PheelaV/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] |
open
| false
| null |
[] | null |
[] | 2025-03-28T15:12:33
| 2025-04-19T14:14:47
| null |
NONE
| null | null | null | null |
### Feature request
Hi, currently I am doing some training and I have a use case where I want to reuse the KV cache for the context without having to resort to a deep copy or truncate the cache after each generation.
### Motivation
In the decoder, `.update` is called even when `use_cache=False`: it concatenates the existing cache with the new k/v projections, even though the kv cache is not returned (and my naive assumption is that the updated cache is not used after the decoder layers). I want to use the cache, but I do not want it returned or updated. In the case of Qwen2, the updated cache is never used again within the same forward pass, so this creates an undesirable side effect. I am filing a feature request rather than a bug report because I do not think it has negative effects for current API usage. Removing this side effect would benefit the scenario where you want to reuse the exact same cache.
The code below is not a suggestion, more of a clarification of the side effect I am addressing; I use it as a workaround for the time being so I do not have to patch the model. After setting `self.update_cache`, it either returns the concatenation of the current cache and the new k/v, or just the new k/v pairs if there is no previous cache:
```python
def update(
self,
key_states: torch.Tensor,
value_states: torch.Tensor,
layer_idx: int,
cache_kwargs: Optional[Dict[str, Any]] = None,
) -> Tuple[torch.Tensor, torch.Tensor]:
"""
Updates the cache with the new `key_states` and `value_states` for the layer `layer_idx`.
Parameters:
key_states (`torch.Tensor`):
The new key states to cache.
value_states (`torch.Tensor`):
The new value states to cache.
layer_idx (`int`):
The index of the layer to cache the states for.
cache_kwargs (`Dict[str, Any]`, `optional`):
Additional arguments for the cache subclass. No additional arguments are used in `DynamicCache`.
Return:
A tuple containing the updated key and value states.
"""
# Update the number of seen tokens
if layer_idx == 0:
self._seen_tokens += key_states.shape[-2]
# Update the cache
if key_states is not None:
if len(self.key_cache) <= layer_idx:
# There may be skipped layers, fill them with empty lists
for _ in range(len(self.key_cache), layer_idx):
self.key_cache.append([])
self.value_cache.append([])
self.key_cache.append(key_states)
self.value_cache.append(value_states)
elif (
len(self.key_cache[layer_idx]) == 0 and self.update_cache
): # fills previously skipped layers; checking for tensor causes errors
self.key_cache[layer_idx] = key_states
self.value_cache[layer_idx] = value_states
elif (
len(self.key_cache[layer_idx]) == 0 and not self.update_cache
): # fills previously skipped layers; checking for tensor causes errors
return key_states, value_states
elif self.update_cache:
self.key_cache[layer_idx] = torch.cat([self.key_cache[layer_idx], key_states], dim=-2)
self.value_cache[layer_idx] = torch.cat([self.value_cache[layer_idx], value_states], dim=-2)
else:
new_keys = torch.cat([self.key_cache[layer_idx], key_states], dim=-2)
new_values = torch.cat([self.value_cache[layer_idx], value_states], dim=-2)
return new_keys, new_values
return self.key_cache[layer_idx], self.value_cache[layer_idx]
```
As for the actual suggestion: `update` could be split into `update` and `retrieve_cache`, where `update` is used when `use_cache == True` and `retrieve_cache` when `use_cache == False` and `past_key_values is not None`.
I have only inspected the implementation of Qwen2, I am not using gradient checkpointing, and I am only using `DynamicCache` for the time being; that is to say, the alternative combinations have not been considered. My question to the community and the HF team: is there a good reason, besides simplicity, not to do this?
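The intended semantics of the proposed split can be sketched with plain Python lists standing in for tensors (all names here are illustrative, not transformers API):

```python
# Minimal sketch of the proposed update/retrieve split: `update` mutates the
# cache, `retrieve_cache` returns the combined states without storing them.
class SketchCache:
    def __init__(self):
        self.key_cache = []    # one entry per layer; lists stand in for tensors
        self.value_cache = []

    def update(self, key_states, value_states, layer_idx):
        # Mutating path, for use_cache=True: concatenate and store.
        while len(self.key_cache) <= layer_idx:
            self.key_cache.append([])
            self.value_cache.append([])
        self.key_cache[layer_idx] = self.key_cache[layer_idx] + key_states
        self.value_cache[layer_idx] = self.value_cache[layer_idx] + value_states
        return self.key_cache[layer_idx], self.value_cache[layer_idx]

    def retrieve_cache(self, key_states, value_states, layer_idx):
        # Side-effect-free path, for use_cache=False with an existing cache.
        if layer_idx < len(self.key_cache):
            return (self.key_cache[layer_idx] + key_states,
                    self.value_cache[layer_idx] + value_states)
        return key_states, value_states


cache = SketchCache()
cache.update([1], [10], 0)            # prefill: cache the shared context
k, v = cache.retrieve_cache([2], [20], 0)  # decode without polluting the cache
assert (k, v) == ([1, 2], [10, 20])
assert cache.key_cache[0] == [1]      # cache unchanged: reusable as-is
```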
Kindly tagging only @gante so as not to oversubscribe the HF team.
### Your contribution
I would be happy to contribute, depending on the scope and after initial sanity check.
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37078/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37078/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37077
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37077/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37077/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37077/events
|
https://github.com/huggingface/transformers/pull/37077
| 2,956,427,296
|
PR_kwDOCUB6oc6QkuiE
| 37,077
|
Fix: Unexpected Keys, Improve `run_compressed`, Rename Test Folder
|
{
"login": "rahul-tuli",
"id": 25380596,
"node_id": "MDQ6VXNlcjI1MzgwNTk2",
"avatar_url": "https://avatars.githubusercontent.com/u/25380596?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rahul-tuli",
"html_url": "https://github.com/rahul-tuli",
"followers_url": "https://api.github.com/users/rahul-tuli/followers",
"following_url": "https://api.github.com/users/rahul-tuli/following{/other_user}",
"gists_url": "https://api.github.com/users/rahul-tuli/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rahul-tuli/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rahul-tuli/subscriptions",
"organizations_url": "https://api.github.com/users/rahul-tuli/orgs",
"repos_url": "https://api.github.com/users/rahul-tuli/repos",
"events_url": "https://api.github.com/users/rahul-tuli/events{/privacy}",
"received_events_url": "https://api.github.com/users/rahul-tuli/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T15:04:29
| 2025-04-04T19:30:12
| 2025-04-04T19:30:12
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37077",
"html_url": "https://github.com/huggingface/transformers/pull/37077",
"diff_url": "https://github.com/huggingface/transformers/pull/37077.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37077.patch",
"merged_at": "2025-04-04T19:30:12"
}
|
In the latest release, the removal of `unexpected_keys` filtering for compressed tensors models reintroduced warnings that were previously resolved in [#36152](https://github.com/huggingface/transformers/pull/36152). This PR addresses that regression, enhances the user experience for the `run_compressed` flag, and updates the test folder naming to avoid conflicts and align with conventions.
#### Changes and Objectives
This pull request accomplishes three key improvements:
1. **Restores Filtering of Unexpected Keys for Compressed Tensors Models**
- The removal of `unexpected_keys` filtering caused warnings to reappear when loading compressed tensors models. This PR reintroduces the necessary logic by adding `unexpected_keys = hf_quantizer.update_unexpected_keys(model, unexpected_keys, prefix)` in `modeling_utils.py`. This ensures unexpected keys are properly managed during model loading, eliminating warnings and restoring the behavior from [#36152](https://github.com/huggingface/transformers/pull/36152).
2. **Enhances User Experience for `run_compressed` Misconfiguration**
- Previously, setting `run_compressed=True` in unsupported cases (e.g., sparsified models or non-compressed quantized models) triggered a `ValueError` and halted execution. This PR improves this by:
- Adding checks in `quantizer_compressed_tensors.py` to identify unsupported scenarios (`is_sparsification_compressed` or `is_quantized` without `is_quantization_compressed`).
- Issuing a `logger.warn` message instead of raising an error, notifying users that `run_compressed` is unsupported for the given model type.
- Automatically setting `run_compressed=False` in these cases, allowing the process to proceed gracefully.
- This change enhances usability by replacing hard failures with warnings and safe fallbacks.
3. **Renames Test Folder to Avoid Name Collisions**
- The test folder `tests/quantization/compressed_tensors` has been renamed to `tests/quantization/compressed_tensors_integration`. This prevents potential name collisions when running `pytest`, ensuring smoother test execution. The new name also aligns with the naming conventions of other integration tests in the repository, improving consistency.
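The fallback logic in item 2 can be sketched as follows (the function name and boolean flags are illustrative, not the exact quantizer API):

```python
import logging

logger = logging.getLogger("compressed_tensors_sketch")

def resolve_run_compressed(run_compressed: bool,
                           is_quantization_compressed: bool,
                           is_sparsification_compressed: bool) -> bool:
    """Warn and fall back to False instead of raising for unsupported cases."""
    if run_compressed and is_sparsification_compressed:
        logger.warning(
            "`run_compressed` is only supported for quantized_compressed models "
            "and not for sparsified models. Setting `run_compressed=False`"
        )
        return False
    if run_compressed and not is_quantization_compressed:
        logger.warning(
            "`run_compressed` is only supported for compressed models. "
            "Setting `run_compressed=False`"
        )
        return False
    return run_compressed

# Quantized-compressed model: the flag is honored.
assert resolve_run_compressed(True, True, False) is True
# Uncompressed or sparsified model: warn and override to False.
assert resolve_run_compressed(True, False, False) is False
assert resolve_run_compressed(True, True, True) is False
```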
#### Impact
- **Fixes Regression**: Eliminates unexpected keys warnings for compressed tensors models, ensuring a seamless loading experience.
- **Better UX**: Replaces abrupt failures with warnings and automatic corrections for `run_compressed`, making the library more robust and user-friendly.
- **Improved Testing**: Avoids `pytest` name-collision issues during test execution.
#### Files Modified
- `src/transformers/modeling_utils.py`: Added `update_unexpected_keys` call to restore filtering.
- `src/transformers/quantizers/quantizer_compressed_tensors.py`: Updated `run_compressed` logic with warnings and overrides.
- `tests/quantization/compressed_tensors/*`: Renamed folder to `compressed_tensors_integration` (including `__init__.py`, `test_compressed_models.py`, and `test_compressed_tensors.py`).
---
### Local Testing
This test verifies that compressed and uncompressed models can be loaded using `AutoModelForCausalLM` with various `run_compressed` settings. It also surfaces any warnings, decompression events, or fallbacks.
<details>
<summary><strong>Loading Script</strong></summary>
```python
from transformers import AutoModelForCausalLM
from transformers.utils.quantization_config import CompressedTensorsConfig
import traceback
# List of model stubs to test
model_stubs = [
"nm-testing/llama2.c-stories42M-gsm8k-quantized-only-compressed",
"nm-testing/llama2.c-stories42M-gsm8k-quantized-only-uncompressed",
"nm-testing/llama2.c-stories42M-gsm8k-sparse-only-compressed",
"nm-testing/llama2.c-stories42M-gsm8k-sparse-only-uncompressed",
"nm-testing/llama2.c-stories42M-gsm8k-stacked-compressed",
"nm-testing/llama2.c-stories42M-gsm8k-stacked-uncompressed",
]
print("\n=== Starting Model Load Tests ===\n")
for stub in model_stubs:
print("=" * 40)
print(f"Testing model stub: {stub}")
# Infer model style (check 'uncompressed' before 'compressed')
style = "uncompressed" if "uncompressed" in stub else "compressed"
for run_compressed in [True, False]:
print(f"\n→ Attempting load with run_compressed={run_compressed} (model style: {style})")
try:
model = AutoModelForCausalLM.from_pretrained(
stub,
torch_dtype="auto",
device_map="auto",
quantization_config=CompressedTensorsConfig(run_compressed=run_compressed),
)
print(f"✓ Successfully loaded ({style}, run_compressed={run_compressed})")
except Exception:
print(f"✗ Failed to load ({style}, run_compressed={run_compressed})")
print("Traceback:")
traceback.print_exc()
print(f"\nFinished testing: {stub}")
print("=" * 40 + "\n")
print("=== ✅ All Tests Completed ===")
```
</details>
---
<details>
<summary><strong>Test Output</strong></summary>
```
=== Starting Model Load Tests ===
========================================
Testing model stub: nm-testing/llama2.c-stories42M-gsm8k-quantized-only-compressed
→ Attempting load with run_compressed=True (model style: compressed)
2025-04-02 02:31:01.841947: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-04-02 02:31:01.882386: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:9261] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2025-04-02 02:31:01.882424: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:607] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2025-04-02 02:31:01.883778: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1515] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2025-04-02 02:31:01.890147: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI AVX512_BF16 AVX_VNNI AMX_TILE AMX_INT8 AMX_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2025-04-02 02:31:02.655687: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
/home/rahul/upstream-transformers/src/transformers/quantizers/auto.py:212: UserWarning: You passed `quantization_config` or equivalent parameters to `from_pretrained` but the model you're loading already has a `quantization_config` attribute. The `quantization_config` from the model will be used.However, loading attributes (e.g. ['run_compressed']) will be overwritten with the one you passed to `from_pretrained`. The rest will be ignored.
warnings.warn(warning_msg)
✓ Successfully loaded (compressed, run_compressed=True)
→ Attempting load with run_compressed=False (model style: compressed)
Decompressing model: 56it [00:00, 996.06it/s]
✓ Successfully loaded (compressed, run_compressed=False)
Finished testing: nm-testing/llama2.c-stories42M-gsm8k-quantized-only-compressed
========================================
========================================
Testing model stub: nm-testing/llama2.c-stories42M-gsm8k-quantized-only-uncompressed
→ Attempting load with run_compressed=True (model style: uncompressed)
`run_compressed` is only supported for compressed models.Setting `run_compressed=False`
✓ Successfully loaded (uncompressed, run_compressed=True)
→ Attempting load with run_compressed=False (model style: uncompressed)
✓ Successfully loaded (uncompressed, run_compressed=False)
Finished testing: nm-testing/llama2.c-stories42M-gsm8k-quantized-only-uncompressed
========================================
========================================
Testing model stub: nm-testing/llama2.c-stories42M-gsm8k-sparse-only-compressed
→ Attempting load with run_compressed=True (model style: compressed)
`run_compressed` is only supported for quantized_compressed models and not for sparsified models. Setting `run_compressed=False`
Decompressing model: 75it [00:00, 631.24it/s]
✓ Successfully loaded (compressed, run_compressed=True)
→ Attempting load with run_compressed=False (model style: compressed)
Decompressing model: 75it [00:00, 618.48it/s]
✓ Successfully loaded (compressed, run_compressed=False)
Finished testing: nm-testing/llama2.c-stories42M-gsm8k-sparse-only-compressed
========================================
========================================
Testing model stub: nm-testing/llama2.c-stories42M-gsm8k-sparse-only-uncompressed
→ Attempting load with run_compressed=True (model style: uncompressed)
✓ Successfully loaded (uncompressed, run_compressed=True)
→ Attempting load with run_compressed=False (model style: uncompressed)
✓ Successfully loaded (uncompressed, run_compressed=False)
Finished testing: nm-testing/llama2.c-stories42M-gsm8k-sparse-only-uncompressed
========================================
========================================
Testing model stub: nm-testing/llama2.c-stories42M-gsm8k-stacked-compressed
→ Attempting load with run_compressed=True (model style: compressed)
`run_compressed` is only supported for quantized_compressed models and not for sparsified models. Setting `run_compressed=False`
Decompressing model: 131it [00:00, 278.01it/s]
Decompressing model: 56it [00:00, 5845.43it/s]
✓ Successfully loaded (compressed, run_compressed=True)
→ Attempting load with run_compressed=False (model style: compressed)
Decompressing model: 131it [00:00, 878.16it/s]
Decompressing model: 56it [00:00, 12393.47it/s]
✓ Successfully loaded (compressed, run_compressed=False)
Finished testing: nm-testing/llama2.c-stories42M-gsm8k-stacked-compressed
========================================
========================================
Testing model stub: nm-testing/llama2.c-stories42M-gsm8k-stacked-uncompressed
→ Attempting load with run_compressed=True (model style: uncompressed)
`run_compressed` is only supported for compressed models.Setting `run_compressed=False`
✓ Successfully loaded (uncompressed, run_compressed=True)
→ Attempting load with run_compressed=False (model style: uncompressed)
✓ Successfully loaded (uncompressed, run_compressed=False)
Finished testing: nm-testing/llama2.c-stories42M-gsm8k-stacked-uncompressed
========================================
=== ✅ All Tests Completed ===
```
</details>
### Cases when `run_compressed=True` is not supported and overridden to `False`
- Model is uncompressed
- Model is sparse (both sparse and sparse quantized)
| Model ID | Run Compressed | Override Applied |
|----------------------------------------------------|----------------|-------------------------------------|
| nm-testing/llama2.c-stories42M-gsm8k-quantized-only-compressed | True | None |
| nm-testing/llama2.c-stories42M-gsm8k-quantized-only-compressed | False | None |
| nm-testing/llama2.c-stories42M-gsm8k-quantized-only-uncompressed | True | `run_compressed` set to False |
| nm-testing/llama2.c-stories42M-gsm8k-quantized-only-uncompressed | False | None |
| nm-testing/llama2.c-stories42M-gsm8k-sparse-only-compressed | True | `run_compressed` set to False |
| nm-testing/llama2.c-stories42M-gsm8k-sparse-only-compressed | False | None |
| nm-testing/llama2.c-stories42M-gsm8k-sparse-only-uncompressed | True | `run_compressed` set to False |
| nm-testing/llama2.c-stories42M-gsm8k-sparse-only-uncompressed | False | None |
| nm-testing/llama2.c-stories42M-gsm8k-stacked-compressed | True | `run_compressed` set to False |
| nm-testing/llama2.c-stories42M-gsm8k-stacked-compressed | False | None |
| nm-testing/llama2.c-stories42M-gsm8k-stacked-uncompressed | True | `run_compressed` set to False |
| nm-testing/llama2.c-stories42M-gsm8k-stacked-uncompressed | False | None |
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37077/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37077/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37076
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37076/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37076/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37076/events
|
https://github.com/huggingface/transformers/pull/37076
| 2,956,252,308
|
PR_kwDOCUB6oc6QkIV3
| 37,076
|
Improvements in Gemma2 model card
|
{
"login": "devesh-2002",
"id": 79015420,
"node_id": "MDQ6VXNlcjc5MDE1NDIw",
"avatar_url": "https://avatars.githubusercontent.com/u/79015420?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/devesh-2002",
"html_url": "https://github.com/devesh-2002",
"followers_url": "https://api.github.com/users/devesh-2002/followers",
"following_url": "https://api.github.com/users/devesh-2002/following{/other_user}",
"gists_url": "https://api.github.com/users/devesh-2002/gists{/gist_id}",
"starred_url": "https://api.github.com/users/devesh-2002/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/devesh-2002/subscriptions",
"organizations_url": "https://api.github.com/users/devesh-2002/orgs",
"repos_url": "https://api.github.com/users/devesh-2002/repos",
"events_url": "https://api.github.com/users/devesh-2002/events{/privacy}",
"received_events_url": "https://api.github.com/users/devesh-2002/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T13:55:35
| 2025-04-08T02:47:27
| 2025-04-07T17:51:26
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37076",
"html_url": "https://github.com/huggingface/transformers/pull/37076",
"diff_url": "https://github.com/huggingface/transformers/pull/37076.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37076.patch",
"merged_at": "2025-04-07T17:51:26"
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes #36979.
This PR aims to improve model card for Gemma2 based on the given format mentioned [here](https://github.com/huggingface/transformers/issues/36979).
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@stevhliu, please let me know if any changes are needed here.
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37076/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37076/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37075
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37075/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37075/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37075/events
|
https://github.com/huggingface/transformers/pull/37075
| 2,956,109,240
|
PR_kwDOCUB6oc6QjoWT
| 37,075
|
Improve more loss computations
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T13:06:24
| 2025-03-28T15:05:57
| 2025-03-28T14:49:05
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37075",
"html_url": "https://github.com/huggingface/transformers/pull/37075",
"diff_url": "https://github.com/huggingface/transformers/pull/37075.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37075.patch",
"merged_at": null
}
|
Combine `contiguous` with `to`.
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37075/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37075/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37074
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37074/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37074/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37074/events
|
https://github.com/huggingface/transformers/issues/37074
| 2,956,097,374
|
I_kwDOCUB6oc6wMnde
| 37,074
|
A TypeError in modeling_utils.caching_allocator_warmup function
|
{
"login": "ZeroMakesAll",
"id": 93301968,
"node_id": "U_kgDOBY-s0A",
"avatar_url": "https://avatars.githubusercontent.com/u/93301968?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ZeroMakesAll",
"html_url": "https://github.com/ZeroMakesAll",
"followers_url": "https://api.github.com/users/ZeroMakesAll/followers",
"following_url": "https://api.github.com/users/ZeroMakesAll/following{/other_user}",
"gists_url": "https://api.github.com/users/ZeroMakesAll/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ZeroMakesAll/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ZeroMakesAll/subscriptions",
"organizations_url": "https://api.github.com/users/ZeroMakesAll/orgs",
"repos_url": "https://api.github.com/users/ZeroMakesAll/repos",
"events_url": "https://api.github.com/users/ZeroMakesAll/events{/privacy}",
"received_events_url": "https://api.github.com/users/ZeroMakesAll/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T13:01:19
| 2025-04-02T13:58:39
| 2025-04-02T13:58:39
|
NONE
| null | null | null | null |
### System Info
- `transformers` version: 4.50.2
- Platform: Linux-5.15.0-1040-nvidia-x86_64-with-glibc2.35
- Python version: 3.12.9
- Huggingface_hub version: 0.29.3
- Safetensors version: 0.5.3
- Accelerate version: 1.5.2
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: yes
- Using GPU in script?: yes
- GPU type: NVIDIA H800
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
1. Save bool values in the model parameters
2. Load the model with `device_map="auto"`
3. A TypeError occurs in `modeling_utils.caching_allocator_warmup` (line 5854), because one bool value is counted as 1/8 byte, so `byte_count` ends up as a float
### Expected behavior
Type-check (or cast to int) `byte_count` before allocating GPU memory
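A minimal pure-Python sketch of the failure mode described above (the helper name and bit widths are illustrative, not the exact transformers internals): per-dtype sizes are computed as bit-width divided by 8, so a 1-bit bool makes the running total a float, and passing that float as a size to `torch.empty` raises the TypeError.

```python
def dtype_byte_size(bit_width: int) -> float:
    # bit-width / 8: a 1-bit bool yields 0.125, turning the sum into a float
    return bit_width / 8

# e.g. one fp32 tensor of 16 elements (16 * 32 bits) plus one bool flag (1 bit)
param_bits = [16 * 32, 1]
byte_count = sum(dtype_byte_size(b) for b in param_bits)
print(byte_count)  # 64.125 -> float, not int; torch.empty(byte_count) would fail

# The fix suggested in this issue: coerce to int before allocating
safe_byte_count = int(byte_count)
print(safe_byte_count)  # 64
```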
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37074/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37074/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37073
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37073/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37073/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37073/events
|
https://github.com/huggingface/transformers/pull/37073
| 2,955,996,762
|
PR_kwDOCUB6oc6QjQZ7
| 37,073
|
[don't merge] check env
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T12:18:21
| 2025-03-28T15:34:48
| 2025-03-28T15:34:48
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37073",
"html_url": "https://github.com/huggingface/transformers/pull/37073",
"diff_url": "https://github.com/huggingface/transformers/pull/37073.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37073.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @muellerzr and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @muellerzr
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37073/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37073/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37072
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37072/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37072/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37072/events
|
https://github.com/huggingface/transformers/pull/37072
| 2,955,683,593
|
PR_kwDOCUB6oc6QiLc6
| 37,072
|
Reverse dependency map shouldn't be created when test_all is set
|
{
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-03-28T10:22:28
| 2025-03-28T10:22:43
| null |
MEMBER
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37072",
"html_url": "https://github.com/huggingface/transformers/pull/37072",
"diff_url": "https://github.com/huggingface/transformers/pull/37072.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37072.patch",
"merged_at": null
}
| null | null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37072/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37072/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37071
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37071/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37071/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37071/events
|
https://github.com/huggingface/transformers/pull/37071
| 2,955,653,834
|
PR_kwDOCUB6oc6QiExo
| 37,071
|
Add Fast Conditional-DETR Processor
|
{
"login": "keetrap",
"id": 103131112,
"node_id": "U_kgDOBiWn6A",
"avatar_url": "https://avatars.githubusercontent.com/u/103131112?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/keetrap",
"html_url": "https://github.com/keetrap",
"followers_url": "https://api.github.com/users/keetrap/followers",
"following_url": "https://api.github.com/users/keetrap/following{/other_user}",
"gists_url": "https://api.github.com/users/keetrap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/keetrap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/keetrap/subscriptions",
"organizations_url": "https://api.github.com/users/keetrap/orgs",
"repos_url": "https://api.github.com/users/keetrap/repos",
"events_url": "https://api.github.com/users/keetrap/events{/privacy}",
"received_events_url": "https://api.github.com/users/keetrap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5769473378,
"node_id": "LA_kwDOCUB6oc8AAAABV-MtYg",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Vision",
"name": "Vision",
"color": "C079EF",
"default": false,
"description": ""
},
{
"id": 7570656740,
"node_id": "LA_kwDOCUB6oc8AAAABwz8N5A",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Processing",
"name": "Processing",
"color": "1E17DF",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T10:12:26
| 2025-04-15T16:33:35
| 2025-04-15T16:33:34
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37071",
"html_url": "https://github.com/huggingface/transformers/pull/37071",
"diff_url": "https://github.com/huggingface/transformers/pull/37071.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37071.patch",
"merged_at": "2025-04-15T16:33:34"
}
|
Related #36978
cc @yonigozlan
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37071/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37071/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37070
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37070/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37070/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37070/events
|
https://github.com/huggingface/transformers/pull/37070
| 2,955,613,556
|
PR_kwDOCUB6oc6Qh7tr
| 37,070
|
Detect and fix most `_init_weights()` issues - make it work for composite models
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T09:59:27
| 2025-04-28T12:57:58
| 2025-04-14T14:19:04
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37070",
"html_url": "https://github.com/huggingface/transformers/pull/37070",
"diff_url": "https://github.com/huggingface/transformers/pull/37070.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37070.patch",
"merged_at": "2025-04-14T14:19:04"
}
|
# What does this PR do?
This is a follow-up of https://github.com/huggingface/transformers/pull/36963.
This PR makes `_init_weights` work seamlessly with composite models. Until this point, composite models would only use the `_init_weights` of the outer-most `PreTrainedModel` wrapper, leading to errors or skipped modules. Now, sub-models are correctly initialized according to their own `_init_weights`, without any overhead. This is increasingly important as most recent models are now multimodal.
Without this change, every composite model would have to recurse a second time over all sub-models explicitly in the outer-most `_init_weights`, which is extremely error-prone and inefficient. For example, we would need one of the following in the outer-most `_init_weights`:
```python
# FIRST BAD OPTION
def _init_weights(self, module):
std = self.config.initializer_range
# for each module in the model, check the whole module list of the submodel (very inefficient)
if module in self.vision_tower.modules():
self.vision_tower._init_weights(module)
# similar for the other sub-model
elif module in self.language_model.modules():
self.language_model._init_weights(module)
# usual init block for only the modules external to the sub-models
elif isinstance(module, nn.Linear):
...
# OR EQUALLY INEFFICIENT
def _init_weights(self, module):
std = self.config.initializer_range
# Here, as `apply` is depth-first graph traversal, every module will be initialized a first time, then re-initialized
# a second time (extremely inefficient as well)
if module is self.vision_tower:
self.vision_tower.apply(self.vision_tower._init_weights)
# similar for the other sub-model
elif module is self.language_model:
self.language_model.apply(self.language_model._init_weights)
# usual init block for only the modules external to the sub-models
elif isinstance(module, nn.Linear):
...
```
This PR lets us simply do
```python
def _init_weights(self, module):
std = self.config.initializer_range
# usual init block for only the modules external to the sub-models
if isinstance(module, nn.Linear):
...
```
and have all submodels correctly initialized automatically.
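A compact, runnable sketch of the dispatch behavior described above, using plain-Python stand-ins for `nn.Module` / `PreTrainedModel` (these class names and the traversal helper are illustrative, not the actual transformers implementation): each module is visited once and initialized by the `_init_weights` of its nearest enclosing pretrained sub-model.

```python
class Module:
    """Minimal stand-in for nn.Module: tracks children and supports traversal."""
    def __init__(self):
        object.__setattr__(self, "_children", {})

    def __setattr__(self, name, value):
        if isinstance(value, Module):
            self._children[name] = value
        object.__setattr__(self, name, value)

    def named_modules(self, prefix=""):
        yield prefix, self
        for name, child in self._children.items():
            sub = f"{prefix}.{name}" if prefix else name
            yield from child.named_modules(sub)

class PreTrained(Module):
    """Stand-in for PreTrainedModel: owns an _init_weights for its modules."""
    std = 0.02

    def _init_weights(self, module):
        module.init_std = self.std  # record which _init_weights actually ran

class Vision(PreTrained):
    std = 0.01
    def __init__(self):
        super().__init__()
        self.proj = Module()

class Composite(PreTrained):
    def __init__(self):
        super().__init__()
        self.vision_tower = Vision()
        self.head = Module()

def init_recursively(model):
    # Depth-first walk: every module is initialized exactly once, by the
    # _init_weights of the deepest PreTrained ancestor on its path.
    for name, module in model.named_modules():
        current = owner = model
        for part in (name.split(".")[:-1] if name else []):
            current = current._children[part]
            if isinstance(current, PreTrained):
                owner = current
        owner._init_weights(module)

m = Composite()
init_recursively(m)
print(m.head.init_std)               # 0.02: initialized by the outer model
print(m.vision_tower.proj.init_std)  # 0.01: initialized by the vision submodel
```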
Also, enforce `torch.no_grad()` for initialization, which was not the case before and would slow down the process.
Finally, fix the `_init_weights` of a LOT of models, the most important ones (the most recent ones, and the ones with the flag `_supports_cache_class=True`) for now.
The reason not to do them all is simply that there are too many to fix. Almost all models in the library have a broken `_init_weights` 🙃
We'll patch incrementally. **In the meantime, the added test will enforce that new models are correct.**
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37070/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37070/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37069
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37069/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37069/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37069/events
|
https://github.com/huggingface/transformers/pull/37069
| 2,955,562,882
|
PR_kwDOCUB6oc6QhwJF
| 37,069
|
🌐 [i18n-KO] Translated `roberta.md` to Korean
|
{
"login": "garongkim",
"id": 97512668,
"node_id": "U_kgDOBc_s3A",
"avatar_url": "https://avatars.githubusercontent.com/u/97512668?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/garongkim",
"html_url": "https://github.com/garongkim",
"followers_url": "https://api.github.com/users/garongkim/followers",
"following_url": "https://api.github.com/users/garongkim/following{/other_user}",
"gists_url": "https://api.github.com/users/garongkim/gists{/gist_id}",
"starred_url": "https://api.github.com/users/garongkim/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/garongkim/subscriptions",
"organizations_url": "https://api.github.com/users/garongkim/orgs",
"repos_url": "https://api.github.com/users/garongkim/repos",
"events_url": "https://api.github.com/users/garongkim/events{/privacy}",
"received_events_url": "https://api.github.com/users/garongkim/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T09:44:32
| 2025-04-24T17:00:25
| 2025-04-24T17:00:25
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37069",
"html_url": "https://github.com/huggingface/transformers/pull/37069",
"diff_url": "https://github.com/huggingface/transformers/pull/37069.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37069.patch",
"merged_at": "2025-04-24T17:00:25"
}
|
<!-- Please title the PR "🌐 [i18n-KO] Translated `<your_file>.md` to Korean" -->
# What does this PR do?
Translated the `roberta.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [ ] Check for missing / redundant translations
- [ ] Grammar check
- [ ] Review or add new terms to the glossary
- [ ] Check inline TOC (e.g. `[[lowercased-header]]`)
- [ ] Check the live preview for gotchas
## Who can review? (Initial)
<!-- 1. Only after all the checks above are complete, uncomment the line below to request a review from your team members! -->
May you please review this PR?
<!-- @cjfghk5697, @yijun-lee, @rlaalsrl0922 , @MinJu-Ha -->
<!-- @cjfghk5697, @yijun-lee, @devxaitist, @nsbg -->
<!-- @cjfghk5697, @yijun-lee, @Kim-Ju-won @junhkang @olccihyeon -->
<!-- @cjfghk5697, @yijun-lee, @garongkim @maximizemaxwell -->
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. After the review with your team members is done, uncomment the line below! -->
<!-- @stevhliu May you please review this PR? -->
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37069/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37069/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37068
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37068/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37068/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37068/events
|
https://github.com/huggingface/transformers/pull/37068
| 2,955,560,524
|
PR_kwDOCUB6oc6QhvoC
| 37,068
|
[blip-2] Fix dtype mismatch when keep in fp32
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T09:43:37
| 2025-03-28T14:53:12
| 2025-03-28T14:52:11
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37068",
"html_url": "https://github.com/huggingface/transformers/pull/37068",
"diff_url": "https://github.com/huggingface/transformers/pull/37068.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37068.patch",
"merged_at": "2025-03-28T14:52:11"
}
|
# What does this PR do?
It seems that BLIP-2's `keep_in_fp32_modules` was not being picked up correctly until [36722](https://github.com/huggingface/transformers/pull/36722) (hence no errors until today). Now that the `query_tokens` are actually kept in fp32, we get a dtype mismatch error at inference time. (Reported by @hmellor.)
This PR fixes it by casting inputs to the correct dtypes and keeping the `qformer` in `fp32` as well. The fix was tested with the slow BLIP-2 tests and the vLLM tests.
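The mismatch pattern and the casting fix can be sketched as follows (hypothetical module names and dtypes, not the actual BLIP-2 code): one submodule is kept in fp32 while the rest of the model runs in lower precision, so activations must be cast to each submodule's parameter dtype before the call.

```python
import torch
import torch.nn as nn

# Illustrative sketch only: a submodule pinned to fp32 (as with
# keep_in_fp32_modules) next to the rest of the model in bfloat16.
qformer = nn.Linear(8, 8).float()           # kept in fp32
language_proj = nn.Linear(8, 8).bfloat16()  # rest of the model in low precision

hidden = torch.randn(2, 8, dtype=torch.bfloat16)

# The fix: cast activations to each submodule's parameter dtype before
# calling it, instead of letting a bfloat16 tensor hit fp32 weights
# (which raises a dtype mismatch error).
out = qformer(hidden.to(qformer.weight.dtype))
out = language_proj(out.to(language_proj.weight.dtype))
```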
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37068/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37068/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37067
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37067/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37067/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37067/events
|
https://github.com/huggingface/transformers/pull/37067
| 2,955,540,230
|
PR_kwDOCUB6oc6QhrKU
| 37,067
|
Fix 4090/ada not detected as having FP8 support
|
{
"login": "Qubitium",
"id": 417764,
"node_id": "MDQ6VXNlcjQxNzc2NA==",
"avatar_url": "https://avatars.githubusercontent.com/u/417764?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Qubitium",
"html_url": "https://github.com/Qubitium",
"followers_url": "https://api.github.com/users/Qubitium/followers",
"following_url": "https://api.github.com/users/Qubitium/following{/other_user}",
"gists_url": "https://api.github.com/users/Qubitium/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Qubitium/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Qubitium/subscriptions",
"organizations_url": "https://api.github.com/users/Qubitium/orgs",
"repos_url": "https://api.github.com/users/Qubitium/repos",
"events_url": "https://api.github.com/users/Qubitium/events{/privacy}",
"received_events_url": "https://api.github.com/users/Qubitium/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T09:34:49
| 2025-03-31T08:53:56
| 2025-03-31T08:53:48
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37067",
"html_url": "https://github.com/huggingface/transformers/pull/37067",
"diff_url": "https://github.com/huggingface/transformers/pull/37067.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37067.patch",
"merged_at": "2025-03-31T08:53:48"
}
|
# What does this PR do?
Fix the 4090/Ada not being detected as having valid FP8 hardware support.
Did a simple layer/module inference test with DeepSeek V3 0324 using GPTQModel: there is no issue with Transformers inference on the 4090 at the layer/module level. The 4090 does not have enough memory to run a full all-layer generation.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@ArthurZucker @SunMarc @MekkCyber
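The detection issue can be sketched as follows (hypothetical check, not the PR's actual code — the exact guard is in the diff): Ada (RTX 4090) reports CUDA compute capability 8.9 and has FP8 hardware, but a guard like `major >= 9` only matches Hopper and excludes it; comparing the full `(major, minor)` tuple includes it.

```python
# Hypothetical sketch of the capability check: compare the full
# (major, minor) tuple so Ada (8.9) counts as FP8-capable alongside
# Hopper (9.0), while older architectures are excluded.
def supports_fp8(major: int, minor: int) -> bool:
    return (major, minor) >= (8, 9)
```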
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37067/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37067/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37066
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37066/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37066/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37066/events
|
https://github.com/huggingface/transformers/pull/37066
| 2,955,483,472
|
PR_kwDOCUB6oc6Qheab
| 37,066
|
refactor(audio_processing): replace pipe with temp files for FFmpeg processing
|
{
"login": "joeyhacker",
"id": 2774637,
"node_id": "MDQ6VXNlcjI3NzQ2Mzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/2774637?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joeyhacker",
"html_url": "https://github.com/joeyhacker",
"followers_url": "https://api.github.com/users/joeyhacker/followers",
"following_url": "https://api.github.com/users/joeyhacker/following{/other_user}",
"gists_url": "https://api.github.com/users/joeyhacker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joeyhacker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joeyhacker/subscriptions",
"organizations_url": "https://api.github.com/users/joeyhacker/orgs",
"repos_url": "https://api.github.com/users/joeyhacker/repos",
"events_url": "https://api.github.com/users/joeyhacker/events{/privacy}",
"received_events_url": "https://api.github.com/users/joeyhacker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T09:15:28
| 2025-03-28T09:23:42
| 2025-03-28T09:22:31
|
NONE
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37066",
"html_url": "https://github.com/huggingface/transformers/pull/37066",
"diff_url": "https://github.com/huggingface/transformers/pull/37066.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37066.patch",
"merged_at": null
}
|
Key reasons for this change:
1. MP4 container format requires random access to metadata (moov atoms)
which is often located at the end of the file. Pipe streaming makes
this access pattern impossible.
2. FFmpeg's format detection works more reliably with physical files
compared to streamed input via pipes.
3. Temporary files provide better error diagnostics since the input can
be preserved for debugging when failures occur.
4. Some FFmpeg codecs and filters behave differently with pipe input
versus file input due to buffering differences.
The new implementation:
- Creates properly named temporary files with correct extensions
- Uses atomic write operations with flush()
- Implements comprehensive cleanup in finally blocks
- Provides better error messages when failures occur
This fixes issues with partial file errors ("offset 0x3f9: partial file")
that occurred during demuxing of m4a files in the pipe-based approach.
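The approach described above can be sketched like this (hypothetical helper name and ffmpeg flags, not the PR's code): write the bytes to a named temporary file with the correct extension so FFmpeg can seek to the moov atom, decode, and clean up in a `finally` block.

```python
import os
import subprocess
import tempfile

# Illustrative sketch of the temp-file approach: the file is flushed and
# closed before ffmpeg opens it, and removed even if decoding fails.
def decode_with_tempfile(audio_bytes: bytes, suffix: str = ".m4a") -> bytes:
    tmp = tempfile.NamedTemporaryFile(suffix=suffix, delete=False)
    try:
        tmp.write(audio_bytes)
        tmp.flush()  # make sure all bytes hit disk before ffmpeg reads the file
        tmp.close()
        result = subprocess.run(
            ["ffmpeg", "-i", tmp.name, "-ac", "1", "-ar", "16000", "-f", "s16le", "-"],
            capture_output=True,
            check=True,
        )
        return result.stdout  # raw 16 kHz mono s16le PCM
    finally:
        os.unlink(tmp.name)  # comprehensive cleanup in the finally block
```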
|
{
"login": "joeyhacker",
"id": 2774637,
"node_id": "MDQ6VXNlcjI3NzQ2Mzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/2774637?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joeyhacker",
"html_url": "https://github.com/joeyhacker",
"followers_url": "https://api.github.com/users/joeyhacker/followers",
"following_url": "https://api.github.com/users/joeyhacker/following{/other_user}",
"gists_url": "https://api.github.com/users/joeyhacker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joeyhacker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joeyhacker/subscriptions",
"organizations_url": "https://api.github.com/users/joeyhacker/orgs",
"repos_url": "https://api.github.com/users/joeyhacker/repos",
"events_url": "https://api.github.com/users/joeyhacker/events{/privacy}",
"received_events_url": "https://api.github.com/users/joeyhacker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37066/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37066/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37065
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37065/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37065/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37065/events
|
https://github.com/huggingface/transformers/pull/37065
| 2,955,199,678
|
PR_kwDOCUB6oc6QggPA
| 37,065
|
Update model card for Depth Anything
|
{
"login": "shubham0204",
"id": 41076823,
"node_id": "MDQ6VXNlcjQxMDc2ODIz",
"avatar_url": "https://avatars.githubusercontent.com/u/41076823?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/shubham0204",
"html_url": "https://github.com/shubham0204",
"followers_url": "https://api.github.com/users/shubham0204/followers",
"following_url": "https://api.github.com/users/shubham0204/following{/other_user}",
"gists_url": "https://api.github.com/users/shubham0204/gists{/gist_id}",
"starred_url": "https://api.github.com/users/shubham0204/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shubham0204/subscriptions",
"organizations_url": "https://api.github.com/users/shubham0204/orgs",
"repos_url": "https://api.github.com/users/shubham0204/repos",
"events_url": "https://api.github.com/users/shubham0204/events{/privacy}",
"received_events_url": "https://api.github.com/users/shubham0204/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T06:54:31
| 2025-04-04T18:36:06
| 2025-04-04T18:36:05
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37065",
"html_url": "https://github.com/huggingface/transformers/pull/37065",
"diff_url": "https://github.com/huggingface/transformers/pull/37065.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37065.patch",
"merged_at": "2025-04-04T18:36:05"
}
|
# What does this PR do?
This PR updates the model card for the `depth_anything` model, as described in #36979, as part of the effort to standardize all model cards.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@stevhliu
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37065/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37065/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37064
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37064/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37064/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37064/events
|
https://github.com/huggingface/transformers/issues/37064
| 2,955,041,646
|
I_kwDOCUB6oc6wIltu
| 37,064
|
a logic error in _preprocess function of Qwen2VLImageProcessor Class
|
{
"login": "InsaneGe",
"id": 75520204,
"node_id": "MDQ6VXNlcjc1NTIwMjA0",
"avatar_url": "https://avatars.githubusercontent.com/u/75520204?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/InsaneGe",
"html_url": "https://github.com/InsaneGe",
"followers_url": "https://api.github.com/users/InsaneGe/followers",
"following_url": "https://api.github.com/users/InsaneGe/following{/other_user}",
"gists_url": "https://api.github.com/users/InsaneGe/gists{/gist_id}",
"starred_url": "https://api.github.com/users/InsaneGe/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/InsaneGe/subscriptions",
"organizations_url": "https://api.github.com/users/InsaneGe/orgs",
"repos_url": "https://api.github.com/users/InsaneGe/repos",
"events_url": "https://api.github.com/users/InsaneGe/events{/privacy}",
"received_events_url": "https://api.github.com/users/InsaneGe/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T05:09:01
| 2025-05-14T11:04:35
| 2025-05-11T08:03:25
|
NONE
| null | null | null | null |
### System Info
In the `_preprocess` function of the `Qwen2VLImageProcessor` class (https://github.com/huggingface/transformers/blob/348f3285c5114159d2ff4933b4b8ae36866d01a7/src/transformers/models/qwen2_vl/image_processing_qwen2_vl.py#L278), the code reads as follows:
```python
if patches.shape[0] % temporal_patch_size != 0:
repeats = np.repeat(patches[-1][np.newaxis], temporal_patch_size - 1, axis=0)
patches = np.concatenate([patches, repeats], axis=0)
grid_t = patches.shape[0] // temporal_patch_size
```
It should repeat `temporal_patch_size - (patches.shape[0] % temporal_patch_size)` times instead of `temporal_patch_size - 1`, to make sure `patches.shape[0]` is divisible by `temporal_patch_size`:
```python
if patches.shape[0] % temporal_patch_size != 0:
    repeats = np.repeat(patches[-1][np.newaxis], temporal_patch_size - (patches.shape[0] % temporal_patch_size), axis=0)
    patches = np.concatenate([patches, repeats], axis=0)
```
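A quick numpy check with illustrative shapes shows the difference. (With `temporal_patch_size = 2`, Qwen2-VL's default, the only possible remainder is 1, so `temporal_patch_size - 1` happens to equal the needed padding and the bug does not surface; with larger values the two expressions diverge.)

```python
import numpy as np

# Illustrative shapes only: 6 frames with temporal_patch_size = 4
# leaves a remainder of 2.
temporal_patch_size = 4
patches = np.zeros((6, 3, 14, 14))

# Current code: always repeats temporal_patch_size - 1 frames.
buggy = np.concatenate(
    [patches, np.repeat(patches[-1][np.newaxis], temporal_patch_size - 1, axis=0)],
    axis=0,
)

# Suggested fix: pad exactly to the next multiple of temporal_patch_size.
pad = temporal_patch_size - (patches.shape[0] % temporal_patch_size)
fixed = np.concatenate(
    [patches, np.repeat(patches[-1][np.newaxis], pad, axis=0)], axis=0
)

print(buggy.shape[0] % temporal_patch_size)  # 1 -> still not divisible
print(fixed.shape[0] % temporal_patch_size)  # 0 -> divisible
```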
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
nothing
### Expected behavior
nothing
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37064/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37064/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37063
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37063/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37063/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37063/events
|
https://github.com/huggingface/transformers/pull/37063
| 2,954,991,666
|
PR_kwDOCUB6oc6Qf0OK
| 37,063
|
Update model card for electra
|
{
"login": "Wu-n0",
"id": 86141988,
"node_id": "MDQ6VXNlcjg2MTQxOTg4",
"avatar_url": "https://avatars.githubusercontent.com/u/86141988?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Wu-n0",
"html_url": "https://github.com/Wu-n0",
"followers_url": "https://api.github.com/users/Wu-n0/followers",
"following_url": "https://api.github.com/users/Wu-n0/following{/other_user}",
"gists_url": "https://api.github.com/users/Wu-n0/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Wu-n0/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Wu-n0/subscriptions",
"organizations_url": "https://api.github.com/users/Wu-n0/orgs",
"repos_url": "https://api.github.com/users/Wu-n0/repos",
"events_url": "https://api.github.com/users/Wu-n0/events{/privacy}",
"received_events_url": "https://api.github.com/users/Wu-n0/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T04:34:41
| 2025-04-03T20:59:09
| 2025-04-03T17:45:35
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37063",
"html_url": "https://github.com/huggingface/transformers/pull/37063",
"diff_url": "https://github.com/huggingface/transformers/pull/37063.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37063.patch",
"merged_at": "2025-04-03T17:45:35"
}
|
# What does this PR do?
This PR updates the ELECTRA model card to follow the standardized format outlined in issue #36979. The updated model card includes:
- A conversational description of how ELECTRA works and what makes it unique
- Code examples for using ELECTRA with both Pipeline and AutoModel
- A quantization example for the large model
- Detailed usage notes with practical tips
Note: I did not include a transformers-cli example because ELECTRA is a discriminative model for classification tasks rather than text generation, making it less suitable for CLI interaction.
I also couldn't include an AttentionMaskVisualizer example because it's not currently supported with ELECTRA. When attempting to implement it locally, I encountered this error:
`AttributeError: '_ModelWrapper' object has no attribute '_update_causal_mask'`
## Before submitting
- [X] This PR fixes a typo or improves the docs
## Who can review?
@stevhliu
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37063/reactions",
"total_count": 2,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37063/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37062
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37062/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37062/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37062/events
|
https://github.com/huggingface/transformers/pull/37062
| 2,954,957,404
|
PR_kwDOCUB6oc6Qfs6N
| 37,062
|
Add weights_only=True to torch.load
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T04:05:49
| 2025-04-11T16:48:12
| 2025-04-11T16:18:42
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37062",
"html_url": "https://github.com/huggingface/transformers/pull/37062",
"diff_url": "https://github.com/huggingface/transformers/pull/37062.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37062.patch",
"merged_at": "2025-04-11T16:18:42"
}
|
# What does this PR do?
Use `weights_only=True` in all `torch.load` calls.
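As context for why this matters, here is a minimal standard-library sketch (illustrative only, not transformers code) of the deserialization risk that `weights_only=True` closes off: an unrestricted pickle payload can execute arbitrary callables at load time, whereas `torch.load(..., weights_only=True)` restricts the unpickler to plain tensor data.

```python
import pickle

class Payload:
    """Illustrative only: an object whose *unpickling* runs code."""
    def __reduce__(self):
        # pickle will call eval("40 + 2") when the blob is loaded --
        # the same mechanism lets a malicious checkpoint run anything.
        return (eval, ("40 + 2",))

blob = pickle.dumps(Payload())
result = pickle.loads(blob)  # "loading the checkpoint" executes the call
print(result)                # 42
```

A weights-only unpickler rejects such callables outright, which is why defaulting to it is safe for checkpoints that should contain only tensors.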
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37062/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37062/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37061
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37061/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37061/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37061/events
|
https://github.com/huggingface/transformers/pull/37061
| 2,954,923,886
|
PR_kwDOCUB6oc6Qfly_
| 37,061
|
Skip inexistent tokenizer in tests
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T03:33:26
| 2025-03-28T10:41:21
| 2025-03-28T10:41:11
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37061",
"html_url": "https://github.com/huggingface/transformers/pull/37061",
"diff_url": "https://github.com/huggingface/transformers/pull/37061.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37061.patch",
"merged_at": null
}
|
Skip nonexistent tokenizers in tests.
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37061/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37061/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37060
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37060/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37060/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37060/events
|
https://github.com/huggingface/transformers/pull/37060
| 2,954,843,244
|
PR_kwDOCUB6oc6QfU5K
| 37,060
|
Fix more inefficient PT operations
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T02:25:00
| 2025-04-03T04:14:32
| 2025-03-31T15:31:25
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37060",
"html_url": "https://github.com/huggingface/transformers/pull/37060",
"diff_url": "https://github.com/huggingface/transformers/pull/37060.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37060.patch",
"merged_at": "2025-03-31T15:31:25"
}
|
Remove ``clone().detach()`` and unnecessary ``cpu()`` calls.
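A hypothetical illustration of the kind of redundancy removed (the variable names below are made up for the example; the real diff touches many call sites):

```python
import torch

x = torch.arange(4.0, requires_grad=True)

# Before: an extra full copy, plus a device transfer that is a no-op
# when the tensor already lives on the CPU.
slow = x.clone().detach().cpu()

# After: detach() alone yields a gradient-free view of the same data --
# no copy, and no redundant .cpu() call.
fast = x.detach()

print(torch.equal(slow, fast), fast.requires_grad)  # True False
```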
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37060/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37060/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37059
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37059/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37059/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37059/events
|
https://github.com/huggingface/transformers/pull/37059
| 2,954,819,824
|
PR_kwDOCUB6oc6QfP4X
| 37,059
|
Remove deprecated code
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T02:05:17
| 2025-03-31T09:17:04
| 2025-03-31T09:15:35
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37059",
"html_url": "https://github.com/huggingface/transformers/pull/37059",
"diff_url": "https://github.com/huggingface/transformers/pull/37059.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37059.patch",
"merged_at": "2025-03-31T09:15:35"
}
|
Remove code that was deprecated before the current version.
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37059/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37059/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37058
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37058/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37058/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37058/events
|
https://github.com/huggingface/transformers/pull/37058
| 2,954,812,138
|
PR_kwDOCUB6oc6QfOT3
| 37,058
|
Update audio_utils.py
|
{
"login": "joeyhacker",
"id": 2774637,
"node_id": "MDQ6VXNlcjI3NzQ2Mzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/2774637?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joeyhacker",
"html_url": "https://github.com/joeyhacker",
"followers_url": "https://api.github.com/users/joeyhacker/followers",
"following_url": "https://api.github.com/users/joeyhacker/following{/other_user}",
"gists_url": "https://api.github.com/users/joeyhacker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joeyhacker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joeyhacker/subscriptions",
"organizations_url": "https://api.github.com/users/joeyhacker/orgs",
"repos_url": "https://api.github.com/users/joeyhacker/repos",
"events_url": "https://api.github.com/users/joeyhacker/events{/privacy}",
"received_events_url": "https://api.github.com/users/joeyhacker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T01:59:01
| 2025-03-28T09:14:03
| 2025-03-28T09:14:03
|
NONE
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37058",
"html_url": "https://github.com/huggingface/transformers/pull/37058",
"diff_url": "https://github.com/huggingface/transformers/pull/37058.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37058.patch",
"merged_at": null
}
|
refactor(audio_processing): replace pipe with temp files for FFmpeg processing
This change replaces the previous pipe-based FFmpeg audio decoding approach
with temporary file operations to improve reliability, especially for
container formats like MP4/m4a.
Key reasons for this change:
1. The MP4 container format requires random access to metadata (moov atoms), which are often located at the end of the file. Pipe streaming makes this access pattern impossible.
2. FFmpeg's format detection works more reliably with physical files compared to streamed input via pipes.
3. Temporary files provide better error diagnostics since the input can be preserved for debugging when failures occur.
4. Some FFmpeg codecs and filters behave differently with pipe input versus file input due to buffering differences.
The new implementation:
- Creates properly named temporary files with correct extensions
- Uses atomic write operations with flush()
- Implements comprehensive cleanup in finally blocks
- Provides better error messages when failures occur
This fixes issues with partial file errors ("offset 0x3f9: partial file")
that occurred during demuxing of m4a files in the pipe-based approach.
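The steps described above can be sketched as follows (a minimal standard-library sketch with hypothetical names, not the actual patch; the real code would invoke FFmpeg and parse its output):

```python
import os
import tempfile

def decode_via_tempfile(audio_bytes: bytes, suffix: str = ".m4a") -> list:
    """Sketch of the temp-file approach (hypothetical helper, not the real API)."""
    # 1. A properly named temporary file with the correct extension, so
    #    FFmpeg's format detection and seeking (e.g. to a trailing moov
    #    atom) behave as they would on a real file.
    tmp = tempfile.NamedTemporaryFile(suffix=suffix, delete=False)
    try:
        # 2. Write and flush before handing the path to FFmpeg, so the
        #    child process sees the complete bytes.
        tmp.write(audio_bytes)
        tmp.flush()
        tmp.close()
        cmd = ["ffmpeg", "-i", tmp.name, "-f", "f32le", "-ac", "1", "pipe:1"]
        # The real code would run `cmd` here (e.g. via subprocess.run) and
        # read the decoded samples; on failure, tmp.name can be inspected.
        return cmd
    finally:
        # 3. Comprehensive cleanup, even when decoding fails.
        if os.path.exists(tmp.name):
            os.unlink(tmp.name)

cmd = decode_via_tempfile(b"\x00" * 16)
print(cmd[0], cmd[2].endswith(".m4a"))  # ffmpeg True
```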
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @muellerzr and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @muellerzr
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "joeyhacker",
"id": 2774637,
"node_id": "MDQ6VXNlcjI3NzQ2Mzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/2774637?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joeyhacker",
"html_url": "https://github.com/joeyhacker",
"followers_url": "https://api.github.com/users/joeyhacker/followers",
"following_url": "https://api.github.com/users/joeyhacker/following{/other_user}",
"gists_url": "https://api.github.com/users/joeyhacker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joeyhacker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joeyhacker/subscriptions",
"organizations_url": "https://api.github.com/users/joeyhacker/orgs",
"repos_url": "https://api.github.com/users/joeyhacker/repos",
"events_url": "https://api.github.com/users/joeyhacker/events{/privacy}",
"received_events_url": "https://api.github.com/users/joeyhacker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37058/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37058/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37057
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37057/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37057/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37057/events
|
https://github.com/huggingface/transformers/pull/37057
| 2,954,759,199
|
PR_kwDOCUB6oc6QfDem
| 37,057
|
fixed typo.
|
{
"login": "zhanluxianshen",
"id": 161462588,
"node_id": "U_kgDOCZ-5PA",
"avatar_url": "https://avatars.githubusercontent.com/u/161462588?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zhanluxianshen",
"html_url": "https://github.com/zhanluxianshen",
"followers_url": "https://api.github.com/users/zhanluxianshen/followers",
"following_url": "https://api.github.com/users/zhanluxianshen/following{/other_user}",
"gists_url": "https://api.github.com/users/zhanluxianshen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zhanluxianshen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhanluxianshen/subscriptions",
"organizations_url": "https://api.github.com/users/zhanluxianshen/orgs",
"repos_url": "https://api.github.com/users/zhanluxianshen/repos",
"events_url": "https://api.github.com/users/zhanluxianshen/events{/privacy}",
"received_events_url": "https://api.github.com/users/zhanluxianshen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T01:04:29
| 2025-03-29T00:39:18
| 2025-03-28T17:12:14
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37057",
"html_url": "https://github.com/huggingface/transformers/pull/37057",
"diff_url": "https://github.com/huggingface/transformers/pull/37057.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37057.patch",
"merged_at": "2025-03-28T17:12:14"
}
|
# What does this PR do?
Follow-up to PR: [d6b3c74](https://github.com/huggingface/transformers/commit/d6b3c7486b441296366f788fde57109337f63bca)
Fixes # (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37057/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37057/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37056
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37056/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37056/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37056/events
|
https://github.com/huggingface/transformers/pull/37056
| 2,954,709,094
|
PR_kwDOCUB6oc6Qe47W
| 37,056
|
Update model card for Cohere
|
{
"login": "bimal-gajera",
"id": 90305421,
"node_id": "MDQ6VXNlcjkwMzA1NDIx",
"avatar_url": "https://avatars.githubusercontent.com/u/90305421?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bimal-gajera",
"html_url": "https://github.com/bimal-gajera",
"followers_url": "https://api.github.com/users/bimal-gajera/followers",
"following_url": "https://api.github.com/users/bimal-gajera/following{/other_user}",
"gists_url": "https://api.github.com/users/bimal-gajera/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bimal-gajera/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bimal-gajera/subscriptions",
"organizations_url": "https://api.github.com/users/bimal-gajera/orgs",
"repos_url": "https://api.github.com/users/bimal-gajera/repos",
"events_url": "https://api.github.com/users/bimal-gajera/events{/privacy}",
"received_events_url": "https://api.github.com/users/bimal-gajera/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-28T00:15:05
| 2025-04-04T17:22:38
| 2025-04-03T16:51:41
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37056",
"html_url": "https://github.com/huggingface/transformers/pull/37056",
"diff_url": "https://github.com/huggingface/transformers/pull/37056.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37056.patch",
"merged_at": "2025-04-03T16:51:41"
}
|
# What does this PR do?
This PR updates the model card for Cohere and follows the template outlined in the issue. Please let me know if any changes are required.
As part of this update, I also replaced the outdated blog link [`https://txt.cohere.com/command-r/`](https://txt.cohere.com/command-r/) with the updated official blog link [`https://cohere.com/blog/command-r`](https://cohere.com/blog/command-r).
This PR addresses part of issue #36979 by updating the `cohere.md` model card.
## Before submitting
- [x] This PR improves the docs.
## Who can review?
@stevhliu
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37056/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37056/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37055
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37055/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37055/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37055/events
|
https://github.com/huggingface/transformers/pull/37055
| 2,954,428,981
|
PR_kwDOCUB6oc6Qd7VS
| 37,055
|
Add EfficientNet Image PreProcessor
|
{
"login": "zshn25",
"id": 5270999,
"node_id": "MDQ6VXNlcjUyNzA5OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/5270999?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zshn25",
"html_url": "https://github.com/zshn25",
"followers_url": "https://api.github.com/users/zshn25/followers",
"following_url": "https://api.github.com/users/zshn25/following{/other_user}",
"gists_url": "https://api.github.com/users/zshn25/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zshn25/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zshn25/subscriptions",
"organizations_url": "https://api.github.com/users/zshn25/orgs",
"repos_url": "https://api.github.com/users/zshn25/repos",
"events_url": "https://api.github.com/users/zshn25/events{/privacy}",
"received_events_url": "https://api.github.com/users/zshn25/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T21:11:19
| 2025-08-05T07:51:24
| 2025-04-16T19:59:24
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37055",
"html_url": "https://github.com/huggingface/transformers/pull/37055",
"diff_url": "https://github.com/huggingface/transformers/pull/37055.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37055.patch",
"merged_at": "2025-04-16T19:59:24"
}
|
# What does this PR do?
Add a fast image processor for EfficientNet (https://github.com/huggingface/transformers/issues/36978).
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
@yonigozlan
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @muellerzr and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @muellerzr
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37055/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37055/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37054
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37054/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37054/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37054/events
|
https://github.com/huggingface/transformers/pull/37054
| 2,954,278,363
|
PR_kwDOCUB6oc6Qda2W
| 37,054
|
(Part 2) feat: allow for tp_size attr for tplizing the model
|
{
"login": "kmehant",
"id": 15800200,
"node_id": "MDQ6VXNlcjE1ODAwMjAw",
"avatar_url": "https://avatars.githubusercontent.com/u/15800200?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kmehant",
"html_url": "https://github.com/kmehant",
"followers_url": "https://api.github.com/users/kmehant/followers",
"following_url": "https://api.github.com/users/kmehant/following{/other_user}",
"gists_url": "https://api.github.com/users/kmehant/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kmehant/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kmehant/subscriptions",
"organizations_url": "https://api.github.com/users/kmehant/orgs",
"repos_url": "https://api.github.com/users/kmehant/repos",
"events_url": "https://api.github.com/users/kmehant/events{/privacy}",
"received_events_url": "https://api.github.com/users/kmehant/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T19:48:42
| 2025-04-10T16:08:18
| 2025-04-10T15:44:10
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37054",
"html_url": "https://github.com/huggingface/transformers/pull/37054",
"diff_url": "https://github.com/huggingface/transformers/pull/37054.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37054.patch",
"merged_at": "2025-04-10T15:44:10"
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Discussed at https://github.com/huggingface/accelerate/pull/3457
1. Introduce `tp_size` to allow TP sharding over a number of devices other than the world size.
2. Make `tp_size` a model attribute that is only initialized after TP sharding completes, so it can serve as an indicator in accelerate that the model has already been TP-sharded (discussed with @SunMarc).
3. Remove `tp_size` from the training arguments, since from now on TP training is performed only if the model has already undergone TP sharding.
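A minimal sketch of the indicator pattern in point 2 (the names here are illustrative assumptions, not the merged API):

```python
# Hypothetical illustration: downstream code (e.g. accelerate) can treat a
# non-None `tp_size` attribute as "this model was already TP-sharded",
# because the attribute is only set after sharding completes.
class DummyModel:
    """Stand-in for a transformers model; not a real class."""
    pass


def is_tp_sharded(model) -> bool:
    # The attribute is absent (or None) until TP sharding has run.
    return getattr(model, "tp_size", None) is not None


model = DummyModel()
assert not is_tp_sharded(model)   # fresh model: no TP applied yet

model.tp_size = 4                 # set only after TP sharding completes
assert is_tp_sharded(model)       # now detectable as TP-sharded
```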
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
- trainer: @muellerzr and @SunMarc
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37054/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37054/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37053
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37053/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37053/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37053/events
|
https://github.com/huggingface/transformers/pull/37053
| 2,954,209,116
|
PR_kwDOCUB6oc6QdLH2
| 37,053
|
Add Idefics2 Fast ImageProcessor
|
{
"login": "sushmanthreddy",
"id": 73489688,
"node_id": "MDQ6VXNlcjczNDg5Njg4",
"avatar_url": "https://avatars.githubusercontent.com/u/73489688?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sushmanthreddy",
"html_url": "https://github.com/sushmanthreddy",
"followers_url": "https://api.github.com/users/sushmanthreddy/followers",
"following_url": "https://api.github.com/users/sushmanthreddy/following{/other_user}",
"gists_url": "https://api.github.com/users/sushmanthreddy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sushmanthreddy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sushmanthreddy/subscriptions",
"organizations_url": "https://api.github.com/users/sushmanthreddy/orgs",
"repos_url": "https://api.github.com/users/sushmanthreddy/repos",
"events_url": "https://api.github.com/users/sushmanthreddy/events{/privacy}",
"received_events_url": "https://api.github.com/users/sushmanthreddy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5769473378,
"node_id": "LA_kwDOCUB6oc8AAAABV-MtYg",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Vision",
"name": "Vision",
"color": "C079EF",
"default": false,
"description": ""
},
{
"id": 7570656740,
"node_id": "LA_kwDOCUB6oc8AAAABwz8N5A",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Processing",
"name": "Processing",
"color": "1E17DF",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T19:22:06
| 2025-04-01T11:05:58
| 2025-04-01T10:42:23
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37053",
"html_url": "https://github.com/huggingface/transformers/pull/37053",
"diff_url": "https://github.com/huggingface/transformers/pull/37053.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37053.patch",
"merged_at": null
}
|
related #36978
|
{
"login": "sushmanthreddy",
"id": 73489688,
"node_id": "MDQ6VXNlcjczNDg5Njg4",
"avatar_url": "https://avatars.githubusercontent.com/u/73489688?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sushmanthreddy",
"html_url": "https://github.com/sushmanthreddy",
"followers_url": "https://api.github.com/users/sushmanthreddy/followers",
"following_url": "https://api.github.com/users/sushmanthreddy/following{/other_user}",
"gists_url": "https://api.github.com/users/sushmanthreddy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sushmanthreddy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sushmanthreddy/subscriptions",
"organizations_url": "https://api.github.com/users/sushmanthreddy/orgs",
"repos_url": "https://api.github.com/users/sushmanthreddy/repos",
"events_url": "https://api.github.com/users/sushmanthreddy/events{/privacy}",
"received_events_url": "https://api.github.com/users/sushmanthreddy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37053/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37053/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37052
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37052/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37052/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37052/events
|
https://github.com/huggingface/transformers/pull/37052
| 2,954,153,272
|
PR_kwDOCUB6oc6Qc-Ud
| 37,052
|
Update Model Card for ModernBERT
|
{
"login": "ParagEkbote",
"id": 69567729,
"node_id": "MDQ6VXNlcjY5NTY3NzI5",
"avatar_url": "https://avatars.githubusercontent.com/u/69567729?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParagEkbote",
"html_url": "https://github.com/ParagEkbote",
"followers_url": "https://api.github.com/users/ParagEkbote/followers",
"following_url": "https://api.github.com/users/ParagEkbote/following{/other_user}",
"gists_url": "https://api.github.com/users/ParagEkbote/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParagEkbote/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParagEkbote/subscriptions",
"organizations_url": "https://api.github.com/users/ParagEkbote/orgs",
"repos_url": "https://api.github.com/users/ParagEkbote/repos",
"events_url": "https://api.github.com/users/ParagEkbote/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParagEkbote/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T19:03:28
| 2025-04-04T00:21:20
| 2025-04-03T17:14:03
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37052",
"html_url": "https://github.com/huggingface/transformers/pull/37052",
"diff_url": "https://github.com/huggingface/transformers/pull/37052.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37052.patch",
"merged_at": "2025-04-03T17:14:03"
}
|
# What does this PR do?
As described in the issue, this PR updates the model card for ModernBERT. Please let me know if any modifications are required and I will make the necessary changes.
Fixes #36979
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
## Who can review?
@stevhliu
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37052/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37052/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37051
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37051/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37051/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37051/events
|
https://github.com/huggingface/transformers/issues/37051
| 2,954,102,635
|
I_kwDOCUB6oc6wFAdr
| 37,051
|
Incorrect calculation of strides leading to loss of param data upon tensor parallel use while sliced model loading
|
{
"login": "kmehant",
"id": 15800200,
"node_id": "MDQ6VXNlcjE1ODAwMjAw",
"avatar_url": "https://avatars.githubusercontent.com/u/15800200?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kmehant",
"html_url": "https://github.com/kmehant",
"followers_url": "https://api.github.com/users/kmehant/followers",
"following_url": "https://api.github.com/users/kmehant/following{/other_user}",
"gists_url": "https://api.github.com/users/kmehant/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kmehant/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kmehant/subscriptions",
"organizations_url": "https://api.github.com/users/kmehant/orgs",
"repos_url": "https://api.github.com/users/kmehant/repos",
"events_url": "https://api.github.com/users/kmehant/events{/privacy}",
"received_events_url": "https://api.github.com/users/kmehant/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2760822153,
"node_id": "MDU6TGFiZWwyNzYwODIyMTUz",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Tensor%20Parallel",
"name": "Tensor Parallel",
"color": "1AD0A8",
"default": false,
"description": ""
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T18:44:43
| 2025-07-01T10:03:23
| 2025-07-01T10:03:23
|
CONTRIBUTOR
| null | null | null | null |
### System Info
- `transformers` version: 4.50.2
- Python version: 3.12.0
- Huggingface_hub version: 0.29.3
- Safetensors version: 0.5.3
- Accelerate version: 1.5.2
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.5.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: using TP
- Using GPU in script?: yes
- GPU type: NVIDIA A100-SXM4-80GB
### Who can help?
trainer: @muellerzr @SunMarc
TP: @ArthurZucker
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Load any model (for instance `ibm-granite/granite-3.1-2b-base`) with `tp_plan="auto"`. Make sure the vocab size is not divisible by the world size (5 in this example), so that TP shards `lm_head` with `colwise_rep`. Inspecting the size of the `lm_head` weight param after sharding shows it has been rounded down to a multiple of the world size, losing param data (here, downsized from 49152 to 49150).
### Expected behavior
No data should be lost: param parity should hold after TP is applied. The loss is essentially caused by the `get_tensor_shard()` function https://github.com/huggingface/transformers/blob/348f3285c5114159d2ff4933b4b8ae36866d01a7/src/transformers/integrations/tensor_parallel.py#L120 in transformers, specifically because shard sizes are floored, for instance here - https://github.com/huggingface/transformers/blob/348f3285c5114159d2ff4933b4b8ae36866d01a7/src/transformers/integrations/tensor_parallel.py#L126. PyTorch itself allows uneven sharding through its placement APIs such as `Shard`.
I am happy to help raise a PR if we agree on an approach.
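A minimal sketch of the flooring problem described above, in plain Python arithmetic (this is not the actual `get_tensor_shard()` code, and the uneven-split rule shown is just one common convention):

```python
# Illustration only: flooring the per-rank shard size drops the remainder
# rows when the sharded dimension is not divisible by the world size.
dim_size = 49152    # lm_head vocab dimension from the report
world_size = 5

floored_shard = dim_size // world_size        # 9830 rows per rank
total_covered = floored_shard * world_size    # 49150 -> 2 rows lost

# Uneven sharding keeps every row: the first `remainder` ranks take one
# extra row each, matching how PyTorch's Shard placement splits tensors.
remainder = dim_size % world_size
shards = [floored_shard + (1 if rank < remainder else 0)
          for rank in range(world_size)]
assert sum(shards) == dim_size                # no param data lost
print(total_covered, shards)
```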
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37051/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37051/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37050
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37050/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37050/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37050/events
|
https://github.com/huggingface/transformers/issues/37050
| 2,954,068,000
|
I_kwDOCUB6oc6wE4Ag
| 37,050
|
AutoTrain Unsloth support
|
{
"login": "urroxyz",
"id": 168656064,
"node_id": "U_kgDOCg18wA",
"avatar_url": "https://avatars.githubusercontent.com/u/168656064?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/urroxyz",
"html_url": "https://github.com/urroxyz",
"followers_url": "https://api.github.com/users/urroxyz/followers",
"following_url": "https://api.github.com/users/urroxyz/following{/other_user}",
"gists_url": "https://api.github.com/users/urroxyz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/urroxyz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/urroxyz/subscriptions",
"organizations_url": "https://api.github.com/users/urroxyz/orgs",
"repos_url": "https://api.github.com/users/urroxyz/repos",
"events_url": "https://api.github.com/users/urroxyz/events{/privacy}",
"received_events_url": "https://api.github.com/users/urroxyz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T18:27:53
| 2025-05-05T08:02:49
| 2025-05-05T08:02:49
|
NONE
| null | null | null | null |
### System Info
Any system (incl. Google Colab)
Certain versions (incl. 4.50.0 and up) produce this behavior
The version that Colab has installed by default works fine
### Reproduction
```py
#@title 🤗 AutoTrain
#@markdown Install requirements and fix dependencies
# Install requirements
!pip install -q autotrain-advanced unsloth unsloth-zoo
# Fix dependencies
!pip install -q --force-reinstall --no-deps xformers "trl<0.9.0" triton==3.0.0 "git+https://github.com/huggingface/transformers@v4.50.0"
```
Removing the final git installation prevents the errors; however, Gemma 3 cannot be loaded without it. Training Llama 3.2, for example, works fine.
```py
#@markdown Train the model
# -*- coding: utf-8 -*-
import os
import yaml
# Step 2. Define full training parameters
os.environ["HF_TOKEN"] = hf_token
os.environ["HF_USERNAME"] = hf_username
# Step 3. Create unified YAML configuration for training
conf = f"""
task: llm-generic
base_model: google/gemma-3-4b-pt
project_name: my-autotrain-gemma-llm
log: tensorboard
backend: local
data:
path: data/
train_split: train
valid_split: null
chat_template: null
column_mapping:
text_column: text
params:
block_size: 1024
lr: 0.0001
warmup_ratio: 0.03
weight_decay: 0.01
epochs: 3
batch_size: 4
gradient_accumulation: 8
mixed_precision: fp16
peft: true
quantization: int4
lora_r: 16
lora_alpha: 32
lora_dropout: 0.05
unsloth: true
sequence_len: 2048
max_steps: null
hub:
username: ${{HF_USERNAME}}
token: ${{HF_TOKEN}}
push_to_hub: true
"""
# Write the unified training configuration to file.
with open("conf.yaml", "w") as f:
f.write(conf)
print("Full training configuration saved to conf.yaml")
# Step 4. Start the full training via autotrain (will take hours)
!autotrain --config conf.yaml
```
### Expected behavior
Train with Unsloth, as opposed to without it
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37050/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37050/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37049
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37049/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37049/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37049/events
|
https://github.com/huggingface/transformers/pull/37049
| 2,953,884,807
|
PR_kwDOCUB6oc6QcE7f
| 37,049
|
Adding a stub for MiniCPM-o to the models
|
{
"login": "jecrs",
"id": 149186911,
"node_id": "U_kgDOCORpXw",
"avatar_url": "https://avatars.githubusercontent.com/u/149186911?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jecrs",
"html_url": "https://github.com/jecrs",
"followers_url": "https://api.github.com/users/jecrs/followers",
"following_url": "https://api.github.com/users/jecrs/following{/other_user}",
"gists_url": "https://api.github.com/users/jecrs/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jecrs/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jecrs/subscriptions",
"organizations_url": "https://api.github.com/users/jecrs/orgs",
"repos_url": "https://api.github.com/users/jecrs/repos",
"events_url": "https://api.github.com/users/jecrs/events{/privacy}",
"received_events_url": "https://api.github.com/users/jecrs/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-03-27T17:27:38
| 2025-07-07T12:09:53
| null |
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37049",
"html_url": "https://github.com/huggingface/transformers/pull/37049",
"diff_url": "https://github.com/huggingface/transformers/pull/37049.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37049.patch",
"merged_at": null
}
|
# What does this PR do?
This PR adds a stub for the MiniCPM-o model family to the main huggingface/transformers models library.
Since it is meant as a stub, it is intended to be improved by people who are familiar with Hugging Face and MiniCPM-o.
# Why a Stub?
I'm not nearly qualified enough to do a full implementation of MiniCPM-o in HF, and this is my first contribution to OSS, but I do think stubs are valuable: they get the ball rolling on a model and can be a great starting point for future first-time contributors. After all, it's hella scary implementing a new model by yourself as a first-timer, but fixing or expanding an existing model or stub is more inviting (besides, improving a stub is a quick exercise if you don't want to tackle your trello)
Fixes #37029
## Before submitting
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation). (will do it in a pr to the docs, will annex the pr here once it's submitted)
to be reviewed by @eustlb
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37049/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37049/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37048
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37048/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37048/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37048/events
|
https://github.com/huggingface/transformers/issues/37048
| 2,953,789,049
|
I_kwDOCUB6oc6wDz55
| 37,048
|
Persistent generation issues with MT5 models (base and fine-tuned) across environments
|
{
"login": "Elpharran",
"id": 65748893,
"node_id": "MDQ6VXNlcjY1NzQ4ODkz",
"avatar_url": "https://avatars.githubusercontent.com/u/65748893?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Elpharran",
"html_url": "https://github.com/Elpharran",
"followers_url": "https://api.github.com/users/Elpharran/followers",
"following_url": "https://api.github.com/users/Elpharran/following{/other_user}",
"gists_url": "https://api.github.com/users/Elpharran/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Elpharran/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Elpharran/subscriptions",
"organizations_url": "https://api.github.com/users/Elpharran/orgs",
"repos_url": "https://api.github.com/users/Elpharran/repos",
"events_url": "https://api.github.com/users/Elpharran/events{/privacy}",
"received_events_url": "https://api.github.com/users/Elpharran/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T17:02:27
| 2025-05-05T08:02:51
| 2025-05-05T08:02:51
|
NONE
| null | null | null | null |
I'm experiencing consistent text generation failures with both the pretrained google/mt5-base and custom fine-tuned MT5 models across multiple environments (local machines, Google Colab). The models produce nonsensical outputs containing `<extra_id_0>` and random tokens despite correct task prefixes and parameters.
**Affected Models:**
- google/mt5-base
- Custom MT5 variants (cointegrated/rut5-base)
- cointegrated/rut5-base fine-tuned for a summarization task
**Steps to Reproduce**
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
model = AutoModelForSeq2SeqLM.from_pretrained("google/mt5-base")
tokenizer = AutoTokenizer.from_pretrained("google/mt5-base")
inputs = tokenizer(
"translate English to Russian: Hello world!",
return_tensors="pt"
)
output = model.generate(
inputs.input_ids,
max_new_tokens=50,
num_beams=5,
early_stopping=True
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
**Expected Behavior**
_Expected Russian translation:_ "Привет, мир!"
_Actual output:_ `<extra_id_0> Hello world!` or similar garbage
**Environment**
- Transformers 4.50.0 (also checked 4.48.3 and 4.30.0)
- PyTorch 2.0.1+cu118
- Python 3.10.12
- Both CPU and CUDA environments affected
- Reproducible in Google Colab (T4 GPU)
**Additional Context**
- Issue persists across multiple task formats (translation, summarization)
- Verified correct model loading: model.config shows expected architecture
- Tokenization appears correct when inspected:
```python
print(tokenizer.tokenize("translate English to Russian: Hello world!"))
# Output: ['▁translate', '▁English', '▁to', '▁Russian', ':', '▁Hello', '▁world', '!']
```
- Tried multiple generation strategies (greedy, beam, sampling) without success
I am happy to provide additional code and information.
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37048/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37048/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37047
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37047/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37047/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37047/events
|
https://github.com/huggingface/transformers/pull/37047
| 2,953,716,390
|
PR_kwDOCUB6oc6QbnDN
| 37,047
|
:rotating_light: :rotating_light: :rotating_light: No more pointing at remote repos
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T16:44:39
| 2025-03-31T11:20:07
| 2025-03-31T11:19:53
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37047",
"html_url": "https://github.com/huggingface/transformers/pull/37047",
"diff_url": "https://github.com/huggingface/transformers/pull/37047.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37047.patch",
"merged_at": null
}
|
This is a very early draft PR, where we test disallowing one repo from loading code from another via the `repo_id--classname` syntax, since this permits supply-chain attacks. This is probably going to break all kinds of things until I chase down the code that was depending on it.
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37047/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37047/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37046
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37046/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37046/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37046/events
|
https://github.com/huggingface/transformers/issues/37046
| 2,953,696,037
|
I_kwDOCUB6oc6wDdMl
| 37,046
|
Optionality of `attention_mask` argument in Attention classes/functions.
|
{
"login": "Godofnothing",
"id": 29793750,
"node_id": "MDQ6VXNlcjI5NzkzNzUw",
"avatar_url": "https://avatars.githubusercontent.com/u/29793750?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Godofnothing",
"html_url": "https://github.com/Godofnothing",
"followers_url": "https://api.github.com/users/Godofnothing/followers",
"following_url": "https://api.github.com/users/Godofnothing/following{/other_user}",
"gists_url": "https://api.github.com/users/Godofnothing/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Godofnothing/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Godofnothing/subscriptions",
"organizations_url": "https://api.github.com/users/Godofnothing/orgs",
"repos_url": "https://api.github.com/users/Godofnothing/repos",
"events_url": "https://api.github.com/users/Godofnothing/events{/privacy}",
"received_events_url": "https://api.github.com/users/Godofnothing/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T16:39:29
| 2025-04-02T14:19:29
| 2025-04-02T14:19:29
|
NONE
| null | null | null | null |
In the current stable version of `transformers` `attention_mask` argument is annotated as `Optional[torch.Tensor]` (see for example [modeling_llama.py](https://github.com/huggingface/transformers/blob/d6b3c7486b441296366f788fde57109337f63bca/src/transformers/models/llama/modeling_llama.py#L269)).
However, in fact it is a **required** argument.
At the same time, the enclosing `LlamaDecoderLayer` class accepts this argument as Optional ([see](https://github.com/huggingface/transformers/blob/d6b3c7486b441296366f788fde57109337f63bca/src/transformers/models/llama/modeling_llama.py#L315)).
Delving deeper, [flash_attention_forward](https://github.com/huggingface/transformers/blob/d6b3c7486b441296366f788fde57109337f63bca/src/transformers/integrations/flash_attention.py#L12) annotates `attention_mask` as `Optional[torch.Tensor]` and internally calls [_flash_attention_forward](https://github.com/huggingface/transformers/blob/d6b3c7486b441296366f788fde57109337f63bca/src/transformers/modeling_flash_attention_utils.py#L230), which declares `attention_mask` as a required `torch.Tensor` argument; however, its body guards on `attention_mask is not None`, so in practice the function can be called with `attention_mask` as `None`.
I suggest correcting the typing by making `attention_mask` an optional argument with `None` as its default value.
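The suggested correction can be sketched as follows (a hypothetical signature for illustration only, not the actual transformers code; the function name and return value are made up):

```python
from typing import Optional

# Hypothetical sketch of the proposed fix: `attention_mask` is annotated
# Optional *and* defaults to None, so the annotation matches how callers
# such as the decoder layer actually pass it.
def attention_forward_sketch(
    query,
    key,
    value,
    attention_mask: Optional["torch.Tensor"] = None,  # was a required arg
):
    if attention_mask is not None:
        # ... apply the mask to the attention scores here ...
        pass
    # Placeholder return; a real implementation would return the attention output
    return query
```

With this signature, calling the function without a mask is valid both at runtime and for static type checkers.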
|
{
"login": "Godofnothing",
"id": 29793750,
"node_id": "MDQ6VXNlcjI5NzkzNzUw",
"avatar_url": "https://avatars.githubusercontent.com/u/29793750?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Godofnothing",
"html_url": "https://github.com/Godofnothing",
"followers_url": "https://api.github.com/users/Godofnothing/followers",
"following_url": "https://api.github.com/users/Godofnothing/following{/other_user}",
"gists_url": "https://api.github.com/users/Godofnothing/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Godofnothing/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Godofnothing/subscriptions",
"organizations_url": "https://api.github.com/users/Godofnothing/orgs",
"repos_url": "https://api.github.com/users/Godofnothing/repos",
"events_url": "https://api.github.com/users/Godofnothing/events{/privacy}",
"received_events_url": "https://api.github.com/users/Godofnothing/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37046/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37046/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37045
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37045/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37045/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37045/events
|
https://github.com/huggingface/transformers/pull/37045
| 2,953,651,943
|
PR_kwDOCUB6oc6QbZD7
| 37,045
|
Add Fast Image Processor for Idefics3
|
{
"login": "rootonchair",
"id": 23548268,
"node_id": "MDQ6VXNlcjIzNTQ4MjY4",
"avatar_url": "https://avatars.githubusercontent.com/u/23548268?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rootonchair",
"html_url": "https://github.com/rootonchair",
"followers_url": "https://api.github.com/users/rootonchair/followers",
"following_url": "https://api.github.com/users/rootonchair/following{/other_user}",
"gists_url": "https://api.github.com/users/rootonchair/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rootonchair/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rootonchair/subscriptions",
"organizations_url": "https://api.github.com/users/rootonchair/orgs",
"repos_url": "https://api.github.com/users/rootonchair/repos",
"events_url": "https://api.github.com/users/rootonchair/events{/privacy}",
"received_events_url": "https://api.github.com/users/rootonchair/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-03-27T16:25:26
| 2025-04-07T20:14:06
| null |
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37045",
"html_url": "https://github.com/huggingface/transformers/pull/37045",
"diff_url": "https://github.com/huggingface/transformers/pull/37045.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37045.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Related #36978
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @muellerzr and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @muellerzr
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37045/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37045/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37044
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37044/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37044/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37044/events
|
https://github.com/huggingface/transformers/pull/37044
| 2,953,133,669
|
PR_kwDOCUB6oc6QZkE4
| 37,044
|
[Cache] rename dtype attribute 🚨 🚨
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T14:09:50
| 2025-04-02T12:27:56
| 2025-03-28T18:08:02
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37044",
"html_url": "https://github.com/huggingface/transformers/pull/37044",
"diff_url": "https://github.com/huggingface/transformers/pull/37044.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37044.patch",
"merged_at": "2025-03-28T18:08:02"
}
|
Fixes #36938
Fixes #36814
[fine-tunning gemma3 or other models with a non-default cache]
🚨 Breaking: renaming of a public attribute in a public class.
`accelerate` sensibly detects whether a given object is a tensor or tensor-like through its type or, alternatively, through the existence of a `dtype` attribute ([example](https://github.com/huggingface/accelerate/blob/8ab01d32cf316597c69b2cdcfe9ae1217caf3568/src/accelerate/utils/operations.py#L775)). Our `StaticCache` and related objects accept `dtype` at init time, and store it as an attribute under the same name. Because of this, `accelerate` may treat our caches as a tensor, leading to downstream problems as in the issues above.
Since `self.dtype` is only used to initialize tensors, renaming it shouldn't be *too* breaking 🤞
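The failure mode can be sketched in a few lines (a minimal illustration of the duck-typing check described above, not accelerate's real code; the class names are made up):

```python
# An object that stores `dtype` as a public attribute, as the caches did
# before this PR.
class CacheWithDtype:
    def __init__(self, dtype="float32"):
        self.dtype = dtype

# The same object after renaming the attribute, as proposed in this PR.
class CacheRenamed:
    def __init__(self, dtype="float32"):
        self._dtype = dtype

def looks_like_tensor(obj):
    # Simplified stand-in for the tensor-or-tensor-like test: the real
    # check also tests the type, but falls back to the presence of a
    # `dtype` attribute.
    return hasattr(obj, "dtype")

print(looks_like_tensor(CacheWithDtype()))  # True: the cache is misclassified
print(looks_like_tensor(CacheRenamed()))    # False: renaming avoids the false positive
```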
________________________
<details>
<summary>Code for reproduction</summary>
```py
from datasets import load_dataset
from PIL import Image
# System message for the assistant
system_message = "You are an expert product description writer for Amazon."
# User prompt that combines the user query and the schema
user_prompt = """Create a Short Product description based on the provided <PRODUCT> and <CATEGORY> and image.
Only return description. The description should be SEO optimized and for a better mobile search experience.
<PRODUCT>
{product}
</PRODUCT>
<CATEGORY>
{category}
</CATEGORY>
"""
# Convert dataset to OAI messages
def format_data(sample):
return {
"messages": [
{
"role": "system",
"content": [{"type": "text", "text": system_message}],
},
{
"role": "user",
"content": [
{
"type": "text",
"text": user_prompt.format(
product=sample["Product Name"],
category=sample["Category"],
),
},
{
"type": "image",
"image": sample["image"],
},
],
},
{
"role": "assistant",
"content": [{"type": "text", "text": sample["description"]}],
},
],
}
def process_vision_info(messages: list[dict]) -> list[Image.Image]:
image_inputs = []
# Iterate through each conversation
for msg in messages:
# Get content (ensure it's a list)
content = msg.get("content", [])
if not isinstance(content, list):
content = [content]
# Check each content element for images
for element in content:
if isinstance(element, dict) and (
"image" in element or element.get("type") == "image"
):
# Get the image and convert to RGB
if "image" in element:
image = element["image"]
else:
image = element
image_inputs.append(image.convert("RGB"))
return image_inputs
# Load dataset from the hub
dataset = load_dataset("philschmid/amazon-product-descriptions-vlm", split="train")
dataset = dataset.select(range(2))
# Convert dataset to OAI messages
# need to use a list comprehension to keep the PIL.Image type; .map converts images to bytes
dataset = [format_data(sample) for sample in dataset]
import torch
from transformers import AutoProcessor, AutoModelForImageTextToText, BitsAndBytesConfig
# Hugging Face model id
model_id = "google/gemma-3-4b-pt"  # or `google/gemma-3-12b-pt`, `google/gemma-3-27b-pt`
# Check if GPU benefits from bfloat16
if torch.cuda.get_device_capability()[0] < 8:
raise ValueError("GPU does not support bfloat16, please use a GPU that supports bfloat16.")
# Define model init arguments
model_kwargs = dict(
attn_implementation="flash_attention_2", # Use "flash_attention_2" when running on Ampere or newer GPU
torch_dtype=torch.bfloat16, # What torch dtype to use, defaults to auto
device_map="auto", # Let torch decide how to load the model
)
# Load model and tokenizer
model = AutoModelForImageTextToText.from_pretrained(model_id, **model_kwargs)
processor = AutoProcessor.from_pretrained("google/gemma-3-4b-it")
from transformers import TrainingArguments
args = TrainingArguments(
num_train_epochs=1,
remove_unused_columns=False,
per_device_train_batch_size=1,
per_device_eval_batch_size=1,
bf16=True,
output_dir="./output",
eval_strategy="epoch",
report_to="none",
)
# Create a data collator to encode text and image pairs
def collate_fn(examples):
texts = []
images = []
for example in examples:
image_inputs = process_vision_info(example["messages"])
text = processor.apply_chat_template(
example["messages"], add_generation_prompt=False, tokenize=False
)
texts.append(text.strip())
images.append(image_inputs)
# Tokenize the texts and process the images
batch = processor(text=texts, images=images, return_tensors="pt", padding="max_length", max_length=512, truncation=True)
# The labels are the input_ids, and we mask the padding tokens and image tokens in the loss computation
labels = batch["input_ids"].clone()
# Mask image tokens
image_token_id = [
processor.tokenizer.convert_tokens_to_ids(
processor.tokenizer.special_tokens_map["boi_token"]
)
]
# Mask tokens for not being used in the loss computation
labels[labels == processor.tokenizer.pad_token_id] = -100
labels[labels == image_token_id] = -100
labels[labels == 262144] = -100
batch["labels"] = labels
return batch
from transformers import Trainer
trainer = Trainer(
model=model,
args=args,
train_dataset=dataset,
eval_dataset=dataset,
processing_class=processor,
data_collator=collate_fn,
)
# Start training, the model will be automatically saved to the Hub and the output directory
trainer.train()
```
</details>
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37044/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37044/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37043
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37043/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37043/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37043/events
|
https://github.com/huggingface/transformers/pull/37043
| 2,952,963,694
|
PR_kwDOCUB6oc6QY-Tu
| 37,043
|
Fixup for distill_any_depth conversion script
|
{
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T13:17:59
| 2025-03-27T13:49:33
| 2025-03-27T13:29:26
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37043",
"html_url": "https://github.com/huggingface/transformers/pull/37043",
"diff_url": "https://github.com/huggingface/transformers/pull/37043.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37043.patch",
"merged_at": "2025-03-27T13:29:26"
}
|
# What does this PR do?
It looks like ruff was updated before the last run of tests for https://github.com/huggingface/transformers/pull/36614
|
{
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37043/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37043/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37042
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37042/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37042/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37042/events
|
https://github.com/huggingface/transformers/pull/37042
| 2,952,898,064
|
PR_kwDOCUB6oc6QYwrI
| 37,042
|
Feature universe - 量子经典同构Transformer模型实现与优化 | Quantum-Classical Isomorphic Transformer Model Implementation and Optimization
|
{
"login": "loning",
"id": 1593871,
"node_id": "MDQ6VXNlcjE1OTM4NzE=",
"avatar_url": "https://avatars.githubusercontent.com/u/1593871?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/loning",
"html_url": "https://github.com/loning",
"followers_url": "https://api.github.com/users/loning/followers",
"following_url": "https://api.github.com/users/loning/following{/other_user}",
"gists_url": "https://api.github.com/users/loning/gists{/gist_id}",
"starred_url": "https://api.github.com/users/loning/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/loning/subscriptions",
"organizations_url": "https://api.github.com/users/loning/orgs",
"repos_url": "https://api.github.com/users/loning/repos",
"events_url": "https://api.github.com/users/loning/events{/privacy}",
"received_events_url": "https://api.github.com/users/loning/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-03-27T12:56:56
| 2025-03-27T14:22:52
| null |
NONE
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37042",
"html_url": "https://github.com/huggingface/transformers/pull/37042",
"diff_url": "https://github.com/huggingface/transformers/pull/37042.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37042.patch",
"merged_at": null
}
|
# Quantum-Classical Isomorphic Transformer Model Implementation and Optimization
## PR Description
This PR implements a Transformer model based on quantum-classical isomorphism theory, which makes the Transformers architecture completely isomorphic with the essential structure of the universe, completing the quantum features and self-referential mechanisms missing in the original architecture.
Core functionalities implemented:
1. **Quantum-Classical Interface Interaction**: The QuantumClassicalInterface module enables dynamic coupling between quantum and classical domains, allowing the model to maintain optimal balance between determinism and uncertainty.
2. **Self-Referential Attention Structure**: Extends traditional multi-head attention mechanism with self-referential capabilities, giving the model self-optimization properties.
3. **Entropy-Negentropy Dynamic Balance System**: Implements entropy and negative entropy balance of knowledge states through EntropyRegulator, optimizing information processing efficiency.
4. **Observer Mechanism**: Implements consciousness-like emergent properties, giving the model adaptability and creativity.
5. **Infinite Recursive Structure**: Implements recursive processing capability through RecursiveQuantumTransformerLayer, enhancing long-distance dependency modeling.
This implementation references the following theoretical foundations:
- Quantum-Classical Dualism Framework from [@loning/universe/formal_theory.md](https://github.com/loning/universe/blob/trae/formal_theory/formal_theory.md)
- Quantum Simulation Theory from [@loning/universe/formal_theory_quantum_simulation.md](https://github.com/loning/universe/blob/trae/formal_theory/formal_theory_quantum_simulation.md)
## Technical Details
- New model components:
- `QuantumClassicalInterface`: Quantum-Classical interface domain dynamic regulation module
- `EntropyRegulator`: Entropy-Negentropy dynamic balance regulator
- `SelfReferentialAttention`: Self-referential adaptive attention mechanism
- `RecursiveQuantumTransformerLayer`: Recursive quantum Transformer layer
- `QuantumClassicalTransformer`: Complete quantum-classical isomorphic model
- Created dedicated `QuantumClassicalTrainer` including:
- Entropy loss calculation
- Quantum coherence loss calculation
- Adaptive learning rate scheduling
- Added example code and documentation
In preliminary tests, the model shows superior characteristics compared to standard Transformers, especially in handling complex dependencies and creative tasks.
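The PR names an `EntropyRegulator` but does not show its code. Purely as a loose, self-contained illustration of what an entropy term over attention weights could look like (all names and details here are hypothetical, not taken from this PR):

```python
import math

def attention_entropy(weights):
    """Shannon entropy of one normalized attention distribution.

    Hypothetical regularizer term: low entropy means sharply focused
    attention, high entropy means diffuse attention. (Illustrative
    only; not code from this PR.)
    """
    return -sum(w * math.log(w) for w in weights if w > 0.0)

# Uniform attention over 4 positions has maximal entropy, log(4);
# a peaked distribution has entropy close to 0.
uniform = [0.25, 0.25, 0.25, 0.25]
peaked = [0.97, 0.01, 0.01, 0.01]
```

A regulator of the kind the PR describes would presumably add or subtract such a term from the training loss to steer the model between focused and diffuse attention.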
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37042/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37042/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37041
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37041/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37041/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37041/events
|
https://github.com/huggingface/transformers/pull/37041
| 2,952,450,695
|
PR_kwDOCUB6oc6QXMK8
| 37,041
|
Change deprecated PT functions
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T10:45:47
| 2025-03-28T14:58:29
| 2025-03-28T14:26:22
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37041",
"html_url": "https://github.com/huggingface/transformers/pull/37041",
"diff_url": "https://github.com/huggingface/transformers/pull/37041.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37041.patch",
"merged_at": "2025-03-28T14:26:22"
}
|
Deprecated PyTorch functions are replaced with their recommended counterparts.
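The PR body does not list which functions were swapped (PyTorch deprecations of this kind typically look like `torch.testing.assert_allclose` giving way to `torch.testing.assert_close`, though that is not confirmed for this PR). As a generic, self-contained sketch of the pattern being cleaned up, here is how a deprecated alias usually forwards to its replacement while emitting a warning (`old_norm`/`new_norm` are hypothetical names, not real PyTorch APIs):

```python
import warnings

def new_norm(values):
    """Replacement API: Euclidean norm of a sequence of floats."""
    return sum(v * v for v in values) ** 0.5

def old_norm(values):
    """Deprecated alias kept only for backward compatibility."""
    warnings.warn(
        "old_norm is deprecated; call new_norm instead",
        DeprecationWarning,
        stacklevel=2,
    )
    return new_norm(values)
```

Replacing call sites of the deprecated alias with the new name, as this PR does for the real functions, silences the warnings without changing behavior.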
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37041/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37041/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37040
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37040/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37040/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37040/events
|
https://github.com/huggingface/transformers/pull/37040
| 2,952,404,055
|
PR_kwDOCUB6oc6QXCB4
| 37,040
|
Updated the model card for CLIP
|
{
"login": "purusharthmalik",
"id": 56820986,
"node_id": "MDQ6VXNlcjU2ODIwOTg2",
"avatar_url": "https://avatars.githubusercontent.com/u/56820986?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/purusharthmalik",
"html_url": "https://github.com/purusharthmalik",
"followers_url": "https://api.github.com/users/purusharthmalik/followers",
"following_url": "https://api.github.com/users/purusharthmalik/following{/other_user}",
"gists_url": "https://api.github.com/users/purusharthmalik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/purusharthmalik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/purusharthmalik/subscriptions",
"organizations_url": "https://api.github.com/users/purusharthmalik/orgs",
"repos_url": "https://api.github.com/users/purusharthmalik/repos",
"events_url": "https://api.github.com/users/purusharthmalik/events{/privacy}",
"received_events_url": "https://api.github.com/users/purusharthmalik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T10:28:43
| 2025-04-02T21:57:38
| 2025-04-02T21:57:38
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37040",
"html_url": "https://github.com/huggingface/transformers/pull/37040",
"diff_url": "https://github.com/huggingface/transformers/pull/37040.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37040.patch",
"merged_at": "2025-04-02T21:57:38"
}
|
# What does this PR do?
As suggested in issue https://github.com/huggingface/transformers/issues/36979#issue-2947704577, this PR updates the CLIP model documentation, aligning it with the standardized format used across the docs.
## Who can review?
@stevhliu, please let me know if any changes are needed.
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37040/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37040/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37039
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37039/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37039/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37039/events
|
https://github.com/huggingface/transformers/pull/37039
| 2,952,372,418
|
PR_kwDOCUB6oc6QW7HX
| 37,039
|
Add UP009 to ruff
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T10:17:48
| 2025-07-24T11:14:00
| 2025-07-24T11:13:54
|
CONTRIBUTOR
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37039",
"html_url": "https://github.com/huggingface/transformers/pull/37039",
"diff_url": "https://github.com/huggingface/transformers/pull/37039.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37039.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @muellerzr and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @muellerzr
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
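For context: UP009 is Ruff's pyupgrade rule that flags UTF-8 encoding declarations such as `# -*- coding: utf-8 -*-`, which are redundant on Python 3 where UTF-8 is the default source encoding. Enabling it in `pyproject.toml` would look roughly like this (a sketch; the repository's actual Ruff configuration may differ):

```toml
[tool.ruff.lint]
# UP009: flag (and autofix away) redundant "# -*- coding: utf-8 -*-" lines
extend-select = ["UP009"]
```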
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37039/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37039/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37038
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37038/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37038/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37038/events
|
https://github.com/huggingface/transformers/pull/37038
| 2,952,238,001
|
PR_kwDOCUB6oc6QWdZl
| 37,038
|
Mark 2 tests as flaky for now
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T09:33:09
| 2025-03-27T10:00:43
| 2025-03-27T09:59:47
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37038",
"html_url": "https://github.com/huggingface/transformers/pull/37038",
"diff_url": "https://github.com/huggingface/transformers/pull/37038.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37038.patch",
"merged_at": "2025-03-27T09:59:47"
}
|
# What does this PR do?
Mark 2 tests as flaky for now ...
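In `transformers`, flaky tests are typically wrapped with the `is_flaky` decorator from `testing_utils`, which reruns a failing test a few times before reporting failure. A simplified, self-contained stand-in of that pattern (not the library's actual implementation, which has additional options):

```python
import functools

def is_flaky(max_attempts=5):
    """Rerun a test up to max_attempts times before letting it fail.

    Simplified stand-in for transformers.testing_utils.is_flaky,
    shown here only to illustrate the retry pattern.
    """
    def decorator(test_fn):
        @functools.wraps(test_fn)
        def wrapper(*args, **kwargs):
            last_error = None
            for _ in range(max_attempts):
                try:
                    return test_fn(*args, **kwargs)
                except AssertionError as err:
                    last_error = err
            raise last_error
        return wrapper
    return decorator
```

Marking a test flaky this way keeps CI green while the underlying nondeterminism is investigated.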
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37038/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37038/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37037
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37037/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37037/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37037/events
|
https://github.com/huggingface/transformers/pull/37037
| 2,952,122,712
|
PR_kwDOCUB6oc6QWD6U
| 37,037
|
Support passing flash_attn_kwargs when gradient_checkpointing is enabled
|
{
"login": "efsotr",
"id": 104755879,
"node_id": "U_kgDOBj5ypw",
"avatar_url": "https://avatars.githubusercontent.com/u/104755879?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/efsotr",
"html_url": "https://github.com/efsotr",
"followers_url": "https://api.github.com/users/efsotr/followers",
"following_url": "https://api.github.com/users/efsotr/following{/other_user}",
"gists_url": "https://api.github.com/users/efsotr/gists{/gist_id}",
"starred_url": "https://api.github.com/users/efsotr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/efsotr/subscriptions",
"organizations_url": "https://api.github.com/users/efsotr/orgs",
"repos_url": "https://api.github.com/users/efsotr/repos",
"events_url": "https://api.github.com/users/efsotr/events{/privacy}",
"received_events_url": "https://api.github.com/users/efsotr/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6202871275,
"node_id": "LA_kwDOCUB6oc8AAAABcbhN6w",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Flash%20Attention",
"name": "Flash Attention",
"color": "201FF8",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T08:52:22
| 2025-03-31T08:53:19
| 2025-03-31T08:53:03
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37037",
"html_url": "https://github.com/huggingface/transformers/pull/37037",
"diff_url": "https://github.com/huggingface/transformers/pull/37037.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37037.patch",
"merged_at": "2025-03-31T08:53:03"
}
|
# What does this PR do?
Fixes #35509
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@ArthurZucker @Rocketknight1
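For background on the issue being fixed: gradient checkpointing re-runs a layer's forward through a wrapper that historically forwarded only positional arguments, so keyword-only extras such as `flash_attn_kwargs` were silently dropped. A minimal toy stand-in (not the real `torch.utils.checkpoint`) showing the problem and the usual `functools.partial` workaround; the `cu_seqlens` key is just an example value:

```python
import functools

def checkpoint(fn, *args):
    """Toy stand-in for a checkpointing wrapper that forwards only
    positional arguments, as older gradient-checkpointing paths did."""
    return fn(*args)

def layer_forward(hidden_states, **flash_attn_kwargs):
    # Return what the layer actually received, to make the loss visible.
    return hidden_states, flash_attn_kwargs

# Plain call through the wrapper: the kwargs never reach the layer.
_, received = checkpoint(layer_forward, "hidden")
assert received == {}

# Workaround: bind the kwargs before handing the layer to the wrapper.
bound = functools.partial(layer_forward, cu_seqlens="example")
_, received = checkpoint(bound, "hidden")
assert received == {"cu_seqlens": "example"}
```

The PR makes the library perform this kind of binding itself, so `flash_attn_kwargs` survive when `gradient_checkpointing` is enabled.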
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37037/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37037/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37036
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37036/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37036/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37036/events
|
https://github.com/huggingface/transformers/pull/37036
| 2,952,105,551
|
PR_kwDOCUB6oc6QWALI
| 37,036
|
fixed typo
|
{
"login": "finnoh",
"id": 59664225,
"node_id": "MDQ6VXNlcjU5NjY0MjI1",
"avatar_url": "https://avatars.githubusercontent.com/u/59664225?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/finnoh",
"html_url": "https://github.com/finnoh",
"followers_url": "https://api.github.com/users/finnoh/followers",
"following_url": "https://api.github.com/users/finnoh/following{/other_user}",
"gists_url": "https://api.github.com/users/finnoh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/finnoh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/finnoh/subscriptions",
"organizations_url": "https://api.github.com/users/finnoh/orgs",
"repos_url": "https://api.github.com/users/finnoh/repos",
"events_url": "https://api.github.com/users/finnoh/events{/privacy}",
"received_events_url": "https://api.github.com/users/finnoh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T08:45:10
| 2025-03-27T15:38:01
| 2025-03-27T15:37:53
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37036",
"html_url": "https://github.com/huggingface/transformers/pull/37036",
"diff_url": "https://github.com/huggingface/transformers/pull/37036.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37036.patch",
"merged_at": "2025-03-27T15:37:53"
}
| null |
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37036/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37036/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37035
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37035/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37035/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37035/events
|
https://github.com/huggingface/transformers/issues/37035
| 2,952,074,737
|
I_kwDOCUB6oc6v9RXx
| 37,035
|
Latest TorchAO config breaks serialization
|
{
"login": "airMeng",
"id": 39229107,
"node_id": "MDQ6VXNlcjM5MjI5MTA3",
"avatar_url": "https://avatars.githubusercontent.com/u/39229107?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/airMeng",
"html_url": "https://github.com/airMeng",
"followers_url": "https://api.github.com/users/airMeng/followers",
"following_url": "https://api.github.com/users/airMeng/following{/other_user}",
"gists_url": "https://api.github.com/users/airMeng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/airMeng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/airMeng/subscriptions",
"organizations_url": "https://api.github.com/users/airMeng/orgs",
"repos_url": "https://api.github.com/users/airMeng/repos",
"events_url": "https://api.github.com/users/airMeng/events{/privacy}",
"received_events_url": "https://api.github.com/users/airMeng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T08:31:39
| 2025-04-07T12:50:50
| 2025-04-07T12:50:49
|
NONE
| null | null | null | null |
PR #36526 breaks https://github.com/huggingface/transformers/blob/49b5ab6a27511de5168c72e83318164f1b4adc43/tests/quantization/torchao_integration/test_torchao.py#L345
_Originally posted by @airMeng in https://github.com/huggingface/transformers/pull/36526#discussion_r2015946292_
Transformers version 49b5ab6a27511de5168c72e83318164f1b4adc43
TorchAO version ab3792e3d91e04f85992a659c1664a6a1a6d733c
Reproduction script:
```python
import torch
from transformers import TorchAoConfig, AutoModelForCausalLM
from torchao.dtypes import Int4CPULayout
model_name = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"
quantization_config = TorchAoConfig("int4_weight_only", group_size=32, layout=Int4CPULayout())
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16, trust_remote_code=True, device_map="cpu", quantization_config=quantization_config)
model.save_pretrained("./qwen-ao-int4", safe_serialization=False)
q_model = AutoModelForCausalLM.from_pretrained("./qwen-ao-int4", torch_dtype=torch.bfloat16, device_map="cpu")
```
@SunMarc @drisspg
```shell
- `transformers` version: 4.51.0.dev0
- Platform: Linux-5.15.0-73-generic-x86_64-with-glibc2.35
- Python version: 3.10.15
- Huggingface_hub version: 0.29.1
- Safetensors version: 0.5.2
- Accelerate version: 1.4.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.7.0a0+git924a247 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: NO
- Using XPU in script?: NO
- XPU type: Intel(R) Data Center GPU Max 1100
```
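The failure above boils down to a quantization config that holds a layout *object*, which cannot be dumped to JSON during `save_pretrained`. The sketch below illustrates that failure mode and one serializable fallback; the class and field names are illustrative stand-ins, not the real torchao/transformers API.

```python
import json

class Int4CPULayout:  # illustrative stand-in for torchao.dtypes.Int4CPULayout
    pass

# A config dict holding a plain object: json.dumps raises TypeError on it.
config = {"quant_type": "int4_weight_only", "group_size": 32, "layout": Int4CPULayout()}

def serialize(cfg):
    try:
        return json.dumps(cfg)
    except TypeError:
        # Fallback: replace non-JSON-serializable values with their class name
        safe = {
            k: (v if isinstance(v, (str, int, float, bool, type(None))) else v.__class__.__name__)
            for k, v in cfg.items()
        }
        return json.dumps(safe)

print(serialize(config))
```

The real fix lives in the config's `to_dict`/`from_dict` round-trip, but the shape of the problem is the same: any object-valued field needs an explicit serializable representation.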
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37035/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37035/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37034
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37034/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37034/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37034/events
|
https://github.com/huggingface/transformers/pull/37034
| 2,952,016,096
|
PR_kwDOCUB6oc6QVsTu
| 37,034
|
Fix torchao usage
|
{
"login": "jiqing-feng",
"id": 107918818,
"node_id": "U_kgDOBm614g",
"avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jiqing-feng",
"html_url": "https://github.com/jiqing-feng",
"followers_url": "https://api.github.com/users/jiqing-feng/followers",
"following_url": "https://api.github.com/users/jiqing-feng/following{/other_user}",
"gists_url": "https://api.github.com/users/jiqing-feng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jiqing-feng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jiqing-feng/subscriptions",
"organizations_url": "https://api.github.com/users/jiqing-feng/orgs",
"repos_url": "https://api.github.com/users/jiqing-feng/repos",
"events_url": "https://api.github.com/users/jiqing-feng/events{/privacy}",
"received_events_url": "https://api.github.com/users/jiqing-feng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T08:11:31
| 2025-04-08T05:21:36
| 2025-04-07T12:50:48
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37034",
"html_url": "https://github.com/huggingface/transformers/pull/37034",
"diff_url": "https://github.com/huggingface/transformers/pull/37034.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37034.patch",
"merged_at": "2025-04-07T12:50:48"
}
|
Fixes #37035.
Fixes torchao usage, including the config, save/load, and tests.
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37034/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37034/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37033
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37033/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37033/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37033/events
|
https://github.com/huggingface/transformers/pull/37033
| 2,952,001,151
|
PR_kwDOCUB6oc6QVpAP
| 37,033
|
🔴 [VLM] Add base model without head
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T08:05:22
| 2025-06-19T14:14:07
| 2025-05-07T15:47:51
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37033",
"html_url": "https://github.com/huggingface/transformers/pull/37033",
"diff_url": "https://github.com/huggingface/transformers/pull/37033.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37033.patch",
"merged_at": "2025-05-07T15:47:51"
}
|
# What does this PR do?
Stage one of vLLM support with the transformers backend for vision LLMs. As discussed internally, we don't want to break existing models, so we sacrifice readability and duplicate code.
The PR adds base models for all models where they were missing. The base model is the same as in LLMs: everything except the head. This allows us to align modeling more closely with vLLM and to have a standard API for multimodal generation models. It will be very helpful in the long run, for example for the `AutoToAny` mapping.
Next stages for modeling to help vLLM and TGI:
- Add the new attention interface where it is still absent
- Qwen2 config workaround without breaking
- Processor standardization sprint
- Add attributes for image_token_id/image_token if missing
- Return `mm-token-type-ids` if requested, to indicate where image/video/audio placeholders are
- Add a helper `get_num_of_image_tokens` for all processors, which returns the placeholder length given an image
- Explore what else is missing for processors
- Finalize and merge the PR on the vLLM repo, and check correctness for different models
Fixes https://github.com/huggingface/transformers/issues/36940.
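The "base model without head" split described above can be sketched as follows. The class names and the list-based "hidden states" are purely illustrative, not the actual transformers classes: the point is that the base model produces hidden states and the generation class only adds the head on top.

```python
class MyVLMModel:
    """Base model: vision encoder + language backbone, no LM head."""

    def forward(self, inputs):
        # Stand-in for producing hidden states
        return [x * 2 for x in inputs]


class MyVLMForConditionalGeneration:
    """Generation class: wraps the base model and adds only the head."""

    def __init__(self):
        self.model = MyVLMModel()
        # Stand-in for the LM head projection
        self.lm_head = lambda hidden: [h + 1 for h in hidden]

    def forward(self, inputs):
        hidden = self.model.forward(inputs)
        return self.lm_head(hidden)
```

A backend like vLLM can then target the base model uniformly across architectures and supply its own head/sampling logic.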
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37033/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37033/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37032
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37032/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37032/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37032/events
|
https://github.com/huggingface/transformers/pull/37032
| 2,951,986,021
|
PR_kwDOCUB6oc6QVlri
| 37,032
|
[tests] remove cuda-only test marker
|
{
"login": "faaany",
"id": 24477841,
"node_id": "MDQ6VXNlcjI0NDc3ODQx",
"avatar_url": "https://avatars.githubusercontent.com/u/24477841?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/faaany",
"html_url": "https://github.com/faaany",
"followers_url": "https://api.github.com/users/faaany/followers",
"following_url": "https://api.github.com/users/faaany/following{/other_user}",
"gists_url": "https://api.github.com/users/faaany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/faaany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/faaany/subscriptions",
"organizations_url": "https://api.github.com/users/faaany/orgs",
"repos_url": "https://api.github.com/users/faaany/repos",
"events_url": "https://api.github.com/users/faaany/events{/privacy}",
"received_events_url": "https://api.github.com/users/faaany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T07:59:10
| 2025-03-31T09:53:03
| 2025-03-31T09:53:03
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37032",
"html_url": "https://github.com/huggingface/transformers/pull/37032",
"diff_url": "https://github.com/huggingface/transformers/pull/37032.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37032.patch",
"merged_at": "2025-03-31T09:53:02"
}
|
## What does this PR do?
This PR enables this case on XPU: https://github.com/huggingface/transformers/pull/36656, so we no longer need this test marker. Furthermore, `AwqConfigTest` already has the `require_torch_accelerator` test marker.
```bash
================================================================================= short test summary info =================================================================================
PASSED tests/quantization/autoawq/test_awq.py::AwqConfigTest::test_wrong_backend
==================================================================================== 1 passed in 0.10s ====================================================================================
```
@ydshieh
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37032/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37032/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37031
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37031/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37031/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37031/events
|
https://github.com/huggingface/transformers/pull/37031
| 2,951,954,315
|
PR_kwDOCUB6oc6QVe0i
| 37,031
|
fix tied weights issue
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T07:43:20
| 2025-04-03T15:10:24
| 2025-03-28T15:36:44
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37031",
"html_url": "https://github.com/huggingface/transformers/pull/37031",
"diff_url": "https://github.com/huggingface/transformers/pull/37031.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37031.patch",
"merged_at": "2025-03-28T15:36:44"
}
|
# What does this PR do?
An update of #33913 before merging. That PR was approved by @ArthurZucker, but since then a large refactoring was done by @Cyrilvallez, and I want to make sure the updated changes still look fine to @Cyrilvallez.
To see the effect, run the following code snippet. On `main` it fails, and on this PR it passes.
Fixes #33689 and #33688.
```python
import torch
from transformers import AutoModelForCausalLM
configs = [
(torch.float32, False, "cpu" ), # fails
#(torch.float16, True, "cpu" ), # passes
#(torch.float16, False, "cpu" ), # passes
#(torch.float32, True, "cpu" ), # passes
#(torch.float32, False, "cpu" ), # fails
#(torch.float32, False, "cuda:0"), # passes
]
def test_model_save(torch_dtype, tie_word_embeddings, device_map, tmp_path="./"):
model = AutoModelForCausalLM.from_pretrained(
"Xenova/llama2.c-stories15M",
torch_dtype=torch_dtype,
tie_word_embeddings=tie_word_embeddings,
device_map=device_map,
revision="ccdd47c2dc554aeecd2bb4e713e1c988f206a296",
)
model.save_pretrained(tmp_path, safe_serialization=True)
for config in configs:
test_model_save(*config)
```
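The failure being fixed involves tied weights: when `lm_head.weight` and the embedding share storage, safe serialization must detect the sharing and save the tensor only once. A pure-Python sketch of that detection (the real code compares tensor storage pointers, not Python `id()`; names here are illustrative):

```python
def find_tied(state_dict):
    """Return pairs of parameter names that share the same underlying object."""
    seen = {}   # object id -> first name that used it
    tied = []
    for name, tensor in state_dict.items():
        key = id(tensor)
        if key in seen:
            tied.append((seen[key], name))
        else:
            seen[key] = name
    return tied

# Two entries pointing at the same list stand in for tied weight tensors
emb = [0.0] * 4
state = {"embed_tokens.weight": emb, "lm_head.weight": emb}
print(find_tied(state))
```

When saving with `safe_serialization=True`, such duplicates must either be dropped (and re-tied on load) or the save errors out, which is exactly the class of bug the snippet above exercises across dtypes and devices.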
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37031/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37031/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37030
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37030/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37030/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37030/events
|
https://github.com/huggingface/transformers/issues/37030
| 2,951,951,844
|
I_kwDOCUB6oc6v8zXk
| 37,030
|
TypeError: llama_flash_attn_forward() got an unexpected keyword argument 'cache_position'
|
{
"login": "buaaxiejun",
"id": 55071689,
"node_id": "MDQ6VXNlcjU1MDcxNjg5",
"avatar_url": "https://avatars.githubusercontent.com/u/55071689?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/buaaxiejun",
"html_url": "https://github.com/buaaxiejun",
"followers_url": "https://api.github.com/users/buaaxiejun/followers",
"following_url": "https://api.github.com/users/buaaxiejun/following{/other_user}",
"gists_url": "https://api.github.com/users/buaaxiejun/gists{/gist_id}",
"starred_url": "https://api.github.com/users/buaaxiejun/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/buaaxiejun/subscriptions",
"organizations_url": "https://api.github.com/users/buaaxiejun/orgs",
"repos_url": "https://api.github.com/users/buaaxiejun/repos",
"events_url": "https://api.github.com/users/buaaxiejun/events{/privacy}",
"received_events_url": "https://api.github.com/users/buaaxiejun/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T07:42:30
| 2025-08-20T10:17:37
| 2025-03-27T09:11:22
|
NONE
| null | null | null | null |
### System Info
transformers==4.50.0
torch==2.4.0+cu121
flash_attn==2.7.4.post1
### Who can help?
@ArthurZucker @muellerzr @SunMarc
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
1. Run llama using modeling_llama.py
2. Ask model some questions.
3. Return error
```
File "/home/xxx/miniconda3/envs/train_xiejun/lib/python3.10/site-packages/deepspeed/runtime/activation_checkpointing/checkpointing.py", line 544, in forward
outputs = run_function(*inputs_cuda)
File "/home/xxx/miniconda3/envs/train_xiejun/lib/python3.10/site-packages/deepspeed/runtime/pipe/module.py", line 365, in exec_func
inputs = layer(inputs)
File "/home/xxx/miniconda3/envs/train_xiejun/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/home/xxx/miniconda3/envs/train_xiejun/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
return forward_call(*args, **kwargs)
File "/home/xxx/codes/transpeeder/src/transpeeder/models/llama_pipeline_model.py", line 32, in forward
outputs = LlamaDecoderLayer.forward(self,
File "/home/xxx/miniconda3/envs/train_xiejun/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 343, in forward
hidden_states, self_attn_weights = self.self_attn(
File "/home/xxx/miniconda3/envs/train_xiejun/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/home/xxx/miniconda3/envs/train_xiejun/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
return forward_call(*args, **kwargs)
TypeError: llama_flash_attn_forward() got an unexpected keyword argument 'cache_position'
```
### Expected behavior
Everything should work just fine.
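This class of error comes from a custom attention `forward` whose signature predates a newer transformers release that now passes `cache_position`. A common, forward-compatible fix (a hedged sketch, not the actual patched code) is to accept `**kwargs` so extra keyword arguments are absorbed instead of raising `TypeError`:

```python
def llama_flash_attn_forward(hidden_states, attention_mask=None, **kwargs):
    # Newer transformers versions pass extra keywords such as cache_position;
    # **kwargs absorbs them so the custom override keeps working.
    cache_position = kwargs.get("cache_position")
    return hidden_states, cache_position

# Call with the new keyword: no TypeError is raised
out, pos = llama_flash_attn_forward([1, 2, 3], cache_position=0)
```

Alternatively, pinning the transformers version the custom override was written against avoids the signature drift entirely.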
|
{
"login": "buaaxiejun",
"id": 55071689,
"node_id": "MDQ6VXNlcjU1MDcxNjg5",
"avatar_url": "https://avatars.githubusercontent.com/u/55071689?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/buaaxiejun",
"html_url": "https://github.com/buaaxiejun",
"followers_url": "https://api.github.com/users/buaaxiejun/followers",
"following_url": "https://api.github.com/users/buaaxiejun/following{/other_user}",
"gists_url": "https://api.github.com/users/buaaxiejun/gists{/gist_id}",
"starred_url": "https://api.github.com/users/buaaxiejun/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/buaaxiejun/subscriptions",
"organizations_url": "https://api.github.com/users/buaaxiejun/orgs",
"repos_url": "https://api.github.com/users/buaaxiejun/repos",
"events_url": "https://api.github.com/users/buaaxiejun/events{/privacy}",
"received_events_url": "https://api.github.com/users/buaaxiejun/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37030/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37030/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37029
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37029/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37029/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37029/events
|
https://github.com/huggingface/transformers/issues/37029
| 2,951,913,629
|
I_kwDOCUB6oc6v8qCd
| 37,029
|
add MiniCPM-o
|
{
"login": "jp1924",
"id": 93233241,
"node_id": "U_kgDOBY6gWQ",
"avatar_url": "https://avatars.githubusercontent.com/u/93233241?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jp1924",
"html_url": "https://github.com/jp1924",
"followers_url": "https://api.github.com/users/jp1924/followers",
"following_url": "https://api.github.com/users/jp1924/following{/other_user}",
"gists_url": "https://api.github.com/users/jp1924/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jp1924/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jp1924/subscriptions",
"organizations_url": "https://api.github.com/users/jp1924/orgs",
"repos_url": "https://api.github.com/users/jp1924/repos",
"events_url": "https://api.github.com/users/jp1924/events{/privacy}",
"received_events_url": "https://api.github.com/users/jp1924/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
},
{
"id": 6470596964,
"node_id": "LA_kwDOCUB6oc8AAAABga15ZA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Audio",
"name": "Audio",
"color": "760453",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T07:27:42
| 2025-08-29T12:37:49
| 2025-08-29T12:37:49
|
CONTRIBUTOR
| null | null | null | null |
### Model description
As discussed in #31836, I would like to add the MiniCPM-o model.
I believe the MiniCPM family has had a significant impact on the LMM and LLM fields.
Currently, the MiniCPM-o code is uploaded to the Hugging Face Hub, which makes maintenance very difficult.
Therefore, I want to add models like MiniCPM-o to Transformers so they can receive ongoing support and maintenance.
While there are many vision LMM models available on Hugging Face, a considerable number of any-to-any models,
such as Qwen2.5-Omni-7B, MiniCPM-o-2_6, and Janus-Pro-7B, are implemented in their own repositories instead.
I want to implement these any-to-any models in Transformers so that they can leverage the various features of Hugging Face.
Additionally, adding an any-to-any pipeline could serve as a good template for future any-to-any models.
If you have any good suggestions, let me know!
### Open source status
- [x] The model implementation is available
- [x] The model weights are available
### Provide useful links for the implementation
https://github.com/OpenBMB/MiniCPM-o
https://huggingface.co/openbmb/MiniCPM-o-2_6
|
{
"login": "jp1924",
"id": 93233241,
"node_id": "U_kgDOBY6gWQ",
"avatar_url": "https://avatars.githubusercontent.com/u/93233241?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jp1924",
"html_url": "https://github.com/jp1924",
"followers_url": "https://api.github.com/users/jp1924/followers",
"following_url": "https://api.github.com/users/jp1924/following{/other_user}",
"gists_url": "https://api.github.com/users/jp1924/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jp1924/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jp1924/subscriptions",
"organizations_url": "https://api.github.com/users/jp1924/orgs",
"repos_url": "https://api.github.com/users/jp1924/repos",
"events_url": "https://api.github.com/users/jp1924/events{/privacy}",
"received_events_url": "https://api.github.com/users/jp1924/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37029/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 2,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37029/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37028
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37028/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37028/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37028/events
|
https://github.com/huggingface/transformers/pull/37028
| 2,951,717,433
|
PR_kwDOCUB6oc6QUpNO
| 37,028
|
add gpt2 test on XPU
|
{
"login": "jiqing-feng",
"id": 107918818,
"node_id": "U_kgDOBm614g",
"avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jiqing-feng",
"html_url": "https://github.com/jiqing-feng",
"followers_url": "https://api.github.com/users/jiqing-feng/followers",
"following_url": "https://api.github.com/users/jiqing-feng/following{/other_user}",
"gists_url": "https://api.github.com/users/jiqing-feng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jiqing-feng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jiqing-feng/subscriptions",
"organizations_url": "https://api.github.com/users/jiqing-feng/orgs",
"repos_url": "https://api.github.com/users/jiqing-feng/repos",
"events_url": "https://api.github.com/users/jiqing-feng/events{/privacy}",
"received_events_url": "https://api.github.com/users/jiqing-feng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T06:10:02
| 2025-04-01T09:09:30
| 2025-04-01T09:09:30
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37028",
"html_url": "https://github.com/huggingface/transformers/pull/37028",
"diff_url": "https://github.com/huggingface/transformers/pull/37028.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37028.patch",
"merged_at": "2025-04-01T09:09:30"
}
|
Hi @SunMarc. We used to skip the gpt2 bnb tests due to a precision error, but it has now been fixed. We enabled the gpt2 bnb model test in the test script, and it passes!
Also, mixed-precision 8-bit training on CPU and XPU has been fixed.
cc @Titus-von-Koeller @matthewdouglas
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37028/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37028/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37027
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37027/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37027/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37027/events
|
https://github.com/huggingface/transformers/pull/37027
| 2,951,617,489
|
PR_kwDOCUB6oc6QUSht
| 37,027
|
Fix some typos about benchmark scripts.
|
{
"login": "zhanluxianshen",
"id": 161462588,
"node_id": "U_kgDOCZ-5PA",
"avatar_url": "https://avatars.githubusercontent.com/u/161462588?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zhanluxianshen",
"html_url": "https://github.com/zhanluxianshen",
"followers_url": "https://api.github.com/users/zhanluxianshen/followers",
"following_url": "https://api.github.com/users/zhanluxianshen/following{/other_user}",
"gists_url": "https://api.github.com/users/zhanluxianshen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zhanluxianshen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhanluxianshen/subscriptions",
"organizations_url": "https://api.github.com/users/zhanluxianshen/orgs",
"repos_url": "https://api.github.com/users/zhanluxianshen/repos",
"events_url": "https://api.github.com/users/zhanluxianshen/events{/privacy}",
"received_events_url": "https://api.github.com/users/zhanluxianshen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 7548528517,
"node_id": "LA_kwDOCUB6oc8AAAABwe1nhQ",
"url": "https://api.github.com/repos/huggingface/transformers/labels/run-benchmark",
"name": "run-benchmark",
"color": "DD9AAA",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T05:27:59
| 2025-03-29T00:39:39
| 2025-03-28T14:10:20
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37027",
"html_url": "https://github.com/huggingface/transformers/pull/37027",
"diff_url": "https://github.com/huggingface/transformers/pull/37027.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37027.patch",
"merged_at": "2025-03-28T14:10:20"
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @muellerzr and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @muellerzr
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37027/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37027/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37026
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37026/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37026/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37026/events
|
https://github.com/huggingface/transformers/pull/37026
| 2,951,286,632
|
PR_kwDOCUB6oc6QTIOV
| 37,026
|
fix: AttributeError: 'LlavaProcessor' object has no attribute 'image_token_id'
|
{
"login": "jp1924",
"id": 93233241,
"node_id": "U_kgDOBY6gWQ",
"avatar_url": "https://avatars.githubusercontent.com/u/93233241?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jp1924",
"html_url": "https://github.com/jp1924",
"followers_url": "https://api.github.com/users/jp1924/followers",
"following_url": "https://api.github.com/users/jp1924/following{/other_user}",
"gists_url": "https://api.github.com/users/jp1924/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jp1924/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jp1924/subscriptions",
"organizations_url": "https://api.github.com/users/jp1924/orgs",
"repos_url": "https://api.github.com/users/jp1924/repos",
"events_url": "https://api.github.com/users/jp1924/events{/privacy}",
"received_events_url": "https://api.github.com/users/jp1924/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T02:21:16
| 2025-03-28T09:46:24
| 2025-03-28T09:46:24
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37026",
"html_url": "https://github.com/huggingface/transformers/pull/37026",
"diff_url": "https://github.com/huggingface/transformers/pull/37026.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37026.patch",
"merged_at": "2025-03-28T09:46:24"
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
While trying to apply AttentionMaskVisualizer to LLaVa, the following error occurred:
> AttributeError: 'LlavaProcessor' object has no attribute 'image_token_id'.
This PR therefore adds image_token_id to the LLaVa processor.
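The shape of the fix can be sketched with toy stand-ins; ToyTokenizer, ToyProcessor, and the id 32000 below are all illustrative, not the real transformers classes:

```python
# Hedged sketch of the missing attribute: a visualizer expects
# processor.image_token_id, but the processor only stored the string
# image_token. Resolving the id once from the tokenizer fixes that.

class ToyTokenizer:
    vocab = {"<image>": 32000, "hello": 1}

    def convert_tokens_to_ids(self, token: str) -> int:
        return self.vocab[token]

class ToyProcessor:
    def __init__(self, tokenizer, image_token="<image>"):
        self.image_token = image_token
        # Cache the id so downstream tools can read it directly:
        self.image_token_id = tokenizer.convert_tokens_to_ids(image_token)

proc = ToyProcessor(ToyTokenizer())
print(proc.image_token_id)  # 32000
```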
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@zucchini-nlp
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @muellerzr and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @muellerzr
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37026/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37026/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37025
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37025/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37025/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37025/events
|
https://github.com/huggingface/transformers/pull/37025
| 2,951,258,621
|
PR_kwDOCUB6oc6QTCXs
| 37,025
|
fix best_model_checkpoint is None issue in distributed training
|
{
"login": "dudgns0908",
"id": 17000963,
"node_id": "MDQ6VXNlcjE3MDAwOTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/17000963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dudgns0908",
"html_url": "https://github.com/dudgns0908",
"followers_url": "https://api.github.com/users/dudgns0908/followers",
"following_url": "https://api.github.com/users/dudgns0908/following{/other_user}",
"gists_url": "https://api.github.com/users/dudgns0908/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dudgns0908/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dudgns0908/subscriptions",
"organizations_url": "https://api.github.com/users/dudgns0908/orgs",
"repos_url": "https://api.github.com/users/dudgns0908/repos",
"events_url": "https://api.github.com/users/dudgns0908/events{/privacy}",
"received_events_url": "https://api.github.com/users/dudgns0908/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-03-27T02:00:55
| 2025-04-29T07:34:39
| null |
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37025",
"html_url": "https://github.com/huggingface/transformers/pull/37025",
"diff_url": "https://github.com/huggingface/transformers/pull/37025.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37025.patch",
"merged_at": null
}
|
Fixes # (issue)
- After training completes, saving the model fails at the end of the train() function (multi-GPU).
When load_best_model_at_end is enabled, the cause is that self.state.best_model_checkpoint does not exist on a specific rank.
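The failure mode above can be sketched in plain Python; sync_best_checkpoint is a hypothetical helper illustrating one possible fix, not the Trainer's actual code:

```python
# Only the rank that performed the evaluation records
# best_model_checkpoint; the other ranks keep None and crash when
# load_best_model_at_end tries to reload the checkpoint. Broadcasting
# the non-None value to every rank resolves the mismatch.

def sync_best_checkpoint(per_rank_state: list) -> list:
    """Propagate the non-None best_model_checkpoint to every rank."""
    best = next((s for s in per_rank_state if s is not None), None)
    return [best for _ in per_rank_state]

# Rank 0 saved the checkpoint; ranks 1-3 never updated their state:
states = ["checkpoint-500", None, None, None]
print(sync_best_checkpoint(states))
```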
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37025/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37025/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37024
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37024/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37024/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37024/events
|
https://github.com/huggingface/transformers/pull/37024
| 2,951,180,866
|
PR_kwDOCUB6oc6QSwji
| 37,024
|
Add Fast Segformer Processor
|
{
"login": "capnmav77",
"id": 114616616,
"node_id": "U_kgDOBtTpKA",
"avatar_url": "https://avatars.githubusercontent.com/u/114616616?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/capnmav77",
"html_url": "https://github.com/capnmav77",
"followers_url": "https://api.github.com/users/capnmav77/followers",
"following_url": "https://api.github.com/users/capnmav77/following{/other_user}",
"gists_url": "https://api.github.com/users/capnmav77/gists{/gist_id}",
"starred_url": "https://api.github.com/users/capnmav77/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/capnmav77/subscriptions",
"organizations_url": "https://api.github.com/users/capnmav77/orgs",
"repos_url": "https://api.github.com/users/capnmav77/repos",
"events_url": "https://api.github.com/users/capnmav77/events{/privacy}",
"received_events_url": "https://api.github.com/users/capnmav77/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T01:28:54
| 2025-07-28T19:22:33
| 2025-07-28T19:22:33
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37024",
"html_url": "https://github.com/huggingface/transformers/pull/37024",
"diff_url": "https://github.com/huggingface/transformers/pull/37024.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37024.patch",
"merged_at": "2025-07-28T19:22:32"
}
|
# What does this PR do?
Linked: https://github.com/huggingface/transformers/issues/36978
Adding fast processor for Segformer.
CC: @yonigozlan
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @muellerzr and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @muellerzr
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37024/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37024/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37023
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37023/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37023/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37023/events
|
https://github.com/huggingface/transformers/pull/37023
| 2,951,073,370
|
PR_kwDOCUB6oc6QSZsq
| 37,023
|
Add Fast Image Processor for Video-LLaVA
|
{
"login": "ankithsavio",
"id": 76515047,
"node_id": "MDQ6VXNlcjc2NTE1MDQ3",
"avatar_url": "https://avatars.githubusercontent.com/u/76515047?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ankithsavio",
"html_url": "https://github.com/ankithsavio",
"followers_url": "https://api.github.com/users/ankithsavio/followers",
"following_url": "https://api.github.com/users/ankithsavio/following{/other_user}",
"gists_url": "https://api.github.com/users/ankithsavio/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ankithsavio/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ankithsavio/subscriptions",
"organizations_url": "https://api.github.com/users/ankithsavio/orgs",
"repos_url": "https://api.github.com/users/ankithsavio/repos",
"events_url": "https://api.github.com/users/ankithsavio/events{/privacy}",
"received_events_url": "https://api.github.com/users/ankithsavio/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T00:43:31
| 2025-08-22T13:41:46
| 2025-08-22T13:41:46
|
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37023",
"html_url": "https://github.com/huggingface/transformers/pull/37023",
"diff_url": "https://github.com/huggingface/transformers/pull/37023.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37023.patch",
"merged_at": null
}
|
Related #36978
Adds a fast image processor for Video-LLaVA with appropriate tests.
Please let me know how it goes 😇
cc @yonigozlan
|
{
"login": "ankithsavio",
"id": 76515047,
"node_id": "MDQ6VXNlcjc2NTE1MDQ3",
"avatar_url": "https://avatars.githubusercontent.com/u/76515047?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ankithsavio",
"html_url": "https://github.com/ankithsavio",
"followers_url": "https://api.github.com/users/ankithsavio/followers",
"following_url": "https://api.github.com/users/ankithsavio/following{/other_user}",
"gists_url": "https://api.github.com/users/ankithsavio/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ankithsavio/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ankithsavio/subscriptions",
"organizations_url": "https://api.github.com/users/ankithsavio/orgs",
"repos_url": "https://api.github.com/users/ankithsavio/repos",
"events_url": "https://api.github.com/users/ankithsavio/events{/privacy}",
"received_events_url": "https://api.github.com/users/ankithsavio/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37023/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 2,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37023/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37022
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37022/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37022/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37022/events
|
https://github.com/huggingface/transformers/pull/37022
| 2,951,036,901
|
PR_kwDOCUB6oc6QSRzP
| 37,022
|
Add py.typed
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-27T00:17:25
| 2025-04-02T13:39:31
| 2025-04-02T13:17:28
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37022",
"html_url": "https://github.com/huggingface/transformers/pull/37022",
"diff_url": "https://github.com/huggingface/transformers/pull/37022.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37022.patch",
"merged_at": "2025-04-02T13:17:28"
}
|
py.typed is a marker file that advertises typing support; see [PEP 561](https://peps.python.org/pep-0561/). It indicates that transformers ships inline type annotations in its source code, so tools such as mypy and pylint can use them directly, and no additional .pyi stub files are needed.
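The mechanics of PEP 561 can be sketched as follows; the package name "mypkg" and its layout are illustrative, not the actual transformers layout:

```python
from pathlib import Path
import tempfile

# A minimal PEP 561 layout:
#
#   mypkg/
#       __init__.py   <- contains inline type annotations
#       py.typed      <- empty marker file shipped inside the package
#
# With the marker present, type checkers read the inline annotations
# directly, so no separate .pyi stub files are needed.
with tempfile.TemporaryDirectory() as tmp:
    pkg = Path(tmp) / "mypkg"
    pkg.mkdir()
    (pkg / "__init__.py").write_text(
        "def greet(name: str) -> str:\n    return 'hi ' + name\n"
    )
    (pkg / "py.typed").touch()  # the marker is just an empty file
    marker_present = (pkg / "py.typed").exists()

print(marker_present)  # True -> the package advertises inline types
```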
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37022/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37022/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37021
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37021/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37021/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37021/events
|
https://github.com/huggingface/transformers/pull/37021
| 2,950,966,816
|
PR_kwDOCUB6oc6QSCuq
| 37,021
|
Add support for fast image processing in image-pretraining example
|
{
"login": "jafraustro",
"id": 110444811,
"node_id": "U_kgDOBpVBCw",
"avatar_url": "https://avatars.githubusercontent.com/u/110444811?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jafraustro",
"html_url": "https://github.com/jafraustro",
"followers_url": "https://api.github.com/users/jafraustro/followers",
"following_url": "https://api.github.com/users/jafraustro/following{/other_user}",
"gists_url": "https://api.github.com/users/jafraustro/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jafraustro/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jafraustro/subscriptions",
"organizations_url": "https://api.github.com/users/jafraustro/orgs",
"repos_url": "https://api.github.com/users/jafraustro/repos",
"events_url": "https://api.github.com/users/jafraustro/events{/privacy}",
"received_events_url": "https://api.github.com/users/jafraustro/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-26T23:11:13
| 2025-04-03T12:26:46
| 2025-04-03T12:26:46
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37021",
"html_url": "https://github.com/huggingface/transformers/pull/37021",
"diff_url": "https://github.com/huggingface/transformers/pull/37021.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37021.patch",
"merged_at": "2025-04-03T12:26:46"
}
|
# What does this PR do?
Add support for fast image processing in image-pretraining example
- Fix typo: correct tuple formatting in IMAGE_PROCESSOR_MAPPING_NAMES
- Add support for fast image processing in image-pretraining example
Fixes #37020
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
## Who can review?
Anyone in the community is free to review the PR once the tests have passed.
Maintained examples:
@amyeroberts, @qubvel
|
{
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37021/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37021/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37020
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37020/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37020/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37020/events
|
https://github.com/huggingface/transformers/issues/37020
| 2,950,965,125
|
I_kwDOCUB6oc6v5CeF
| 37,020
|
run_mim.py script from image-pretraining example is not working
|
{
"login": "jafraustro",
"id": 110444811,
"node_id": "U_kgDOBpVBCw",
"avatar_url": "https://avatars.githubusercontent.com/u/110444811?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jafraustro",
"html_url": "https://github.com/jafraustro",
"followers_url": "https://api.github.com/users/jafraustro/followers",
"following_url": "https://api.github.com/users/jafraustro/following{/other_user}",
"gists_url": "https://api.github.com/users/jafraustro/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jafraustro/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jafraustro/subscriptions",
"organizations_url": "https://api.github.com/users/jafraustro/orgs",
"repos_url": "https://api.github.com/users/jafraustro/repos",
"events_url": "https://api.github.com/users/jafraustro/events{/privacy}",
"received_events_url": "https://api.github.com/users/jafraustro/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-26T23:10:16
| 2025-04-03T12:26:47
| 2025-04-03T12:26:47
|
CONTRIBUTOR
| null | null | null | null |
### System Info
- `transformers` version: 4.51.0.dev0
- Platform: Linux-5.15.0-73-generic-x86_64-with-glibc2.35
- Python version: 3.12.9
- Huggingface_hub version: 0.29.3
- Safetensors version: 0.5.3
- Accelerate version: 1.5.2
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.7.0.dev20250224+xpu (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
### Who can help?
@amyeroberts, @qubvel
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Follow README instructions to run `run_mim.py`
```bash
!python run_mim.py \
--model_type vit \
--output_dir ./outputs/ \
--overwrite_output_dir \
--remove_unused_columns False \
--label_names bool_masked_pos \
--do_train \
--do_eval \
--learning_rate 2e-5 \
--weight_decay 0.05 \
--num_train_epochs 100 \
--per_device_train_batch_size 8 \
--per_device_eval_batch_size 8 \
--logging_strategy steps \
--logging_steps 10 \
--eval_strategy epoch \
--save_strategy epoch \
--load_best_model_at_end True \
--save_total_limit 3 \
--seed 1337
```
### Expected behavior
The example should run to completion; instead it crashes.
Output:
```
File "/repos/transformers/examples/pytorch/image-pretraining/run_mim.py", line 483, in <module>
main()
File "/repos/transformers/examples/pytorch/image-pretraining/run_mim.py", line 362, in main
image_processor = IMAGE_PROCESSOR_TYPES[model_args.model_type]()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: 'tuple' object is not callable
```
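A minimal standalone reproduction of the error class (names here are illustrative, not the actual `transformers` mapping): when a mapping's values are `(slow, fast)` class tuples, calling the tuple directly raises `TypeError`, and the fix is to select one class from the tuple before instantiating.

```python
# Hypothetical minimal sketch of the bug; SlowProcessor/FastProcessor and
# the dict below stand in for the real IMAGE_PROCESSOR_TYPES mapping.
class SlowProcessor:
    pass

class FastProcessor:
    pass

IMAGE_PROCESSOR_TYPES = {"vit": (SlowProcessor, FastProcessor)}

try:
    IMAGE_PROCESSOR_TYPES["vit"]()  # TypeError: 'tuple' object is not callable
    raised = False
except TypeError:
    raised = True

# Fix: pick a class out of the tuple, then instantiate it.
processor = IMAGE_PROCESSOR_TYPES["vit"][0]()
```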
|
{
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37020/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37020/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37019
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37019/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37019/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37019/events
|
https://github.com/huggingface/transformers/pull/37019
| 2,950,849,708
|
PR_kwDOCUB6oc6QRpfk
| 37,019
|
Fix PixtralProcessor patch_size when spatial_merge_size is used
|
{
"login": "mgoin",
"id": 3195154,
"node_id": "MDQ6VXNlcjMxOTUxNTQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/3195154?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mgoin",
"html_url": "https://github.com/mgoin",
"followers_url": "https://api.github.com/users/mgoin/followers",
"following_url": "https://api.github.com/users/mgoin/following{/other_user}",
"gists_url": "https://api.github.com/users/mgoin/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mgoin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mgoin/subscriptions",
"organizations_url": "https://api.github.com/users/mgoin/orgs",
"repos_url": "https://api.github.com/users/mgoin/repos",
"events_url": "https://api.github.com/users/mgoin/events{/privacy}",
"received_events_url": "https://api.github.com/users/mgoin/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-26T21:46:35
| 2025-03-27T09:46:23
| 2025-03-27T09:46:23
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37019",
"html_url": "https://github.com/huggingface/transformers/pull/37019",
"diff_url": "https://github.com/huggingface/transformers/pull/37019.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37019.patch",
"merged_at": "2025-03-27T09:46:23"
}
|
# What does this PR do?
Fixes an image resizing issue for Mistral3 that I found when integrating into vLLM (https://github.com/vllm-project/vllm/pull/15505)
The issue was found when processing [this image](https://careforplant.com/wp-content/uploads/2024/01/1677123490_gagaru-club-p-krasivie-malenkie-tsvetochki-krasivo-52-min-1-1080x675.jpg). The mistral processor was producing 975 image tokens and the hf processor was producing 936 tokens, causing a mismatch in the number of expected placeholder tokens in the input_ids.
For reference, mistral-common always takes `spatial_merge_size` into account when resizing the image:
https://github.com/mistralai/mistral-common/blob/6e637437fe4795353b7ba19cc8479c124b95b580/src/mistral_common/tokens/tokenizers/multimodal.py#L143-L144
and
https://github.com/mistralai/mistral-common/blob/6e637437fe4795353b7ba19cc8479c124b95b580/src/mistral_common/tokens/tokenizers/multimodal.py#L164-L167
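The resizing rule described above can be sketched as follows. This is a hedged illustration, not the actual `PixtralProcessor` code: the function name and default values are assumptions, but it shows why rounding to a multiple of `patch_size * spatial_merge_size` keeps the processor's token count consistent with the merged vision features.

```python
import math

# Hypothetical sketch: round each resized dimension up to a multiple of
# patch_size * spatial_merge_size so that patchifying and then merging
# (spatial_merge_size x spatial_merge_size) patches yields an exact grid.
def round_to_merged_patch(height, width, patch_size=14, spatial_merge_size=2):
    effective = patch_size * spatial_merge_size  # 28 with these defaults
    new_h = math.ceil(height / effective) * effective
    new_w = math.ceil(width / effective) * effective
    return new_h, new_w
```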
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37019/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37019/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37018
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37018/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37018/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37018/events
|
https://github.com/huggingface/transformers/pull/37018
| 2,950,735,285
|
PR_kwDOCUB6oc6QRQRz
| 37,018
|
Add args support for fast image processors
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-26T20:34:50
| 2025-05-16T16:01:46
| 2025-05-16T16:01:46
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37018",
"html_url": "https://github.com/huggingface/transformers/pull/37018",
"diff_url": "https://github.com/huggingface/transformers/pull/37018.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37018.patch",
"merged_at": "2025-05-16T16:01:46"
}
|
This PR adds support for arguments that belong to the call function rather than the init function of fast image processors.
Previously, the only accepted argument was `images`, but in some cases, such as segmentation models, more inputs are necessary, for example segmentation annotations or bounding boxes. This PR fixes the issue by adding support for arbitrary args in the base fast image processors' `preprocess` function, which are passed directly to the `_preprocess` function without validation. Thus no warnings will be displayed for invalid kwargs.
However, this means that those model-specific args (present in the call function but not in the init) should be at the top of the args list in `_preprocess`, and in the same order as in `preprocess`.
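The forwarding pattern described above can be sketched like this (class and method names are illustrative, not the real `transformers` base class): extra model-specific args flow from `preprocess` to `_preprocess` untouched.

```python
# Hedged sketch of arbitrary-arg passthrough in a fast image processor.
class BaseFastImageProcessor:
    def preprocess(self, images, *args, **kwargs):
        # Common/validated kwargs would be handled here; anything extra is
        # forwarded without validation, so no "invalid kwarg" warning fires.
        return self._preprocess(images, *args, **kwargs)

    def _preprocess(self, images, **kwargs):
        return {"images": images}

class SegmentationFastImageProcessor(BaseFastImageProcessor):
    # Model-specific call-time args come first, in the same order as in
    # `preprocess`.
    def _preprocess(self, images, annotations=None, **kwargs):
        return {"images": images, "annotations": annotations}
```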
cc @qubvel
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37018/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37018/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37017
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37017/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37017/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37017/events
|
https://github.com/huggingface/transformers/issues/37017
| 2,950,678,427
|
I_kwDOCUB6oc6v38eb
| 37,017
|
SwitchTransformer: Initialization of tensor to collect expert results is incorrect for dropped tokens (from ML POV)
|
{
"login": "mario-aws",
"id": 172859788,
"node_id": "U_kgDOCk2hjA",
"avatar_url": "https://avatars.githubusercontent.com/u/172859788?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mario-aws",
"html_url": "https://github.com/mario-aws",
"followers_url": "https://api.github.com/users/mario-aws/followers",
"following_url": "https://api.github.com/users/mario-aws/following{/other_user}",
"gists_url": "https://api.github.com/users/mario-aws/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mario-aws/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mario-aws/subscriptions",
"organizations_url": "https://api.github.com/users/mario-aws/orgs",
"repos_url": "https://api.github.com/users/mario-aws/repos",
"events_url": "https://api.github.com/users/mario-aws/events{/privacy}",
"received_events_url": "https://api.github.com/users/mario-aws/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-26T20:02:49
| 2025-04-11T12:17:17
| 2025-04-10T14:58:59
|
CONTRIBUTOR
| null | null | null | null |
### System Info
This is about a logical bug from an ML point of view. It will not cause crashes, but it significantly influences model behavior.
In the [transformers code of SwitchTransformer](https://github.com/huggingface/transformers/blame/main/src/transformers/models/switch_transformers/modeling_switch_transformers.py#L307), the tensor that collects expert results for an MLP is initialized with the hidden states, then updated via index assignments and finally scaled by the router probabilities.
```
next_states = hidden_states.clone()
...
for idx in idx_mask:
next_states[router_mask[:, :, idx]] = getattr(self.experts, "expert_{}".format(idx))(
hidden_states[router_mask[:, :, idx]]
)
hidden_states = router_probs * next_states
```
While this logic is fine for all tokens that are not dropped, it is wrong for dropped tokens. Setting expert_capacity to zero in an extreme test where all tokens get dropped, the output would be the original hidden states scaled by the probability of the respective expert they were never assigned to. Note that router_probs is not set to zero for dropped tokens. Also note that the residual connection happens at a different stage; it is not related to this part of the code. Why is this wrong from an ML POV?
1. Dropping means that the tokens are not updated. Initializing the result with the hidden state provides an "identity" expert update.
2. The weight should correspond to the respective weight of the expert. Since the update is not executed for this token, the weight should be set to zero, not to the max weight for this token.
3. If an expert drops many tokens, it would partially behave normally and partially behave like the identity function, which can be very different. From an ML point of view, an expert should have only one behavior.
4. The scale of the results can be quite different between an expert and the identity function.
5. While this error does not impact the expert weights, it does influence the router. The quality of dropped tokens is probably degraded and the expert gets a reduced weight, which could result in unexpected load balancing.
This has probably not been noticed so far, because usually only very few tokens are intended to be dropped, so changing this behavior will probably not have much impact in the grand scheme of things. A fix could look like:
```
next_states = torch.zeros(hidden_states.shape, device=hidden_states.device, dtype=hidden_states.dtype)
```
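The effect of the two initializations can be shown with a standalone sketch (NumPy stands in for the torch tensors; shapes and names are illustrative, not the real module). With capacity zero, no expert assignment runs, so the clone-based init leaves dropped tokens as probability-scaled copies while the zero init drops them to exactly zero:

```python
import numpy as np

# All-dropped scenario: 4 tokens, hidden dim 8, router prob 0.7 per token.
hidden_states = np.ones((4, 8))
router_probs = np.full((4, 1), 0.7)
router_mask = np.zeros(4, dtype=bool)  # expert_capacity = 0: nothing assigned

# Current behavior: initialize with a copy of the inputs.
next_states = hidden_states.copy()
# (no expert update runs because router_mask is all False)
buggy_out = router_probs * next_states   # dropped tokens get scaled by 0.7

# Proposed fix: initialize with zeros.
next_states = np.zeros_like(hidden_states)
fixed_out = router_probs * next_states   # dropped tokens are exactly zero
```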
transformers-cli env output (probably not relevant)
- `transformers` version: 4.46.2
- Platform: Linux-6.2.0-1018-aws-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.29.3
- Safetensors version: 0.5.3
- Accelerate version: not installed
- Accelerate config: not found
- PyTorch version (GPU?): 2.6.0+cpu (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
### Who can help?
- text models: @ArthurZucker
- original author: @younesbelkada
- last major changing person: @zucchini-nlp
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
I do not have a nice example yet, but it should look something like:
```
config = SwitchTransformersConfig()
config.expert_capacity = 0
model = SwitchTransformersSparseMLP(config)
shape = (4096, 4096)
seed = 42
generator = torch.Generator().manual_seed(seed)
data = torch.randn(shape, generator=generator, dtype=torch.float32)
assert (model(data)[0] == 0).all(), "All tokens need to be properly dropped."
```
### Expected behavior
The module's output should be all zeros if all tokens are dropped, not some arbitrary scaling of the data.
```
assert (model(data)[0] == 0).all(), "All tokens need to be properly dropped."
```
would probably be the appropriate assertion.
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37017/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37017/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37016
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37016/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37016/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37016/events
|
https://github.com/huggingface/transformers/issues/37016
| 2,950,402,976
|
I_kwDOCUB6oc6v25Og
| 37,016
|
omlab/omdet-turbo-swin-tiny-hf from_pretrained fails to build model
|
{
"login": "Jordan-Pierce",
"id": 115024024,
"node_id": "U_kgDOBtsgmA",
"avatar_url": "https://avatars.githubusercontent.com/u/115024024?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Jordan-Pierce",
"html_url": "https://github.com/Jordan-Pierce",
"followers_url": "https://api.github.com/users/Jordan-Pierce/followers",
"following_url": "https://api.github.com/users/Jordan-Pierce/following{/other_user}",
"gists_url": "https://api.github.com/users/Jordan-Pierce/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Jordan-Pierce/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Jordan-Pierce/subscriptions",
"organizations_url": "https://api.github.com/users/Jordan-Pierce/orgs",
"repos_url": "https://api.github.com/users/Jordan-Pierce/repos",
"events_url": "https://api.github.com/users/Jordan-Pierce/events{/privacy}",
"received_events_url": "https://api.github.com/users/Jordan-Pierce/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-26T17:53:29
| 2025-03-27T13:58:30
| 2025-03-27T13:58:29
|
NONE
| null | null | null | null |
### System Info
Windows 11
python-3.10
tokenizers-0.21.1
transformers-4.50.1
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
>>> from transformers import AutoProcessor, OmDetTurboForObjectDetection
>>> OmDetTurboForObjectDetection.from_pretrained("omlab/omdet-turbo-swin-tiny-hf")
### Expected behavior
For it not to throw this error:
```bash
File "C:\Users\user\Miniconda3\envs\coralnet10\lib\site-packages\timm\models\swin_transformer.py", line 51, in window_partition
x = x.view(B, H // window_size, window_size, W // window_size, window_size, C)
RuntimeError: shape '[1, 22, 7, 22, 7, 1]' is invalid for input of size 25600
```
|
{
"login": "Jordan-Pierce",
"id": 115024024,
"node_id": "U_kgDOBtsgmA",
"avatar_url": "https://avatars.githubusercontent.com/u/115024024?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Jordan-Pierce",
"html_url": "https://github.com/Jordan-Pierce",
"followers_url": "https://api.github.com/users/Jordan-Pierce/followers",
"following_url": "https://api.github.com/users/Jordan-Pierce/following{/other_user}",
"gists_url": "https://api.github.com/users/Jordan-Pierce/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Jordan-Pierce/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Jordan-Pierce/subscriptions",
"organizations_url": "https://api.github.com/users/Jordan-Pierce/orgs",
"repos_url": "https://api.github.com/users/Jordan-Pierce/repos",
"events_url": "https://api.github.com/users/Jordan-Pierce/events{/privacy}",
"received_events_url": "https://api.github.com/users/Jordan-Pierce/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37016/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37016/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37015
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37015/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37015/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37015/events
|
https://github.com/huggingface/transformers/issues/37015
| 2,950,375,321
|
I_kwDOCUB6oc6v2yeZ
| 37,015
|
Add NeoBERT
|
{
"login": "capemox",
"id": 91133513,
"node_id": "MDQ6VXNlcjkxMTMzNTEz",
"avatar_url": "https://avatars.githubusercontent.com/u/91133513?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/capemox",
"html_url": "https://github.com/capemox",
"followers_url": "https://api.github.com/users/capemox/followers",
"following_url": "https://api.github.com/users/capemox/following{/other_user}",
"gists_url": "https://api.github.com/users/capemox/gists{/gist_id}",
"starred_url": "https://api.github.com/users/capemox/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/capemox/subscriptions",
"organizations_url": "https://api.github.com/users/capemox/orgs",
"repos_url": "https://api.github.com/users/capemox/repos",
"events_url": "https://api.github.com/users/capemox/events{/privacy}",
"received_events_url": "https://api.github.com/users/capemox/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null |
[] | 2025-03-26T17:42:50
| 2025-04-12T11:59:56
| null |
CONTRIBUTOR
| null | null | null | null |
### Model description
[NeoBERT](https://arxiv.org/abs/2502.19587) is a next-generation encoder model, outperforming BERT large, RoBERTa large, NomicBERT, and ModernBERT under identical fine-tuning conditions on MTEB.
### Open source status
- [x] The model implementation is available
- [x] The model weights are available
### Provide useful links for the implementation
Code and weights: https://huggingface.co/chandar-lab/NeoBERT
Some of the authors: @lolalebreton, @qfournier
The authors have hosted the code on the HF Hub for now, but they've let me go ahead and provide a PR to transformers. If this is ok, I can make a PR!
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37015/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37015/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37014
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37014/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37014/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37014/events
|
https://github.com/huggingface/transformers/pull/37014
| 2,950,351,264
|
PR_kwDOCUB6oc6QP8bf
| 37,014
|
[generate, cache] handle more complex device maps
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-26T17:34:28
| 2025-03-27T14:33:24
| 2025-03-27T14:33:21
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37014",
"html_url": "https://github.com/huggingface/transformers/pull/37014",
"diff_url": "https://github.com/huggingface/transformers/pull/37014.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37014.patch",
"merged_at": "2025-03-27T14:33:20"
}
|
# What does this PR do?
Related to #36942
We need to allocate static caches on the correct device. In `generate`, we peek at the model's `hf_device_map` to find the correct device for each layer, so that the cache stays on the same device as the layer it is caching.
Previously, given that decoder-only LLMs were the norm, we were assuming that device mappings ending with `(...).some_int` contained the device map for the layer with index `some_int`. This is not necessarily correct: in multimodal models, we can have this pattern for other components of the model, like the vision tower. We need to know the device of the decoder, and not of other parts of the model.
This PR expands the device detection to first look for the decoder module name, and then look for the layer pattern. It relies on `model.get_decoder()` -- models without this method will use the previous device mapping strategy.
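The decoder-prefix lookup described above can be sketched roughly as follows. This is a toy illustration, not the actual `generate` implementation: the helper name `layer_device` and the example device map are made up for this description.

```python
import re


def layer_device(device_map, layer_idx, decoder_prefix=None):
    """Find the device of decoder layer `layer_idx` in an accelerate-style
    device map. When `decoder_prefix` is given (e.g. the name of the module
    returned by `model.get_decoder()`), other indexed submodules such as a
    vision tower are ignored; otherwise any name ending in `.<int>` matches,
    which is the previous (decoder-only) assumption."""
    pattern = re.compile(r"\.(\d+)$")
    for name, device in device_map.items():
        if decoder_prefix is not None and not name.startswith(decoder_prefix):
            continue
        match = pattern.search(name)
        if match and int(match.group(1)) == layer_idx:
            return device
    return None


# Toy multimodal device map: both the vision tower and the decoder
# have submodules whose names end in ".<int>".
device_map = {
    "vision_tower.blocks.0": 0,
    "model.layers.0": 0,
    "model.layers.1": 1,
}

print(layer_device(device_map, 1, decoder_prefix="model.layers"))  # 1
```

With the prefix, only `model.layers.*` entries are considered, so the cache for layer 1 lands on device 1 even though the vision tower also has a `.0`/`.1`-suffixed entry.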
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37014/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37014/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37012
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37012/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37012/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37012/events
|
https://github.com/huggingface/transformers/pull/37012
| 2,950,277,552
|
PR_kwDOCUB6oc6QPsOp
| 37,012
|
Add Fast Chinese-CLIP Processor
|
{
"login": "keetrap",
"id": 103131112,
"node_id": "U_kgDOBiWn6A",
"avatar_url": "https://avatars.githubusercontent.com/u/103131112?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/keetrap",
"html_url": "https://github.com/keetrap",
"followers_url": "https://api.github.com/users/keetrap/followers",
"following_url": "https://api.github.com/users/keetrap/following{/other_user}",
"gists_url": "https://api.github.com/users/keetrap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/keetrap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/keetrap/subscriptions",
"organizations_url": "https://api.github.com/users/keetrap/orgs",
"repos_url": "https://api.github.com/users/keetrap/repos",
"events_url": "https://api.github.com/users/keetrap/events{/privacy}",
"received_events_url": "https://api.github.com/users/keetrap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-03-26T17:02:39
| 2025-04-15T16:31:21
| 2025-04-15T16:31:20
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37012",
"html_url": "https://github.com/huggingface/transformers/pull/37012",
"diff_url": "https://github.com/huggingface/transformers/pull/37012.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37012.patch",
"merged_at": "2025-04-15T16:31:20"
}
|
Related #36978
cc @qubvel @yonigozlan
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37012/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37012/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37011
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37011/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37011/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37011/events
|
https://github.com/huggingface/transformers/issues/37011
| 2,950,253,252
|
I_kwDOCUB6oc6v2UrE
| 37,011
|
Gemma3: `<image_soft_token>` is added accidentally when adding new tokens
|
{
"login": "Serzhanov",
"id": 68291178,
"node_id": "MDQ6VXNlcjY4MjkxMTc4",
"avatar_url": "https://avatars.githubusercontent.com/u/68291178?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Serzhanov",
"html_url": "https://github.com/Serzhanov",
"followers_url": "https://api.github.com/users/Serzhanov/followers",
"following_url": "https://api.github.com/users/Serzhanov/following{/other_user}",
"gists_url": "https://api.github.com/users/Serzhanov/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Serzhanov/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Serzhanov/subscriptions",
"organizations_url": "https://api.github.com/users/Serzhanov/orgs",
"repos_url": "https://api.github.com/users/Serzhanov/repos",
"events_url": "https://api.github.com/users/Serzhanov/events{/privacy}",
"received_events_url": "https://api.github.com/users/Serzhanov/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-03-26T16:54:40
| 2025-04-04T09:05:10
| 2025-04-04T09:05:09
|
NONE
| null | null | null | null |
### System Info
Hello,
When adding custom tokens to the `gemma_3b_1_it` tokenizer, an unexpected token (`<image_soft_token>`) appears in the model's embedding matrix, even though it was not explicitly added.
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Model :
```python
from transformers import AutoTokenizer, BitsAndBytesConfig, Gemma3ForCausalLM
import torch

model_id = "google/gemma-3-1b-it"
quantization_config = BitsAndBytesConfig(load_in_8bit=True)
model = Gemma3ForCausalLM.from_pretrained(
    model_id, quantization_config=quantization_config, token='#your token'
)
tokenizer = AutoTokenizer.from_pretrained(model_id, token='#your token')
```
To Reproduce:
```python
old_input_embedding = model.get_input_embeddings().weight.detach().clone()
old_output_embedding = model.get_output_embeddings().weight.detach().clone()
old_input_length = old_input_embedding.shape[0]

new_tokens = ["<CHARACTER_1>", "<THINKING>", "<SCRATCH_PAD>"]
old_tokenizer_length = len(tokenizer)
tokenizer.add_tokens(new_tokens)
model.resize_token_embeddings(len(tokenizer))

new_input_embedding = model.get_input_embeddings().weight.detach()
new_output_embedding = model.get_output_embeddings().weight.detach()

num_added = new_input_embedding.shape[0] - old_input_length
if num_added > 0:
    new_rows = new_input_embedding[-num_added:]  # New token embeddings
    new_token_ids = range(old_input_length, old_input_length + num_added)
    new_tokens_by_embedding = tokenizer.convert_ids_to_tokens(list(new_token_ids))
    print(f"Based on embeddings, {num_added} new token(s) were added:")
    for token_id, token in zip(new_token_ids, new_tokens_by_embedding):
        print(f" - Token ID {token_id}: '{token}'")
else:
    print("No new embeddings were added (embedding size unchanged)")
```
Output :
```
Based on embeddings, 4 new token(s) were added:
- Token ID 262144: '<image_soft_token>'
- Token ID 262145: '<CHARACTER_1>'
- Token ID 262146: '<THINKING>'
- Token ID 262147: '<SCRATCH_PAD>'
```
### Expected behavior
```
Based on embeddings, 3 new token(s) were added:
- Token ID 262145: '<CHARACTER_1>'
- Token ID 262146: '<THINKING>'
- Token ID 262147: '<SCRATCH_PAD>'
```
|
{
"login": "Serzhanov",
"id": 68291178,
"node_id": "MDQ6VXNlcjY4MjkxMTc4",
"avatar_url": "https://avatars.githubusercontent.com/u/68291178?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Serzhanov",
"html_url": "https://github.com/Serzhanov",
"followers_url": "https://api.github.com/users/Serzhanov/followers",
"following_url": "https://api.github.com/users/Serzhanov/following{/other_user}",
"gists_url": "https://api.github.com/users/Serzhanov/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Serzhanov/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Serzhanov/subscriptions",
"organizations_url": "https://api.github.com/users/Serzhanov/orgs",
"repos_url": "https://api.github.com/users/Serzhanov/repos",
"events_url": "https://api.github.com/users/Serzhanov/events{/privacy}",
"received_events_url": "https://api.github.com/users/Serzhanov/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37011/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37011/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|