| url (string) | repository_url (string) | labels_url (string) | comments_url (string) | events_url (string) | html_url (string) | id (int64) | node_id (string) | number (int64) | title (string) | user (dict) | labels (list) | state (string) | locked (bool) | assignee (dict) | assignees (list) | milestone (null) | comments (list) | created_at (timestamp[ms]) | updated_at (timestamp[ms]) | closed_at (timestamp[ms]) | author_association (string) | type (dict) | active_lock_reason (null) | draft (bool) | pull_request (dict) | body (string) | closed_by (dict) | reactions (dict) | timeline_url (string) | performed_via_github_app (null) | state_reason (string) | sub_issues_summary (dict) | issue_dependencies_summary (dict) | is_pull_request (bool) | is_closed (bool) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/huggingface/transformers/issues/37518
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37518/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37518/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37518/events
|
https://github.com/huggingface/transformers/issues/37518
| 2,995,562,659
|
I_kwDOCUB6oc6yjKij
| 37,518
|
Object of type BitsAndBytesConfig is not JSON serializable error with TensorBoard integration
|
{
"login": "astefanutti",
"id": 366207,
"node_id": "MDQ6VXNlcjM2NjIwNw==",
"avatar_url": "https://avatars.githubusercontent.com/u/366207?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/astefanutti",
"html_url": "https://github.com/astefanutti",
"followers_url": "https://api.github.com/users/astefanutti/followers",
"following_url": "https://api.github.com/users/astefanutti/following{/other_user}",
"gists_url": "https://api.github.com/users/astefanutti/gists{/gist_id}",
"starred_url": "https://api.github.com/users/astefanutti/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/astefanutti/subscriptions",
"organizations_url": "https://api.github.com/users/astefanutti/orgs",
"repos_url": "https://api.github.com/users/astefanutti/repos",
"events_url": "https://api.github.com/users/astefanutti/events{/privacy}",
"received_events_url": "https://api.github.com/users/astefanutti/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-15T08:11:41
| 2025-04-16T09:18:18
| 2025-04-16T09:18:18
|
CONTRIBUTOR
| null | null | null | null |
### System Info
transformers==4.51.3
Python version: 3.11
### Who can help?
@zach-huggingface @SunMarc @MekkCyber
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
When using SFTTrainer with BitsAndBytes and the TensorBoard integration, the TrainingArguments are serialized to JSON, but serialization fails with:
```
[rank0]: Traceback (most recent call last):
[rank0]: main({'model_name_or_path': 'meta-llama/Llama-4-Scout-17B-16E-Instruct', 'model_revision': 'main', 'torch_dtype': 'bfloat16', 'attn_implementation': 'flex_attention', 'use_liger': False, 'use_peft': False, 'lora_r': 16, 'lora_alpha': 8, 'lora_dropout': 0.05, 'lora_target_modules': ['q_proj', 'v_proj', 'k_proj', 'o_proj', 'gate_proj', 'up_proj', 'down_proj'], 'lora_modules_to_save': [], 'load_in_4bit': False, 'load_in_8bit': True, 'dataset_name': 'gsm8k', 'dataset_config': 'main', 'dataset_train_split': 'train', 'dataset_test_split': 'test', 'dataset_text_field': 'text', 'dataset_kwargs': {'add_special_tokens': False, 'append_concat_token': False}, 'max_seq_length': 512, 'dataset_batch_size': 1000, 'packing': False, 'num_train_epochs': 10, 'per_device_train_batch_size': 1, 'per_device_eval_batch_size': 1, 'auto_find_batch_size': False, 'eval_strategy': 'epoch', 'bf16': True, 'tf32': False, 'learning_rate': 0.0002, 'warmup_steps': 10, 'lr_scheduler_type': 'inverse_sqrt', 'optim': 'adamw_torch_fused', 'max_grad_norm': 1.0, 'seed': 42, 'gradient_accumulation_steps': 1, 'gradient_checkpointing': False, 'gradient_checkpointing_kwargs': {'use_reentrant': False}, 'fsdp': 'full_shard auto_wrap', 'fsdp_config': {'activation_checkpointing': True, 'cpu_ram_efficient_loading': False, 'sync_module_states': True, 'use_orig_params': True, 'limit_all_gathers': False}, 'save_strategy': 'epoch', 'save_total_limit': 1, 'resume_from_checkpoint': False, 'log_level': 'info', 'logging_strategy': 'steps', 'logging_steps': 1, 'report_to': ['tensorboard'], 'output_dir': '/mnt/shared/Llama-4-Scout-17B-16E-Instruct'})
[rank0]: File "/tmp/tmp.jsNRcydokN/ephemeral_script.py", line 126, in main
[rank0]: trainer.train(resume_from_checkpoint=checkpoint)
[rank0]: File "/opt/app-root/lib64/python3.11/site-packages/transformers/trainer.py", line 2238, in train
[rank0]: return inner_training_loop(
[rank0]: ^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/app-root/lib64/python3.11/site-packages/transformers/trainer.py", line 2462, in _inner_training_loop
[rank0]: self.control = self.callback_handler.on_train_begin(args, self.state, self.control)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/app-root/lib64/python3.11/site-packages/transformers/trainer_callback.py", line 506, in on_train_begin
[rank0]: return self.call_event("on_train_begin", args, state, control)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/app-root/lib64/python3.11/site-packages/transformers/trainer_callback.py", line 556, in call_event
[rank0]: result = getattr(callback, event)(
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/app-root/lib64/python3.11/site-packages/transformers/integrations/integration_utils.py", line 698, in on_train_begin
[rank0]: self.tb_writer.add_text("args", args.to_json_string())
[rank0]: ^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/app-root/lib64/python3.11/site-packages/transformers/training_args.py", line 2509, in to_json_string
[rank0]: return json.dumps(self.to_dict(), indent=2)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/usr/lib64/python3.11/json/__init__.py", line 238, in dumps
[rank0]: **kw).encode(obj)
[rank0]: ^^^^^^^^^^^
[rank0]: File "/usr/lib64/python3.11/json/encoder.py", line 202, in encode
[rank0]: chunks = list(chunks)
[rank0]: ^^^^^^^^^^^^
[rank0]: File "/usr/lib64/python3.11/json/encoder.py", line 432, in _iterencode
[rank0]: yield from _iterencode_dict(o, _current_indent_level)
[rank0]: File "/usr/lib64/python3.11/json/encoder.py", line 406, in _iterencode_dict
[rank0]: yield from chunks
[rank0]: File "/usr/lib64/python3.11/json/encoder.py", line 406, in _iterencode_dict
[rank0]: yield from chunks
[rank0]: File "/usr/lib64/python3.11/json/encoder.py", line 439, in _iterencode
[rank0]: o = _default(o)
[rank0]: ^^^^^^^^^^^
[rank0]: File "/usr/lib64/python3.11/json/encoder.py", line 180, in default
[rank0]: raise TypeError(f'Object of type {o.__class__.__name__} '
[rank0]: TypeError: Object of type BitsAndBytesConfig is not JSON serializable
```
### Expected behavior
The BitsAndBytesConfig should be converted to a dict before the TrainingArguments are serialized.
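A minimal sketch of the idea (the `FakeQuantConfig` class and `safe_default` helper are stand-ins for illustration, not the actual transformers fix): passing a `default=` hook to `json.dumps` lets any nested config object that exposes `to_dict()` be serialized instead of raising `TypeError`.

```python
import json

class FakeQuantConfig:
    """Stand-in for a config object (e.g. BitsAndBytesConfig) that json can't encode directly."""
    def __init__(self):
        self.load_in_8bit = True

    def to_dict(self):
        return {"load_in_8bit": self.load_in_8bit}

def safe_default(obj):
    # Fall back to the object's own to_dict() when it has one, else its repr,
    # so json.dumps never raises TypeError on nested config objects.
    if hasattr(obj, "to_dict"):
        return obj.to_dict()
    return repr(obj)

args_dict = {"output_dir": "/tmp/run", "quantization_config": FakeQuantConfig()}
# Without default=safe_default this call raises:
#   TypeError: Object of type FakeQuantConfig is not JSON serializable
serialized = json.dumps(args_dict, default=safe_default, indent=2)
```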
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37518/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37518/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37517
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37517/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37517/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37517/events
|
https://github.com/huggingface/transformers/pull/37517
| 2,995,540,633
|
PR_kwDOCUB6oc6SnyRX
| 37,517
|
[qwen-omni] fix training
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-15T08:02:57
| 2025-04-24T10:52:07
| 2025-04-22T10:36:07
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37517",
"html_url": "https://github.com/huggingface/transformers/pull/37517",
"diff_url": "https://github.com/huggingface/transformers/pull/37517.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37517.patch",
"merged_at": "2025-04-22T10:36:07"
}
|
# What does this PR do?
Fixes #37513 and fixes #37515. Ready for review!
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37517/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37517/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37516
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37516/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37516/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37516/events
|
https://github.com/huggingface/transformers/pull/37516
| 2,995,461,261
|
PR_kwDOCUB6oc6Sngz4
| 37,516
|
enable several cases on XPU
|
{
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-15T07:35:13
| 2025-04-16T22:39:11
| 2025-04-16T09:01:05
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37516",
"html_url": "https://github.com/huggingface/transformers/pull/37516",
"diff_url": "https://github.com/huggingface/transformers/pull/37516.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37516.patch",
"merged_at": "2025-04-16T09:01:05"
}
|
With this PR:
1. autoawq cases: 9 pass, 1 skip (expected: XPU doesn't support exllama, so all optimized ops go through the IPEX backend), 2 fail
2. peft_integration cases: 2 pass, 1 fail (due to a bnb issue)
3. test_sdpa_can_dispatch_on_flash: 19 pass, 1 fail

Failing cases:
tests/models/diffllama/test_modeling_diffllama.py::DiffLlamaModelTest::test_sdpa_can_dispatch_on_flash
tests/peft_integration/test_peft_integration.py::PeftIntegrationTester::test_peft_from_pretrained_kwargs
tests/quantization/autoawq/test_awq.py::AwqTest::test_quantized_model_bf16
tests/quantization/autoawq/test_awq.py::AwqTest::test_quantized_model_multi_gpu

We will follow up on these 4 failing cases and submit fixes in separate PRs.
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37516/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37516/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37515
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37515/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37515/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37515/events
|
https://github.com/huggingface/transformers/issues/37515
| 2,995,395,643
|
I_kwDOCUB6oc6yihw7
| 37,515
|
AttributeError: 'Qwen2_5OmniConfig' object has no attribute 'num_attention_heads'
|
{
"login": "jieguolove",
"id": 96612712,
"node_id": "U_kgDOBcIxaA",
"avatar_url": "https://avatars.githubusercontent.com/u/96612712?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jieguolove",
"html_url": "https://github.com/jieguolove",
"followers_url": "https://api.github.com/users/jieguolove/followers",
"following_url": "https://api.github.com/users/jieguolove/following{/other_user}",
"gists_url": "https://api.github.com/users/jieguolove/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jieguolove/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jieguolove/subscriptions",
"organizations_url": "https://api.github.com/users/jieguolove/orgs",
"repos_url": "https://api.github.com/users/jieguolove/repos",
"events_url": "https://api.github.com/users/jieguolove/events{/privacy}",
"received_events_url": "https://api.github.com/users/jieguolove/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-15T07:09:01
| 2025-04-22T10:36:09
| 2025-04-22T10:36:09
|
NONE
| null | null | null | null |
### System Info
root@445d74596699:/vllm-workspace# transformers-cli env
Copy-and-paste the text below in your GitHub issue and FILL OUT the two last points.
- `transformers` version: 4.52.0.dev0
- Platform: Linux-5.15.0-43-generic-x86_64-with-glibc2.35
- Python version: 3.12.9
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.5.2
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA L20
(base) root@node15:/disk2/Qwen2.5-Omni-7B# more docker-compose.yml
#version: '3.3'
services:
# vllm
vllm-openai:
image: vllm/vllm-openai:v0.8.2
container_name: Qwen2.5-Omni-7B
restart: unless-stopped
runtime: nvidia
ports:
- 8007:8000
volumes:
- /disk2:/models
command: >
--model /models/Qwen2.5-Omni-7B
--tokenizer_mode="auto"
--trust-remote-code
--dtype=bfloat16
--max_num_seqs=256
--tensor_parallel_size=1
--gpu-memory-utilization=0.9
--max-model-len=65536
--served-model-name=Qwen2.5-Omni-7B
deploy:
resources:
reservations:
devices:
- driver: nvidia
capabilities: [gpu]
device_ids: [ "1" ]
ipc: host
networks:
vllm:
(base) root@node15:/disk2/Qwen2.5-Omni-7B# docker commit 445d74596699 vllm/vllm-openai:v0.8.2
sha256:fdf1171c4bc4edc473bb3857597124ae73176c1691a27befccb4360c81ff0d60
(base) root@node15:/disk2/Qwen2.5-Omni-7B# docker compose -f docker-compose.yml up -d
[+] Running 2/2
✔ Network qwen25-omni-7b_default Created 0.0s
✔ Container Qwen2.5-Omni-7B Started 0.6s
(base) root@node15:/disk2/Qwen2.5-Omni-7B# docker logs -f Qwen2.5-Omni-7B
INFO 04-15 00:06:11 [__init__.py:239] Automatically detected platform cuda.
INFO 04-15 00:06:13 [api_server.py:981] vLLM API server version 0.8.2
INFO 04-15 00:06:13 [api_server.py:982] args: Namespace(host=None, port=8000, uvicorn_log_level='info', disable_uvicorn_access_log=False, allow_credentials=False, allowed_origins=['*'], allowed_methods=['*'], allowed_headers=['*'], api_key=None, lora_modules=None, prompt_adapters=None, chat_template=None, chat_template_content_format='auto', response_role='assistant', ssl_keyfile=None, ssl_certfile=None, ssl_ca_certs=None, enable_ssl_refresh=False, ssl_cert_reqs=0, root_path=None, middleware=[], return_tokens_as_token_ids=False, disable_frontend_multiprocessing=False, enable_request_id_headers=False, enable_auto_tool_choice=False, tool_call_parser=None, tool_parser_plugin='', model='/models/Qwen2.5-Omni-7B', task='auto', tokenizer=None, hf_config_path=None, skip_tokenizer_init=False, revision=None, code_revision=None, tokenizer_revision=None, tokenizer_mode='auto', trust_remote_code=True, allowed_local_media_path=None, download_dir=None, load_format='auto', config_format=<ConfigFormat.AUTO: 'auto'>, dtype='bfloat16', kv_cache_dtype='auto', max_model_len=65536, guided_decoding_backend='xgrammar', logits_processor_pattern=None, model_impl='auto', distributed_executor_backend=None, pipeline_parallel_size=1, tensor_parallel_size=1, enable_expert_parallel=False, max_parallel_loading_workers=None, ray_workers_use_nsight=False, block_size=None, enable_prefix_caching=None, disable_sliding_window=False, use_v2_block_manager=True, num_lookahead_slots=0, seed=None, swap_space=4, cpu_offload_gb=0, gpu_memory_utilization=0.9, num_gpu_blocks_override=None, max_num_batched_tokens=None, max_num_partial_prefills=1, max_long_partial_prefills=1, long_prefill_token_threshold=0, max_num_seqs=256, max_logprobs=20, disable_log_stats=False, quantization=None, rope_scaling=None, rope_theta=None, hf_overrides=None, enforce_eager=False, max_seq_len_to_capture=8192, disable_custom_all_reduce=False, tokenizer_pool_size=0, tokenizer_pool_type='ray', tokenizer_pool_extra_config=None, 
limit_mm_per_prompt=None, mm_processor_kwargs=None, disable_mm_preprocessor_cache=False, enable_lora=False, enable_lora_bias=False, max_loras=1, max_lora_rank=16, lora_extra_vocab_size=256, lora_dtype='auto', long_lora_scaling_factors=None, max_cpu_loras=None, fully_sharded_loras=False, enable_prompt_adapter=False, max_prompt_adapters=1, max_prompt_adapter_token=0, device='auto', num_scheduler_steps=1, use_tqdm_on_load=True, multi_step_stream_outputs=True, scheduler_delay_factor=0.0, enable_chunked_prefill=None, speculative_config=None, speculative_model=None, speculative_model_quantization=None, num_speculative_tokens=None, speculative_disable_mqa_scorer=False, speculative_draft_tensor_parallel_size=None, speculative_max_model_len=None, speculative_disable_by_batch_size=None, ngram_prompt_lookup_max=None, ngram_prompt_lookup_min=None, spec_decoding_acceptance_method='rejection_sampler', typical_acceptance_sampler_posterior_threshold=None, typical_acceptance_sampler_posterior_alpha=None, disable_logprobs_during_spec_decoding=None, model_loader_extra_config=None, ignore_patterns=[], preemption_mode=None, served_model_name=['Qwen2.5-Omni-7B'], qlora_adapter_name_or_path=None, show_hidden_metrics_for_version=None, otlp_traces_endpoint=None, collect_detailed_traces=None, disable_async_output_proc=False, scheduling_policy='fcfs', scheduler_cls='vllm.core.scheduler.Scheduler', override_neuron_config=None, override_pooler_config=None, compilation_config=None, kv_transfer_config=None, worker_cls='auto', worker_extension_cls='', generation_config='auto', override_generation_config=None, enable_sleep_mode=False, calculate_kv_scales=False, additional_config=None, enable_reasoning=False, reasoning_parser=None, disable_cascade_attn=False, disable_log_requests=False, max_log_len=None, disable_fastapi_docs=False, enable_prompt_tokens_details=False, enable_server_load_tracking=False)
Unrecognized keys in `rope_scaling` for 'rope_type'='default': {'mrope_section'}
INFO 04-15 00:06:22 [config.py:585] This model supports multiple tasks: {'reward', 'generate', 'classify', 'score', 'embed'}. Defaulting to 'generate'.
INFO 04-15 00:06:22 [config.py:1697] Chunked prefill is enabled with max_num_batched_tokens=2048.
INFO 04-15 00:06:24 [core.py:54] Initializing a V1 LLM engine (v0.8.2) with config: model='/models/Qwen2.5-Omni-7B', speculative_config=None, tokenizer='/models/Qwen2.5-Omni-7B', skip_tokenizer_init=False, tokenizer_mode=auto, revision=None, override_neuron_config=None, tokenizer_revision=None, trust_remote_code=True, dtype=torch.bfloat16, max_seq_len=65536, download_dir=None, load_format=LoadFormat.AUTO, tensor_parallel_size=1, pipeline_parallel_size=1, disable_custom_all_reduce=False, quantization=None, enforce_eager=False, kv_cache_dtype=auto, device_config=cuda, decoding_config=DecodingConfig(guided_decoding_backend='xgrammar', reasoning_backend=None), observability_config=ObservabilityConfig(show_hidden_metrics=False, otlp_traces_endpoint=None, collect_model_forward_time=False, collect_model_execute_time=False), seed=None, served_model_name=Qwen2.5-Omni-7B, num_scheduler_steps=1, multi_step_stream_outputs=True, enable_prefix_caching=True, chunked_prefill_enabled=True, use_async_output_proc=True, disable_mm_preprocessor_cache=False, mm_processor_kwargs=None, pooler_config=None, compilation_config={"level":3,"custom_ops":["none"],"splitting_ops":["vllm.unified_attention","vllm.unified_attention_with_output"],"use_inductor":true,"compile_sizes":[],"use_cudagraph":true,"cudagraph_num_of_warmups":1,"cudagraph_capture_sizes":[512,504,496,488,480,472,464,456,448,440,432,424,416,408,400,392,384,376,368,360,352,344,336,328,320,312,304,296,288,280,272,264,256,248,240,232,224,216,208,200,192,184,176,168,160,152,144,136,128,120,112,104,96,88,80,72,64,56,48,40,32,24,16,8,4,2,1],"max_capture_size":512}
WARNING 04-15 00:06:25 [utils.py:2321] Methods determine_num_available_blocks,device_config,get_cache_block_size_bytes,initialize_cache not implemented in <vllm.v1.worker.gpu_worker.Worker object at 0x7fabea685df0>
INFO 04-15 00:06:26 [parallel_state.py:954] rank 0 in world size 1 is assigned as DP rank 0, PP rank 0, TP rank 0
ERROR 04-15 00:06:26 [core.py:343] EngineCore hit an exception: Traceback (most recent call last):
ERROR 04-15 00:06:26 [core.py:343] File "/usr/local/lib/python3.12/dist-packages/vllm/v1/engine/core.py", line 335, in run_engine_core
ERROR 04-15 00:06:26 [core.py:343] engine_core = EngineCoreProc(*args, **kwargs)
ERROR 04-15 00:06:26 [core.py:343] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ERROR 04-15 00:06:26 [core.py:343] File "/usr/local/lib/python3.12/dist-packages/vllm/v1/engine/core.py", line 290, in __init__
ERROR 04-15 00:06:26 [core.py:343] super().__init__(vllm_config, executor_class, log_stats)
ERROR 04-15 00:06:26 [core.py:343] File "/usr/local/lib/python3.12/dist-packages/vllm/v1/engine/core.py", line 60, in __init__
ERROR 04-15 00:06:26 [core.py:343] self.model_executor = executor_class(vllm_config)
ERROR 04-15 00:06:26 [core.py:343] ^^^^^^^^^^^^^^^^^^^^^^^^^^^
ERROR 04-15 00:06:26 [core.py:343] File "/usr/local/lib/python3.12/dist-packages/vllm/executor/executor_base.py", line 52, in __init__
ERROR 04-15 00:06:26 [core.py:343] self._init_executor()
ERROR 04-15 00:06:26 [core.py:343] File "/usr/local/lib/python3.12/dist-packages/vllm/executor/uniproc_executor.py", line 46, in _init_executor
ERROR 04-15 00:06:26 [core.py:343] self.collective_rpc("init_device")
ERROR 04-15 00:06:26 [core.py:343] File "/usr/local/lib/python3.12/dist-packages/vllm/executor/uniproc_executor.py", line 56, in collective_rpc
ERROR 04-15 00:06:26 [core.py:343] answer = run_method(self.driver_worker, method, args, kwargs)
ERROR 04-15 00:06:26 [core.py:343] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ERROR 04-15 00:06:26 [core.py:343] File "/usr/local/lib/python3.12/dist-packages/vllm/utils.py", line 2255, in run_method
ERROR 04-15 00:06:26 [core.py:343] return func(*args, **kwargs)
ERROR 04-15 00:06:26 [core.py:343] ^^^^^^^^^^^^^^^^^^^^^
ERROR 04-15 00:06:26 [core.py:343] File "/usr/local/lib/python3.12/dist-packages/vllm/worker/worker_base.py", line 604, in init_device
ERROR 04-15 00:06:26 [core.py:343] self.worker.init_device() # type: ignore
ERROR 04-15 00:06:26 [core.py:343] ^^^^^^^^^^^^^^^^^^^^^^^^^
ERROR 04-15 00:06:26 [core.py:343] File "/usr/local/lib/python3.12/dist-packages/vllm/v1/worker/gpu_worker.py", line 120, in init_device
ERROR 04-15 00:06:26 [core.py:343] self.model_runner: GPUModelRunner = GPUModelRunner(
ERROR 04-15 00:06:26 [core.py:343] ^^^^^^^^^^^^^^^
ERROR 04-15 00:06:26 [core.py:343] File "/usr/local/lib/python3.12/dist-packages/vllm/v1/worker/gpu_model_runner.py", line 106, in __init__
ERROR 04-15 00:06:26 [core.py:343] self.num_kv_heads = model_config.get_num_kv_heads(parallel_config)
ERROR 04-15 00:06:26 [core.py:343] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ERROR 04-15 00:06:26 [core.py:343] File "/usr/local/lib/python3.12/dist-packages/vllm/config.py", line 884, in get_num_kv_heads
ERROR 04-15 00:06:26 [core.py:343] total_num_kv_heads = self.get_total_num_kv_heads()
ERROR 04-15 00:06:26 [core.py:343] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ERROR 04-15 00:06:26 [core.py:343] File "/usr/local/lib/python3.12/dist-packages/vllm/config.py", line 876, in get_total_num_kv_heads
ERROR 04-15 00:06:26 [core.py:343] return self.hf_text_config.num_attention_heads
ERROR 04-15 00:06:26 [core.py:343] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ERROR 04-15 00:06:26 [core.py:343] File "/usr/local/lib/python3.12/dist-packages/transformers/configuration_utils.py", line 211, in __getattribute__
ERROR 04-15 00:06:26 [core.py:343] return super().__getattribute__(key)
ERROR 04-15 00:06:26 [core.py:343] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ERROR 04-15 00:06:26 [core.py:343] AttributeError: 'Qwen2_5OmniConfig' object has no attribute 'num_attention_heads'
ERROR 04-15 00:06:26 [core.py:343]
CRITICAL 04-15 00:06:26 [core_client.py:269] Got fatal signal from worker processes, shutting down. See stack trace above for root cause issue.
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction

### Expected behavior
no error
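A minimal sketch of why the lookup fails (stand-in classes, not the real Qwen2_5OmniConfig): in composite multimodal configs the text model's attributes live on a nested sub-config, so a top-level `num_attention_heads` access raises `AttributeError`, and consumers need to resolve the nested text config first.

```python
class TextConfig:
    # Nested text-model config carrying the attention attributes.
    num_attention_heads = 28

class CompositeConfig:
    # Hypothetical stand-in for a composite config like Qwen2_5OmniConfig:
    # the attribute lives on the nested text config, not the top level.
    def __init__(self):
        self.text_config = TextConfig()

def get_text_config(cfg):
    # Resolve the nested text config if present, else assume cfg is it.
    return getattr(cfg, "text_config", cfg)

cfg = CompositeConfig()
heads = get_text_config(cfg).num_attention_heads  # 28, instead of AttributeError
```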
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37515/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37515/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37514
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37514/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37514/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37514/events
|
https://github.com/huggingface/transformers/pull/37514
| 2,995,288,846
|
PR_kwDOCUB6oc6Sm7Q_
| 37,514
|
enable test_offloaded_cache_implementation test case on XPU
|
{
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-15T06:24:53
| 2025-04-16T22:36:06
| 2025-04-16T09:04:57
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37514",
"html_url": "https://github.com/huggingface/transformers/pull/37514",
"diff_url": "https://github.com/huggingface/transformers/pull/37514.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37514.patch",
"merged_at": "2025-04-16T09:04:57"
}
|
78 cases passed, 1 case (tests/models/dbrx/test_modeling_dbrx.py::DbrxModelTest::test_offloaded_cache_implementation_0_offloaded) failed; the failed case will be fixed in a separate PR.
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37514/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37514/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37513
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37513/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37513/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37513/events
|
https://github.com/huggingface/transformers/issues/37513
| 2,995,163,467
|
I_kwDOCUB6oc6yhpFL
| 37,513
|
Qwen2_5Omni training forward issue
|
{
"login": "Kuangdd01",
"id": 82590017,
"node_id": "MDQ6VXNlcjgyNTkwMDE3",
"avatar_url": "https://avatars.githubusercontent.com/u/82590017?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Kuangdd01",
"html_url": "https://github.com/Kuangdd01",
"followers_url": "https://api.github.com/users/Kuangdd01/followers",
"following_url": "https://api.github.com/users/Kuangdd01/following{/other_user}",
"gists_url": "https://api.github.com/users/Kuangdd01/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Kuangdd01/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Kuangdd01/subscriptions",
"organizations_url": "https://api.github.com/users/Kuangdd01/orgs",
"repos_url": "https://api.github.com/users/Kuangdd01/repos",
"events_url": "https://api.github.com/users/Kuangdd01/events{/privacy}",
"received_events_url": "https://api.github.com/users/Kuangdd01/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-15T05:22:59
| 2025-04-22T10:36:08
| 2025-04-22T10:36:08
|
CONTRIBUTOR
| null | null | null | null |
I encountered this error when post-training Qwen2_5Omni; I believe it is just a typo.
https://github.com/huggingface/transformers/blob/4e63a1747ce6a4b5f75e8d2318857c2b76c3ba23/src/transformers/models/qwen2_5_omni/modeling_qwen2_5_omni.py#L2505
```
AttributeError: 'Qwen2_5OmniThinkerConfig' object has no attribute 'vocab_size'
```
The fix is to read `vocab_size` from the model instead of the config:
```python
loss = self.loss_function(logits=logits, labels=labels, vocab_size=self.vocab_size)
```
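For context, the suggested change reads the value the model cached at init time instead of reaching into a composite config that never defines it. A minimal sketch of the difference (names hypothetical, not the actual model code):

```python
class TextConfig:
    def __init__(self, vocab_size=32000):
        self.vocab_size = vocab_size


class ThinkerConfig:
    """Composite config: vocab_size lives only on the nested text config."""

    def __init__(self):
        self.text_config = TextConfig()


class ThinkerModel:
    def __init__(self, config):
        self.config = config
        # Cache vocab_size on the model itself, mirroring self.vocab_size above.
        self.vocab_size = config.text_config.vocab_size


model = ThinkerModel(ThinkerConfig())
assert not hasattr(model.config, "vocab_size")  # config.vocab_size would raise
assert model.vocab_size == 32000                # the safe lookup used in the fix
```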
Another question: can we add `self.post_init()` to the initialization of `Qwen2_5OmniForConditionalGeneration`? Without it, I hit another error (`model._tp_plan is None`) when trying to train this model on multiple GPUs.
``` python
def __init__(self, config):
super().__init__(config)
self.thinker = Qwen2_5OmniThinkerForConditionalGeneration(config.thinker_config)
self.has_talker = config.enable_audio_output
self.speaker_map = {}
if config.enable_audio_output:
self.enable_talker()
self.post_init() # Will this line lead to unexpected behaviors?
```
@zucchini-nlp
## env
```
- `transformers` version: 4.52.0.dev0
- Platform: Linux-5.15.0-134-generic-x86_64-with-glibc2.39
- Python version: 3.10.0
- Huggingface_hub version: 0.30.0
- Safetensors version: 0.5.3
- Accelerate version: 1.4.0
- DeepSpeed version: 0.16.5
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- GPU type: NVIDIA A100-SXM4-40GB
```
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37513/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37513/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37512
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37512/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37512/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37512/events
|
https://github.com/huggingface/transformers/pull/37512
| 2,995,107,775
|
PR_kwDOCUB6oc6SmTRU
| 37,512
|
Update tokenization_utils_base.py
|
{
"login": "foldl",
"id": 4046440,
"node_id": "MDQ6VXNlcjQwNDY0NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/4046440?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/foldl",
"html_url": "https://github.com/foldl",
"followers_url": "https://api.github.com/users/foldl/followers",
"following_url": "https://api.github.com/users/foldl/following{/other_user}",
"gists_url": "https://api.github.com/users/foldl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/foldl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/foldl/subscriptions",
"organizations_url": "https://api.github.com/users/foldl/orgs",
"repos_url": "https://api.github.com/users/foldl/repos",
"events_url": "https://api.github.com/users/foldl/events{/privacy}",
"received_events_url": "https://api.github.com/users/foldl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-15T04:51:18
| 2025-06-02T09:13:21
| null |
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37512",
"html_url": "https://github.com/huggingface/transformers/pull/37512",
"diff_url": "https://github.com/huggingface/transformers/pull/37512.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37512.patch",
"merged_at": null
}
|
select encoding explicitly.
# What does this PR do?
Always use `utf-8` when reading template files. Without this, on Windows, where the default encoding might not be `utf-8`, `utf-8`-encoded template files cannot be read correctly.
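The change described can be illustrated with a minimal round-trip sketch; a bare `open(path)` uses the locale's default encoding (often cp1252 on Windows), so an explicit `encoding` argument is what makes the read portable:

```python
import os
import tempfile

# Write a chat template containing non-ASCII characters as UTF-8.
template = "{{ '<|user|>' }} héllo 模板"
path = os.path.join(tempfile.mkdtemp(), "chat_template.jinja")
with open(path, "w", encoding="utf-8") as f:
    f.write(template)

# Reading with an explicit encoding is portable across platforms; without it,
# the locale default is used and may fail or mangle the non-ASCII characters.
with open(path, encoding="utf-8") as f:
    round_tripped = f.read()

assert round_tripped == template
```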
@ArthurZucker @Rocketknight1
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37512/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37512/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37524
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37524/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37524/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37524/events
|
https://github.com/huggingface/transformers/issues/37524
| 2,995,949,386
|
I_kwDOCUB6oc6yko9K
| 37,524
|
A type error in the Template writing document
|
{
"login": "Vegetabledog-BUAA",
"id": 79221214,
"node_id": "MDQ6VXNlcjc5MjIxMjE0",
"avatar_url": "https://avatars.githubusercontent.com/u/79221214?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Vegetabledog-BUAA",
"html_url": "https://github.com/Vegetabledog-BUAA",
"followers_url": "https://api.github.com/users/Vegetabledog-BUAA/followers",
"following_url": "https://api.github.com/users/Vegetabledog-BUAA/following{/other_user}",
"gists_url": "https://api.github.com/users/Vegetabledog-BUAA/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Vegetabledog-BUAA/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Vegetabledog-BUAA/subscriptions",
"organizations_url": "https://api.github.com/users/Vegetabledog-BUAA/orgs",
"repos_url": "https://api.github.com/users/Vegetabledog-BUAA/repos",
"events_url": "https://api.github.com/users/Vegetabledog-BUAA/events{/privacy}",
"received_events_url": "https://api.github.com/users/Vegetabledog-BUAA/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-15T03:11:24
| 2025-06-08T08:02:46
| 2025-06-08T08:02:46
|
NONE
| null | null | null | null | ERROR: type should be string, got "https://huggingface.co/docs/transformers/main/en/chat_templating_writing\n\nIn the example template as shown below.\n{%- for message in messages %}\n {{- '<|' + message['role'] + |>\\n' }}\n {{- message['content'] + eos_token }}\n{%- endfor %}\n{%- if add_generation_prompt %}\n {{- '<|assistant|>\\n' }}\n{%- endif %}\n\nA quote is missing in the template before the |>\\n' which after the message['role'].\n\n{%- for message in messages %}\n {{- '<|' + message['role'] + '|>\\n' }}\n {{- message['content'] + eos_token }}\n{%- endfor %}\n{%- if add_generation_prompt %}\n {{- '<|assistant|>\\n' }}\n{%- endif %}"
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37524/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37524/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37511
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37511/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37511/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37511/events
|
https://github.com/huggingface/transformers/pull/37511
| 2,994,864,581
|
PR_kwDOCUB6oc6SlcSS
| 37,511
|
fix: qwen2.5 omni apply_chat_template system content check
|
{
"login": "weedge",
"id": 1203957,
"node_id": "MDQ6VXNlcjEyMDM5NTc=",
"avatar_url": "https://avatars.githubusercontent.com/u/1203957?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/weedge",
"html_url": "https://github.com/weedge",
"followers_url": "https://api.github.com/users/weedge/followers",
"following_url": "https://api.github.com/users/weedge/following{/other_user}",
"gists_url": "https://api.github.com/users/weedge/gists{/gist_id}",
"starred_url": "https://api.github.com/users/weedge/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/weedge/subscriptions",
"organizations_url": "https://api.github.com/users/weedge/orgs",
"repos_url": "https://api.github.com/users/weedge/repos",
"events_url": "https://api.github.com/users/weedge/events{/privacy}",
"received_events_url": "https://api.github.com/users/weedge/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-15T02:47:29
| 2025-08-08T02:50:35
| null |
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37511",
"html_url": "https://github.com/huggingface/transformers/pull/37511",
"diff_url": "https://github.com/huggingface/transformers/pull/37511.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37511.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
fix: qwen2.5 omni apply_chat_template system content check
test case:
```python
def thinker_inference_stream(
messages,
use_audio_in_video=False,
speaker=DEFAULT_SPEAKER,
):
text = processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
# image_inputs, video_inputs = process_vision_info([messages])
audios, images, videos = process_mm_info(messages, use_audio_in_video=use_audio_in_video)
inputs = processor(
text=text,
audio=audios,
images=images,
videos=videos,
return_tensors="pt",
padding=True,
use_audio_in_video=use_audio_in_video,
)
inputs = inputs.to(model.device).to(model.dtype)
streamer = TextIteratorStreamer(processor, skip_prompt=True, skip_special_tokens=True)
generation_kwargs = dict(
**inputs,
streamer=streamer,
use_audio_in_video=use_audio_in_video,
return_audio=False,
speaker=speaker,
thinker_do_sample=True,
# do_sample=True,
top_k=10,
top_p=0.9,
temperature=0.95,
repetition_penalty=1.1,
min_new_tokens=0,
max_new_tokens=2048,
)
thread = Thread(target=model.generate, kwargs=generation_kwargs)
thread.start()
generated_text = ""
times = []
start_time = perf_counter()
for new_text in streamer:
times.append(perf_counter() - start_time)
start_time = perf_counter()
generated_text += new_text
yield new_text
print(
f"generate [{generated_text}] first token cost time: {times[0]} s, {len(times)} tokens cost time: {sum(times)} s"
)
torch.cuda.empty_cache()
def asr_stream():
for case in [
{
"audio_path": "1272-128104-0000.flac",
"prompt": "Listen to the provided English speech and produce a translation in Chinese text.",
"sys_prompt": "You are a speech translation model.",
},
{
"audio_path": "BAC009S0764W0121.wav",
"prompt": "请将这段中文语音转换为纯文本,去掉标点符号。",
"sys_prompt": [{"type": "text", "text": "You are a speech recognition model."}],
},
]:
audio_path = os.path.join(ASSETS_DIR, case["audio_path"])
messages = [
{"role": "system", "content": case["sys_prompt"]},
{
"role": "user",
"content": [
{"type": "text", "text": case["prompt"]},
{"type": "audio", "audio": audio_path},
],
},
]
text_streamer = thinker_inference_stream(messages, use_audio_in_video=True)
for text in text_streamer:
print(text)
```
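The two `sys_prompt` shapes exercised above (a plain string vs. a list of `{"type": "text"}` parts) are presumably what the PR's system content check needs to accept. A hedged sketch of such normalization (helper name hypothetical, not the transformers implementation):

```python
def normalize_text_content(content):
    """Return the text of a message's content, whether it is a plain string
    or a list of typed parts, as in the two test cases above."""
    if isinstance(content, str):
        return content
    if isinstance(content, list):
        # Concatenate the text of every {"type": "text", "text": ...} part.
        return "".join(
            part.get("text", "")
            for part in content
            if isinstance(part, dict) and part.get("type") == "text"
        )
    raise TypeError(f"unsupported content type: {type(content).__name__}")


assert normalize_text_content("You are a speech translation model.") \
    == "You are a speech translation model."
assert normalize_text_content([{"type": "text", "text": "You are a speech recognition model."}]) \
    == "You are a speech recognition model."
```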
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37511/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37511/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37510
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37510/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37510/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37510/events
|
https://github.com/huggingface/transformers/issues/37510
| 2,994,673,813
|
I_kwDOCUB6oc6yfxiV
| 37,510
|
Trainer num_tokens() function seems to be outdated and not correct
|
{
"login": "taras-sereda",
"id": 7364100,
"node_id": "MDQ6VXNlcjczNjQxMDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/7364100?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taras-sereda",
"html_url": "https://github.com/taras-sereda",
"followers_url": "https://api.github.com/users/taras-sereda/followers",
"following_url": "https://api.github.com/users/taras-sereda/following{/other_user}",
"gists_url": "https://api.github.com/users/taras-sereda/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taras-sereda/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taras-sereda/subscriptions",
"organizations_url": "https://api.github.com/users/taras-sereda/orgs",
"repos_url": "https://api.github.com/users/taras-sereda/repos",
"events_url": "https://api.github.com/users/taras-sereda/events{/privacy}",
"received_events_url": "https://api.github.com/users/taras-sereda/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-15T00:59:04
| 2025-05-24T08:02:22
| 2025-05-24T08:02:22
|
NONE
| null | null | null | null |
### System Info
I was reading the Trainer source, and the logic of `num_tokens()` seemed unclear to me. Why the multiplication by `max_steps`?
https://github.com/huggingface/transformers/blob/main/src/transformers/trainer.py#L1771
Here is my PR https://github.com/huggingface/transformers/pull/37509 that attempts to make it clearer.
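For background on why the multiplication can mislead: a per-step token count times a step count is only exact when every batch has the same number of tokens. A toy illustration with plain lists standing in for batches of `input_ids` (not the Trainer code):

```python
# Batches of token ids, deliberately ragged, standing in for dataloader output.
batches = [[[1, 2, 3], [4, 5]], [[6], [7]]]

# Exact count: sum over every sequence in every batch.
exact = sum(len(seq) for batch in batches for seq in batch)

# Estimate in the style the issue questions: one step's count * number of steps.
per_step = sum(len(seq) for seq in batches[0])
estimate = per_step * len(batches)

assert exact == 7
assert estimate == 10  # overestimates because the later batch is smaller
```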
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
~
### Expected behavior
~
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37510/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37510/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37509
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37509/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37509/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37509/events
|
https://github.com/huggingface/transformers/pull/37509
| 2,994,662,235
|
PR_kwDOCUB6oc6Skvie
| 37,509
|
[fix] Trainer num_tokens() count
|
{
"login": "taras-sereda",
"id": 7364100,
"node_id": "MDQ6VXNlcjczNjQxMDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/7364100?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taras-sereda",
"html_url": "https://github.com/taras-sereda",
"followers_url": "https://api.github.com/users/taras-sereda/followers",
"following_url": "https://api.github.com/users/taras-sereda/following{/other_user}",
"gists_url": "https://api.github.com/users/taras-sereda/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taras-sereda/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taras-sereda/subscriptions",
"organizations_url": "https://api.github.com/users/taras-sereda/orgs",
"repos_url": "https://api.github.com/users/taras-sereda/repos",
"events_url": "https://api.github.com/users/taras-sereda/events{/privacy}",
"received_events_url": "https://api.github.com/users/taras-sereda/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-15T00:50:15
| 2025-04-15T13:38:41
| null |
NONE
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37509",
"html_url": "https://github.com/huggingface/transformers/pull/37509",
"diff_url": "https://github.com/huggingface/transformers/pull/37509.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37509.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37509/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37509/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37508
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37508/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37508/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37508/events
|
https://github.com/huggingface/transformers/pull/37508
| 2,994,661,343
|
PR_kwDOCUB6oc6SkvWE
| 37,508
|
Allow override inputs to export recipe
|
{
"login": "guangy10",
"id": 42389959,
"node_id": "MDQ6VXNlcjQyMzg5OTU5",
"avatar_url": "https://avatars.githubusercontent.com/u/42389959?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/guangy10",
"html_url": "https://github.com/guangy10",
"followers_url": "https://api.github.com/users/guangy10/followers",
"following_url": "https://api.github.com/users/guangy10/following{/other_user}",
"gists_url": "https://api.github.com/users/guangy10/gists{/gist_id}",
"starred_url": "https://api.github.com/users/guangy10/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/guangy10/subscriptions",
"organizations_url": "https://api.github.com/users/guangy10/orgs",
"repos_url": "https://api.github.com/users/guangy10/repos",
"events_url": "https://api.github.com/users/guangy10/events{/privacy}",
"received_events_url": "https://api.github.com/users/guangy10/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-15T00:49:16
| 2025-04-30T08:19:33
| 2025-04-30T08:19:27
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37508",
"html_url": "https://github.com/huggingface/transformers/pull/37508",
"diff_url": "https://github.com/huggingface/transformers/pull/37508.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37508.patch",
"merged_at": "2025-04-30T08:19:27"
}
|
# What does this PR do?
Enable dynamism at the seq_len dim in order to utilize parallel `prefill` in the ExecuTorch runtime. In this PR,
- allow the caller side to override example inputs, dynamic shapes, and the strict flag, while keeping the defaults unchanged for BC
- add a unit test covering export with dynamic shapes and `strict=False`, as that is the mainstream path in the latest version of `torch.export`
- make the unit test non-slow, to avoid it being skipped on PRs and causing regressions
- add a test for `HybridCache`
Tests
`pytest tests/utils/test_cache_utils.py -vv -s -k cache_exportability`
```
collected 23 items / 20 deselected / 3 selected
tests/utils/test_cache_utils.py::CacheExportIntegrationTest::test_dynamic_cache_exportability PASSED [ 33%]
tests/utils/test_cache_utils.py::CacheExportIntegrationTest::test_hybrid_cache_exportability PASSED [ 66%]
tests/utils/test_cache_utils.py::CacheExportIntegrationTest::test_static_cache_exportability PASSED [100%]
```
## Before submitting
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. Needed by the downstream in [optimum-executorch](https://github.com/huggingface/optimum-executorch). https://github.com/huggingface/optimum-executorch/issues/53
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you add new tests? Yes
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@ArthurZucker @ydshieh
CC: @tugsbayasgalan
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37508/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37508/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37507
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37507/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37507/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37507/events
|
https://github.com/huggingface/transformers/pull/37507
| 2,994,565,159
|
PR_kwDOCUB6oc6SkZuu
| 37,507
|
enable 5 cases on XPU
|
{
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T23:53:17
| 2025-04-16T22:34:12
| 2025-04-16T07:28:02
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37507",
"html_url": "https://github.com/huggingface/transformers/pull/37507",
"diff_url": "https://github.com/huggingface/transformers/pull/37507.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37507.patch",
"merged_at": "2025-04-16T07:28:02"
}
|
**command**
`pytest -rA tests/models/speecht5/test_modeling_speecht5.py::SpeechT5ForTextToSpeechIntegrationTests::test_batch_generation`
`pytest -rA tests/models/glm/test_modeling_glm.py::GlmIntegrationTest::test_model_9b_bf16`
`pytest -rA tests/models/glm/test_modeling_glm.py::GlmIntegrationTest::test_model_9b_eager`
`pytest -rA tests/models/glm/test_modeling_glm.py::GlmIntegrationTest::test_model_9b_fp16`
`pytest -rA tests/models/glm/test_modeling_glm.py::GlmIntegrationTest::test_model_9b_sdpa`
**results**
all pass on XPU
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37507/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37507/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37506
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37506/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37506/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37506/events
|
https://github.com/huggingface/transformers/pull/37506
| 2,994,446,749
|
PR_kwDOCUB6oc6Sj_IC
| 37,506
|
Added scikit-learn to the example image-classification requirements.txt
|
{
"login": "mylonjones",
"id": 70530658,
"node_id": "MDQ6VXNlcjcwNTMwNjU4",
"avatar_url": "https://avatars.githubusercontent.com/u/70530658?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mylonjones",
"html_url": "https://github.com/mylonjones",
"followers_url": "https://api.github.com/users/mylonjones/followers",
"following_url": "https://api.github.com/users/mylonjones/following{/other_user}",
"gists_url": "https://api.github.com/users/mylonjones/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mylonjones/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mylonjones/subscriptions",
"organizations_url": "https://api.github.com/users/mylonjones/orgs",
"repos_url": "https://api.github.com/users/mylonjones/repos",
"events_url": "https://api.github.com/users/mylonjones/events{/privacy}",
"received_events_url": "https://api.github.com/users/mylonjones/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T22:49:15
| 2025-06-24T13:24:03
| 2025-06-24T13:24:03
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37506",
"html_url": "https://github.com/huggingface/transformers/pull/37506",
"diff_url": "https://github.com/huggingface/transformers/pull/37506.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37506.patch",
"merged_at": "2025-06-24T13:24:03"
}
|
# Fix dependency in examples/pytorch/image-classification/requirements.txt
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
I ran into an error when trying to run the image-classification example with the following program
`transformers\examples\pytorch\image-classification\run_image_classification.py`
First I ran `pip install -r requirements.txt` and tried to run the program using the example in the README.md
```
python run_image_classification.py --dataset_name beans --output_dir ./beans_outputs/ --remove_unused_columns False --label_column_name labels --do_train --do_eval --push_to_hub --push_to_hub_model_id vit-base-beans --learning_rate 2e-5 --num_train_epochs 5 --per_device_train_batch_size 8 --per_device_eval_batch_size 8 --logging_strategy steps --logging_steps 10 --eval_strategy epoch --save_strategy epoch --load_best_model_at_end True --save_total_limit 3 --seed 1337
```
Then I got this error.
```
Traceback (most recent call last):
File "C:\Users\mylon\transformers\examples\pytorch\image-classification\run_image_classification.py", line 441, in <module>
main()
~~~~^^
File "C:\Users\mylon\transformers\examples\pytorch\image-classification\run_image_classification.py", line 294, in main
metric = evaluate.load("accuracy", cache_dir=model_args.cache_dir)
File "C:\Users\mylon\.venv\Lib\site-packages\evaluate\loading.py", line 751, in load
evaluation_cls = import_main_class(evaluation_module.module_path)
File "C:\Users\mylon\.venv\Lib\site-packages\evaluate\loading.py", line 76, in import_main_class
module = importlib.import_module(module_path)
File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.13_3.13.1008.0_x64__qbz5n2kfra8p0\Lib\importlib\__init__.py", line 88, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 1026, in exec_module
File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
File "C:\Users\mylon\.cache\huggingface\modules\evaluate_modules\metrics\evaluate-metric--accuracy\f887c0aab52c2d38e1f8a215681126379eca617f96c447638f751434e8e65b14\accuracy.py", line 17, in <module>
from sklearn.metrics import accuracy_score
ModuleNotFoundError: No module named 'sklearn'
```
So I ran `pip install sklearn` and got this error.
```
Collecting sklearn
Using cached sklearn-0.0.post12.tar.gz (2.6 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... error
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [15 lines of output]
The 'sklearn' PyPI package is deprecated, use 'scikit-learn'
rather than 'sklearn' for pip commands.
Here is how to fix this error in the main use cases:
- use 'pip install scikit-learn' rather than 'pip install sklearn'
- replace 'sklearn' by 'scikit-learn' in your pip requirements files
(requirements.txt, setup.py, setup.cfg, Pipfile, etc ...)
- if the 'sklearn' package is used by one of your dependencies,
it would be great if you take some time to track which package uses
'sklearn' instead of 'scikit-learn' and report it to their issue tracker
- as a last resort, set the environment variable
SKLEARN_ALLOW_DEPRECATED_SKLEARN_PACKAGE_INSTALL=True to avoid this error
More information is available at
https://github.com/scikit-learn/sklearn-pypi-package
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
```
So finally I added scikit-learn to the requirements.txt file and installed it and now the program works just fine.
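As a quick sanity check that the renamed dependency works, here is a minimal sketch of what the `accuracy` metric computes once `scikit-learn` is installed (toy labels for illustration, not the actual example script):

```python
from sklearn.metrics import accuracy_score

# Toy references/predictions; the evaluate "accuracy" metric delegates to
# this same scikit-learn function under the hood (see the traceback above).
references = [0, 1, 0, 0]
predictions = [0, 1, 1, 0]
print(accuracy_score(references, predictions))  # 0.75 (3 of 4 labels match)
```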
Here are my system details.
- `transformers` version: 4.52.0.dev0
- Platform: Windows-10-10.0.19045-SP0
- Python version: 3.13.3
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cpu (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: NO
I would like to make this contribution because I think it would save people some time when trying out this example program.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37506/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37506/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37505
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37505/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37505/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37505/events
|
https://github.com/huggingface/transformers/issues/37505
| 2,994,445,992
|
I_kwDOCUB6oc6ye56o
| 37,505
|
Tensor parallel support for LLM training.
|
{
"login": "czkkkkkk",
"id": 17905585,
"node_id": "MDQ6VXNlcjE3OTA1NTg1",
"avatar_url": "https://avatars.githubusercontent.com/u/17905585?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/czkkkkkk",
"html_url": "https://github.com/czkkkkkk",
"followers_url": "https://api.github.com/users/czkkkkkk/followers",
"following_url": "https://api.github.com/users/czkkkkkk/following{/other_user}",
"gists_url": "https://api.github.com/users/czkkkkkk/gists{/gist_id}",
"starred_url": "https://api.github.com/users/czkkkkkk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/czkkkkkk/subscriptions",
"organizations_url": "https://api.github.com/users/czkkkkkk/orgs",
"repos_url": "https://api.github.com/users/czkkkkkk/repos",
"events_url": "https://api.github.com/users/czkkkkkk/events{/privacy}",
"received_events_url": "https://api.github.com/users/czkkkkkk/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] |
open
| false
| null |
[] | null |
[] | 2025-04-14T22:48:42
| 2025-04-15T06:01:04
| null |
NONE
| null | null | null | null |
### Feature request
Hi HF team,
I'm wondering about the current status of tensor parallelism (TP) support in Hugging Face. I've noticed that some standard models, such as llama4 and mixtral, include TP sharding plans, and `.from_pretrained` appears to support [loading models with a TP plan](https://github.com/huggingface/transformers/blob/4e63a1747ce6a4b5f75e8d2318857c2b76c3ba23/src/transformers/integrations/tensor_parallel.py#L620). So it seems that TP is supported for inference.
However, I'm curious about training support. Does the transformers library support TP combined with data parallelism (DP) during training? Also, it looks like `.save_pretrained` doesn't currently support saving TP-sharded models—can you confirm if that's the case, or if there's a workaround?
Thanks in advance!
### Motivation
To support large-scale LLM training with TP.
### Your contribution
Happy to contribute if there is a specific way to support TP+DP training.
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37505/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37505/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37504
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37504/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37504/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37504/events
|
https://github.com/huggingface/transformers/issues/37504
| 2,994,221,540
|
I_kwDOCUB6oc6yeDHk
| 37,504
|
4.51.3 is much faster than the previous version - do you see the same?
|
{
"login": "Oxi84",
"id": 25420033,
"node_id": "MDQ6VXNlcjI1NDIwMDMz",
"avatar_url": "https://avatars.githubusercontent.com/u/25420033?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Oxi84",
"html_url": "https://github.com/Oxi84",
"followers_url": "https://api.github.com/users/Oxi84/followers",
"following_url": "https://api.github.com/users/Oxi84/following{/other_user}",
"gists_url": "https://api.github.com/users/Oxi84/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Oxi84/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Oxi84/subscriptions",
"organizations_url": "https://api.github.com/users/Oxi84/orgs",
"repos_url": "https://api.github.com/users/Oxi84/repos",
"events_url": "https://api.github.com/users/Oxi84/events{/privacy}",
"received_events_url": "https://api.github.com/users/Oxi84/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T21:08:19
| 2025-05-24T08:02:25
| 2025-05-24T08:02:25
|
NONE
| null | null | null | null |
### System Info
I tested a script for T5, and the speed I see is almost 2x faster than it was with 4.50.
Was there a significant change?
Thanks
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
speed test
t5 base
### Expected behavior
speed the same
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37504/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37504/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37503
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37503/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37503/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37503/events
|
https://github.com/huggingface/transformers/pull/37503
| 2,994,048,334
|
PR_kwDOCUB6oc6Sil94
| 37,503
|
Adding BitNet b1.58 Model
|
{
"login": "shumingma",
"id": 8328033,
"node_id": "MDQ6VXNlcjgzMjgwMzM=",
"avatar_url": "https://avatars.githubusercontent.com/u/8328033?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/shumingma",
"html_url": "https://github.com/shumingma",
"followers_url": "https://api.github.com/users/shumingma/followers",
"following_url": "https://api.github.com/users/shumingma/following{/other_user}",
"gists_url": "https://api.github.com/users/shumingma/gists{/gist_id}",
"starred_url": "https://api.github.com/users/shumingma/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shumingma/subscriptions",
"organizations_url": "https://api.github.com/users/shumingma/orgs",
"repos_url": "https://api.github.com/users/shumingma/repos",
"events_url": "https://api.github.com/users/shumingma/events{/privacy}",
"received_events_url": "https://api.github.com/users/shumingma/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T19:45:16
| 2025-04-29T09:43:16
| 2025-04-29T09:43:16
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37503",
"html_url": "https://github.com/huggingface/transformers/pull/37503",
"diff_url": "https://github.com/huggingface/transformers/pull/37503.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37503.patch",
"merged_at": null
}
|
# What does this PR do?
This PR adds support for the upcoming BitNet b1.58 model from Microsoft. @ArthurZucker
https://arxiv.org/abs/2402.17764
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37503/reactions",
"total_count": 4,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 3,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37503/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37502
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37502/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37502/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37502/events
|
https://github.com/huggingface/transformers/issues/37502
| 2,993,925,236
|
I_kwDOCUB6oc6yc6x0
| 37,502
|
Add resume checkpoint support to ClearML callback
|
{
"login": "aarbelle",
"id": 14240313,
"node_id": "MDQ6VXNlcjE0MjQwMzEz",
"avatar_url": "https://avatars.githubusercontent.com/u/14240313?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aarbelle",
"html_url": "https://github.com/aarbelle",
"followers_url": "https://api.github.com/users/aarbelle/followers",
"following_url": "https://api.github.com/users/aarbelle/following{/other_user}",
"gists_url": "https://api.github.com/users/aarbelle/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aarbelle/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aarbelle/subscriptions",
"organizations_url": "https://api.github.com/users/aarbelle/orgs",
"repos_url": "https://api.github.com/users/aarbelle/repos",
"events_url": "https://api.github.com/users/aarbelle/events{/privacy}",
"received_events_url": "https://api.github.com/users/aarbelle/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] |
open
| false
| null |
[] | null |
[] | 2025-04-14T18:55:29
| 2025-04-20T05:58:07
| null |
NONE
| null | null | null | null |
### Feature request
Link ClearML task when resuming from checkpoint
### Motivation
Currently, when resuming from a checkpoint, ClearML overwrites the task and starts a new one, so a lot of the data and history is lost.
### Your contribution
I will try to see if I can fix it and submit a PR
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37502/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37502/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37501
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37501/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37501/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37501/events
|
https://github.com/huggingface/transformers/pull/37501
| 2,993,652,205
|
PR_kwDOCUB6oc6ShOkR
| 37,501
|
Change default value of `attn_temperature_tuning`
|
{
"login": "gmlwns2000",
"id": 4879345,
"node_id": "MDQ6VXNlcjQ4NzkzNDU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4879345?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gmlwns2000",
"html_url": "https://github.com/gmlwns2000",
"followers_url": "https://api.github.com/users/gmlwns2000/followers",
"following_url": "https://api.github.com/users/gmlwns2000/following{/other_user}",
"gists_url": "https://api.github.com/users/gmlwns2000/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gmlwns2000/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gmlwns2000/subscriptions",
"organizations_url": "https://api.github.com/users/gmlwns2000/orgs",
"repos_url": "https://api.github.com/users/gmlwns2000/repos",
"events_url": "https://api.github.com/users/gmlwns2000/events{/privacy}",
"received_events_url": "https://api.github.com/users/gmlwns2000/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T17:03:13
| 2025-04-15T10:10:39
| 2025-04-15T10:10:38
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37501",
"html_url": "https://github.com/huggingface/transformers/pull/37501",
"diff_url": "https://github.com/huggingface/transformers/pull/37501.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37501.patch",
"merged_at": "2025-04-15T10:10:38"
}
|
# What does this PR do?
Fixes #37479
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@Cyrilvallez @Rocketknight1
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37501/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37501/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37500
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37500/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37500/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37500/events
|
https://github.com/huggingface/transformers/pull/37500
| 2,993,580,988
|
PR_kwDOCUB6oc6Sg-0z
| 37,500
|
Don't auto-assign reviewers when the author is in HF
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T16:33:38
| 2025-04-14T17:17:40
| 2025-04-14T17:17:39
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37500",
"html_url": "https://github.com/huggingface/transformers/pull/37500",
"diff_url": "https://github.com/huggingface/transformers/pull/37500.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37500.patch",
"merged_at": "2025-04-14T17:17:39"
}
|
As the title says, we don't want to auto-assign reviewers for staff authors because they can do it themselves (and probably more accurately)
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37500/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37500/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37499
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37499/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37499/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37499/events
|
https://github.com/huggingface/transformers/pull/37499
| 2,993,497,747
|
PR_kwDOCUB6oc6Sgsrg
| 37,499
|
Fix broken add-fast-image-processor CLI
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T15:57:09
| 2025-04-15T16:50:22
| 2025-04-15T16:50:22
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37499",
"html_url": "https://github.com/huggingface/transformers/pull/37499",
"diff_url": "https://github.com/huggingface/transformers/pull/37499.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37499.patch",
"merged_at": "2025-04-15T16:50:21"
}
|
# What does this PR do?
Following the refactor of the transformers __init__ file and dummy objects, some actions in the add-fast-image-processor CLI are not needed anymore.
Thanks @cjfghk5697 for flagging this in https://github.com/huggingface/transformers/pull/37496
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37499/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37499/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37498
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37498/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37498/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37498/events
|
https://github.com/huggingface/transformers/pull/37498
| 2,993,479,999
|
PR_kwDOCUB6oc6Sgo0X
| 37,498
|
:red_circle: Update CLIP vision attention to new attention interface
|
{
"login": "molbap",
"id": 39954772,
"node_id": "MDQ6VXNlcjM5OTU0Nzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/molbap",
"html_url": "https://github.com/molbap",
"followers_url": "https://api.github.com/users/molbap/followers",
"following_url": "https://api.github.com/users/molbap/following{/other_user}",
"gists_url": "https://api.github.com/users/molbap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/molbap/subscriptions",
"organizations_url": "https://api.github.com/users/molbap/orgs",
"repos_url": "https://api.github.com/users/molbap/repos",
"events_url": "https://api.github.com/users/molbap/events{/privacy}",
"received_events_url": "https://api.github.com/users/molbap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T15:49:41
| 2025-04-28T13:08:20
| 2025-04-16T16:15:22
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37498",
"html_url": "https://github.com/huggingface/transformers/pull/37498",
"diff_url": "https://github.com/huggingface/transformers/pull/37498.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37498.patch",
"merged_at": "2025-04-16T16:15:22"
}
|
# What does this PR do?
As per title. Removes the old `ATTENTION_FUNCTIONS` definitions and aligns with the current standard, similar to the recent siglip changes by @qubvel. I mostly did it because it will unbloat the `modular` version of molmo in #33962.
I ported the changes to all models copying the CLIP backbone, and fixed some outdated `# Copied from` comments and typos.
Note: this will also correctly set the returned attention weights to `None` when `output_attentions` is passed as `False`.
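For context, the new-style interface resolves the attention backend by name from a shared registry instead of per-model attention classes. A minimal standalone sketch of that dispatch pattern (names and return values are illustrative, not the actual transformers internals):

```python
from typing import Any, Callable, Dict

# Illustrative registry; not the real transformers ALL_ATTENTION_FUNCTIONS.
ATTENTION_FUNCTIONS: Dict[str, Callable] = {}

def register_attention(name: str):
    """Decorator that registers an attention implementation under a name."""
    def decorator(fn: Callable) -> Callable:
        ATTENTION_FUNCTIONS[name] = fn
        return fn
    return decorator

@register_attention("eager")
def eager_attention(query: Any, key: Any, value: Any) -> str:
    # Placeholder for the plain matmul/softmax implementation.
    return "eager result"

def attention_forward(impl: str, query: Any, key: Any, value: Any):
    # Models select the backend by name (e.g. from config._attn_implementation).
    return ATTENTION_FUNCTIONS[impl](query, key, value)

print(attention_forward("eager", None, None, None))  # eager result
```

Keeping one registry lets every model pick up a new backend (e.g. flex attention) without touching the model file itself.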
|
{
"login": "molbap",
"id": 39954772,
"node_id": "MDQ6VXNlcjM5OTU0Nzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/molbap",
"html_url": "https://github.com/molbap",
"followers_url": "https://api.github.com/users/molbap/followers",
"following_url": "https://api.github.com/users/molbap/following{/other_user}",
"gists_url": "https://api.github.com/users/molbap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/molbap/subscriptions",
"organizations_url": "https://api.github.com/users/molbap/orgs",
"repos_url": "https://api.github.com/users/molbap/repos",
"events_url": "https://api.github.com/users/molbap/events{/privacy}",
"received_events_url": "https://api.github.com/users/molbap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37498/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37498/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37497
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37497/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37497/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37497/events
|
https://github.com/huggingface/transformers/pull/37497
| 2,993,452,617
|
PR_kwDOCUB6oc6Sgizn
| 37,497
|
Fix pixel attention mask padding in smolvlm
|
{
"login": "ManuelFay",
"id": 43467008,
"node_id": "MDQ6VXNlcjQzNDY3MDA4",
"avatar_url": "https://avatars.githubusercontent.com/u/43467008?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ManuelFay",
"html_url": "https://github.com/ManuelFay",
"followers_url": "https://api.github.com/users/ManuelFay/followers",
"following_url": "https://api.github.com/users/ManuelFay/following{/other_user}",
"gists_url": "https://api.github.com/users/ManuelFay/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ManuelFay/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ManuelFay/subscriptions",
"organizations_url": "https://api.github.com/users/ManuelFay/orgs",
"repos_url": "https://api.github.com/users/ManuelFay/repos",
"events_url": "https://api.github.com/users/ManuelFay/events{/privacy}",
"received_events_url": "https://api.github.com/users/ManuelFay/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T15:38:22
| 2025-04-16T18:48:47
| 2025-04-16T18:48:47
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37497",
"html_url": "https://github.com/huggingface/transformers/pull/37497",
"diff_url": "https://github.com/huggingface/transformers/pull/37497.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37497.patch",
"merged_at": "2025-04-16T18:48:47"
}
|
# What does this PR do?
The binary attention mask is initialized as zeros in float64, while in `make_pixel_mask` the ones and zeros are int64.
```python
def make_pixel_mask(
image: np.ndarray, output_size: Tuple[int, int], input_data_format: Optional[Union[str, ChannelDimension]] = None
) -> np.ndarray:
"""
Make a pixel mask for the image, where 1 indicates a valid pixel and 0 indicates padding.
Args:
image (`np.ndarray`):
Image to make the pixel mask for.
output_size (`Tuple[int, int]`):
Output size of the mask.
"""
input_height, input_width = get_image_size(image, channel_dim=input_data_format)
mask = np.zeros(output_size, dtype=np.int64)
mask[:input_height, :input_width] = 1
return mask
```
This causes either int64 or float64 mask values, depending on whether inputs are padded. It breaks on MPS devices, since they don't support float64.
By the way @andimarafioti, we could just cast to np.bool if masks are binary, but I guess there is a reason it was done in np.int64?
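A standalone numpy sketch of the dtype mismatch described above (the shapes are hypothetical, chosen only to mirror a padded vs. unpadded mask):

```python
import numpy as np

# Padded path: make_pixel_mask builds an explicit int64 mask
# (4x4 output size with a 2x3 valid region, purely illustrative).
padded_mask = np.zeros((4, 4), dtype=np.int64)
padded_mask[:2, :3] = 1

# Unpadded path: np.zeros without an explicit dtype defaults to float64,
# which MPS devices do not support.
default_mask = np.zeros((4, 4))

print(padded_mask.dtype, default_mask.dtype)  # int64 float64
```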
## Who can review?
@orrzohar @andimarafioti @zucchini-nlp
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37497/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37497/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37496
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37496/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37496/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37496/events
|
https://github.com/huggingface/transformers/pull/37496
| 2,993,170,677
|
PR_kwDOCUB6oc6SflKy
| 37,496
|
Fix IndexError in add_import_statement_init
|
{
"login": "cjfghk5697",
"id": 80466735,
"node_id": "MDQ6VXNlcjgwNDY2NzM1",
"avatar_url": "https://avatars.githubusercontent.com/u/80466735?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cjfghk5697",
"html_url": "https://github.com/cjfghk5697",
"followers_url": "https://api.github.com/users/cjfghk5697/followers",
"following_url": "https://api.github.com/users/cjfghk5697/following{/other_user}",
"gists_url": "https://api.github.com/users/cjfghk5697/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cjfghk5697/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cjfghk5697/subscriptions",
"organizations_url": "https://api.github.com/users/cjfghk5697/orgs",
"repos_url": "https://api.github.com/users/cjfghk5697/repos",
"events_url": "https://api.github.com/users/cjfghk5697/events{/privacy}",
"received_events_url": "https://api.github.com/users/cjfghk5697/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T13:59:46
| 2025-04-16T03:15:09
| 2025-04-16T03:15:09
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37496",
"html_url": "https://github.com/huggingface/transformers/pull/37496",
"diff_url": "https://github.com/huggingface/transformers/pull/37496.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37496.patch",
"merged_at": null
}
|
# What does this PR do?
#36978
This PR fixes an issue encountered when running the command:
```bash
transformers-cli add-fast-image-processor --model-name bridgetower
```
During BridgeTower fast image processor integration, an error occurred due to the indent calculation logic in the function `add_import_statement_init`. The original code assumed that at least two lines exist in the block it is processing:
```python
indent = " " * (len(lines[1]) - len(lines[1].lstrip()))
```
In some cases (for example, when the block is very short), `lines[1]` did not exist, which resulted in an `IndexError`.
To address this, the code now checks the length of the `lines` list and computes the indent conditionally. If there is more than one line, it uses the second line; if there is only one line, it uses that one; and if the block is empty, it defaults to an empty indent.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Details
The change in the indent calculation is as follows:
```python
if len(lines) > 1:
indent = " " * (len(lines[1]) - len(lines[1].lstrip()))
elif lines:
indent = " " * (len(lines[0]) - len(lines[0].lstrip()))
else:
indent = ""
```
This fix ensures that the function correctly computes the indent regardless of the number of lines present in the import block. This error was discovered while working on BridgeTower and has been verified to resolve the `IndexError`.
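As a standalone illustration (the function name below is made up; it is not the actual transformers helper), the guarded computation behaves as follows:

```python
# Illustrative standalone version of the guarded indent computation.
def compute_indent(lines):
    if len(lines) > 1:
        return " " * (len(lines[1]) - len(lines[1].lstrip()))
    elif lines:
        return " " * (len(lines[0]) - len(lines[0].lstrip()))
    return ""

# A single-line import block no longer raises IndexError,
# and an empty block falls back to no indent:
print(repr(compute_indent(["    from .foo import Bar"])))  # '    '
print(repr(compute_indent([])))  # ''
```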
## Who can review?
Anyone in the community familiar with the CLI commands or the internal code of the Transformers repository is welcome to review this change.
|
{
"login": "cjfghk5697",
"id": 80466735,
"node_id": "MDQ6VXNlcjgwNDY2NzM1",
"avatar_url": "https://avatars.githubusercontent.com/u/80466735?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cjfghk5697",
"html_url": "https://github.com/cjfghk5697",
"followers_url": "https://api.github.com/users/cjfghk5697/followers",
"following_url": "https://api.github.com/users/cjfghk5697/following{/other_user}",
"gists_url": "https://api.github.com/users/cjfghk5697/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cjfghk5697/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cjfghk5697/subscriptions",
"organizations_url": "https://api.github.com/users/cjfghk5697/orgs",
"repos_url": "https://api.github.com/users/cjfghk5697/repos",
"events_url": "https://api.github.com/users/cjfghk5697/events{/privacy}",
"received_events_url": "https://api.github.com/users/cjfghk5697/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37496/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37496/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37495
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37495/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37495/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37495/events
|
https://github.com/huggingface/transformers/issues/37495
| 2,993,160,476
|
I_kwDOCUB6oc6yaAEc
| 37,495
|
Refactor bert-based models to use global attention function
|
{
"login": "Marcel256",
"id": 11369824,
"node_id": "MDQ6VXNlcjExMzY5ODI0",
"avatar_url": "https://avatars.githubusercontent.com/u/11369824?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Marcel256",
"html_url": "https://github.com/Marcel256",
"followers_url": "https://api.github.com/users/Marcel256/followers",
"following_url": "https://api.github.com/users/Marcel256/following{/other_user}",
"gists_url": "https://api.github.com/users/Marcel256/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Marcel256/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Marcel256/subscriptions",
"organizations_url": "https://api.github.com/users/Marcel256/orgs",
"repos_url": "https://api.github.com/users/Marcel256/repos",
"events_url": "https://api.github.com/users/Marcel256/events{/privacy}",
"received_events_url": "https://api.github.com/users/Marcel256/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] |
open
| false
| null |
[] | null |
[] | 2025-04-14T13:56:11
| 2025-05-23T07:29:38
| null |
NONE
| null | null | null | null |
### Feature request
Refactoring of the attention modules in bert-based models to use global attention function
### Motivation
Enabling easier support of SDPA and flash attention while minimizing code duplication in Bert copies
### Your contribution
I already created a draft PR #37494 to outline the changes required. Would love to get feedback and would continue working on this PR if needed
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37495/reactions",
"total_count": 8,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 5,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37495/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37494
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37494/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37494/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37494/events
|
https://github.com/huggingface/transformers/pull/37494
| 2,993,121,992
|
PR_kwDOCUB6oc6Sfam0
| 37,494
|
[WIP] Refactor attention modules in Bert-based models to use global attention functions
|
{
"login": "Marcel256",
"id": 11369824,
"node_id": "MDQ6VXNlcjExMzY5ODI0",
"avatar_url": "https://avatars.githubusercontent.com/u/11369824?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Marcel256",
"html_url": "https://github.com/Marcel256",
"followers_url": "https://api.github.com/users/Marcel256/followers",
"following_url": "https://api.github.com/users/Marcel256/following{/other_user}",
"gists_url": "https://api.github.com/users/Marcel256/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Marcel256/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Marcel256/subscriptions",
"organizations_url": "https://api.github.com/users/Marcel256/orgs",
"repos_url": "https://api.github.com/users/Marcel256/repos",
"events_url": "https://api.github.com/users/Marcel256/events{/privacy}",
"received_events_url": "https://api.github.com/users/Marcel256/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-14T13:42:25
| 2025-04-14T13:42:25
| null |
NONE
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37494",
"html_url": "https://github.com/huggingface/transformers/pull/37494",
"diff_url": "https://github.com/huggingface/transformers/pull/37494.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37494.patch",
"merged_at": null
}
|
# What does this PR do?
Refactors the attention modules in BERT-based models to use global attention functions.
This allows support of SDPA and flash attention for BERT models, and potentially flex attention in the future.
This is a first draft (feedback appreciated).
Could solve:
#37105
#29129
#27957
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37494/reactions",
"total_count": 4,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37494/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37493
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37493/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37493/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37493/events
|
https://github.com/huggingface/transformers/pull/37493
| 2,993,035,391
|
PR_kwDOCUB6oc6SfHqw
| 37,493
|
[qwen-omni] fix processor
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T13:11:22
| 2025-04-14T15:30:32
| 2025-04-14T15:30:31
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37493",
"html_url": "https://github.com/huggingface/transformers/pull/37493",
"diff_url": "https://github.com/huggingface/transformers/pull/37493.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37493.patch",
"merged_at": "2025-04-14T15:30:31"
}
|
# What does this PR do?
Fixes #37491 and deflakes a test
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37493/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37493/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37492
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37492/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37492/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37492/events
|
https://github.com/huggingface/transformers/issues/37492
| 2,992,990,529
|
I_kwDOCUB6oc6yZWlB
| 37,492
|
module 'transformers_modules.DeepSeek-V3-BF16.configuration_deepseek' has no attribute 'DeepseekV3Config'
|
{
"login": "chenxiaodong2002",
"id": 157867681,
"node_id": "U_kgDOCWjeoQ",
"avatar_url": "https://avatars.githubusercontent.com/u/157867681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chenxiaodong2002",
"html_url": "https://github.com/chenxiaodong2002",
"followers_url": "https://api.github.com/users/chenxiaodong2002/followers",
"following_url": "https://api.github.com/users/chenxiaodong2002/following{/other_user}",
"gists_url": "https://api.github.com/users/chenxiaodong2002/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chenxiaodong2002/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chenxiaodong2002/subscriptions",
"organizations_url": "https://api.github.com/users/chenxiaodong2002/orgs",
"repos_url": "https://api.github.com/users/chenxiaodong2002/repos",
"events_url": "https://api.github.com/users/chenxiaodong2002/events{/privacy}",
"received_events_url": "https://api.github.com/users/chenxiaodong2002/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T12:54:40
| 2025-06-18T06:14:54
| 2025-04-30T03:42:15
|
NONE
| null | null | null | null |
### System Info
transformers==4.51.2
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
I used the following code and commands to load DeepSeek-V3 on 4*8 A100
```
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained('./DeepSeek-V3-BF16', trust_remote_code=True, device_map='auto', torch_dtype=torch.bfloat16)
```
`torchrun --nproc_per_node=8 --nnodes=4 --node_rank=${RANK} --rdzv-id deepseek --rdzv-backend c10d --rdzv-endpoint ${MASTER_ADDR}:${MASTER_PORT} test.py`
but encountered the following error:
```
model = AutoModelForCausalLM.from_pretrained('./DeepSeek-V3-BF16', trust_remote_code=True, device_map="auto", torch_dtype=torch.bfloat16)
  File "/opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 526, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1063, in from_pretrained
    config_class = get_class_from_dynamic_module(
  File "/opt/conda/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 553, in get_class_from_dynamic_module
    return get_class_in_module(class_name, final_module, force_reload=force_download)
  File "/opt/conda/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 252, in get_class_in_module
    return getattr(module, class_name)
AttributeError: module 'transformers_modules.DeepSeek-V3-BF16.configuration_deepseek' has no attribute 'DeepseekV3Config'
```
However, when I launch it with `python test.py`, the model loads normally, although it is quite slow. Would anyone be able to help me resolve this issue?
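One plausible (unconfirmed) failure mode when several torchrun ranks resolve `trust_remote_code` modules at once is that one rank imports `configuration_deepseek.py` while another rank is still writing it to the dynamic-module cache, leaving the imported module without the expected class. A self-contained sketch of that failure mode (plain Python, hypothetical file content, no transformers needed):

```python
import importlib.util
import os
import tempfile

# Write a syntactically incomplete module file, simulating what one process
# might observe while another process is still writing the file, then try to
# import it and look up the class.
src_full = "class DeepseekV3Config:\n    pass\n"
src_half = src_full[: len(src_full) // 2]  # truncated, invalid source

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "configuration_deepseek.py")
    with open(path, "w") as f:
        f.write(src_half)
    spec = importlib.util.spec_from_file_location("configuration_deepseek", path)
    mod = importlib.util.module_from_spec(spec)
    try:
        spec.loader.exec_module(mod)  # compiling the truncated source fails
    except SyntaxError:
        pass
    has_class = hasattr(mod, "DeepseekV3Config")

print(has_class)  # False: getattr(module, class_name) would raise AttributeError
```

If that is indeed the cause, letting a single process materialize the cache first (e.g. the plain `python` run that you observed works) before launching torchrun typically avoids the race.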
### Expected behavior
n/a
|
{
"login": "chenxiaodong2002",
"id": 157867681,
"node_id": "U_kgDOCWjeoQ",
"avatar_url": "https://avatars.githubusercontent.com/u/157867681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chenxiaodong2002",
"html_url": "https://github.com/chenxiaodong2002",
"followers_url": "https://api.github.com/users/chenxiaodong2002/followers",
"following_url": "https://api.github.com/users/chenxiaodong2002/following{/other_user}",
"gists_url": "https://api.github.com/users/chenxiaodong2002/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chenxiaodong2002/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chenxiaodong2002/subscriptions",
"organizations_url": "https://api.github.com/users/chenxiaodong2002/orgs",
"repos_url": "https://api.github.com/users/chenxiaodong2002/repos",
"events_url": "https://api.github.com/users/chenxiaodong2002/events{/privacy}",
"received_events_url": "https://api.github.com/users/chenxiaodong2002/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37492/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37492/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37491
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37491/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37491/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37491/events
|
https://github.com/huggingface/transformers/issues/37491
| 2,992,907,756
|
I_kwDOCUB6oc6yZCXs
| 37,491
|
[BUG] Qwen2.5-Omni-7B processor numpy view error.
|
{
"login": "kobenaxie",
"id": 22359441,
"node_id": "MDQ6VXNlcjIyMzU5NDQx",
"avatar_url": "https://avatars.githubusercontent.com/u/22359441?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kobenaxie",
"html_url": "https://github.com/kobenaxie",
"followers_url": "https://api.github.com/users/kobenaxie/followers",
"following_url": "https://api.github.com/users/kobenaxie/following{/other_user}",
"gists_url": "https://api.github.com/users/kobenaxie/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kobenaxie/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kobenaxie/subscriptions",
"organizations_url": "https://api.github.com/users/kobenaxie/orgs",
"repos_url": "https://api.github.com/users/kobenaxie/repos",
"events_url": "https://api.github.com/users/kobenaxie/events{/privacy}",
"received_events_url": "https://api.github.com/users/kobenaxie/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T12:22:20
| 2025-04-14T15:30:32
| 2025-04-14T15:30:32
|
NONE
| null | null | null | null |
https://github.com/huggingface/transformers/blob/4b8c6d4cf8c779bf0895deb980669f5b2cb5d182/src/transformers/models/qwen2_5_omni/processing_qwen2_5_omni.py#L247
This bug was introduced by https://github.com/huggingface/transformers/pull/36752/commits/605da8121c4a708102ce04d7092eb61cf7229e8a (part of #36752).
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37491/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37491/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37490
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37490/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37490/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37490/events
|
https://github.com/huggingface/transformers/pull/37490
| 2,992,793,467
|
PR_kwDOCUB6oc6SeSsM
| 37,490
|
Refactor torchao docs
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T11:37:20
| 2025-04-17T18:09:49
| 2025-04-16T12:56:48
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37490",
"html_url": "https://github.com/huggingface/transformers/pull/37490",
"diff_url": "https://github.com/huggingface/transformers/pull/37490.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37490.patch",
"merged_at": "2025-04-16T12:56:48"
}
|
# What does this PR do?
Refactors the torchao docs with a clearer layout, better and more exhaustive examples, benchmarks, and more!
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37490/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37490/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37489
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37489/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37489/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37489/events
|
https://github.com/huggingface/transformers/pull/37489
| 2,992,757,190
|
PR_kwDOCUB6oc6SeKuX
| 37,489
|
[ci] fix doc builder
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T11:21:29
| 2025-04-14T11:49:31
| 2025-04-14T11:49:31
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37489",
"html_url": "https://github.com/huggingface/transformers/pull/37489",
"diff_url": "https://github.com/huggingface/transformers/pull/37489.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37489.patch",
"merged_at": "2025-04-14T11:49:31"
}
|
# What does this PR do?
Merging Qwen Omni broke the CI; this PR fixes it.
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37489/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37489/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37488
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37488/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37488/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37488/events
|
https://github.com/huggingface/transformers/issues/37488
| 2,992,700,671
|
I_kwDOCUB6oc6yYPz_
| 37,488
|
Fast Image Processor for EfficientNet: Deprecated folder issue
|
{
"login": "Kim-Ju-won",
"id": 81630351,
"node_id": "MDQ6VXNlcjgxNjMwMzUx",
"avatar_url": "https://avatars.githubusercontent.com/u/81630351?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Kim-Ju-won",
"html_url": "https://github.com/Kim-Ju-won",
"followers_url": "https://api.github.com/users/Kim-Ju-won/followers",
"following_url": "https://api.github.com/users/Kim-Ju-won/following{/other_user}",
"gists_url": "https://api.github.com/users/Kim-Ju-won/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Kim-Ju-won/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Kim-Ju-won/subscriptions",
"organizations_url": "https://api.github.com/users/Kim-Ju-won/orgs",
"repos_url": "https://api.github.com/users/Kim-Ju-won/repos",
"events_url": "https://api.github.com/users/Kim-Ju-won/events{/privacy}",
"received_events_url": "https://api.github.com/users/Kim-Ju-won/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T10:57:41
| 2025-04-14T13:24:04
| 2025-04-14T13:04:11
|
CONTRIBUTOR
| null | null | null | null |
First of all, I deeply appreciate the opportunity to contribute, as discussed in [issue #36978](https://github.com/huggingface/transformers/issues/36978).
I left a comment there mentioning that I’d like to contribute a Fast image processor for EfficientFormer.
I tried running the following command from the guide:
```bash
transformers-cli add-fast-image-processor --model-name efficientformer
```
However, I encountered the following error:
```
2025-04-14 10:33:50.966210: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
Traceback (most recent call last):
File "/usr/local/bin/transformers-cli", line 8, in <module>
sys.exit(main())
File "/root/transformers/src/transformers/commands/transformers_cli.py", line 50, in main
service.run()
File "/root/transformers/src/transformers/commands/add_fast_image_processor.py", line 677, in run
add_fast_image_processor(model_name=self.model_name)
File "/root/transformers/src/transformers/commands/add_fast_image_processor.py", line 597, in add_fast_image_processor
raise ValueError(f"No image processing module found in {model_module}")
ValueError: No image processing module found in /root/transformers/src/transformers/models/model_name
```
After checking, I found that the issue seems to be due to the efficientformer model being located in the deprecated subdirectory, rather than directly under the models folder like other actively supported models.
<img width="1337" alt="Image" src="https://github.com/user-attachments/assets/d4a96b0f-274f-4bc9-957b-b08cb9b7b3d5" />
And also, I looked into the [EfficientFormer model documentation](https://huggingface.co/docs/transformers/model_doc/efficientformer), and found the following note:
> This model is in maintenance mode only, we don’t accept any new PRs changing its code. If you run into any issues running this model, please reinstall the last version that supported this model: v4.40.2. You can do so by running the following command: pip install -U transformers==4.40.2.
So I’m wondering:
- Is it possible to add a Fast image processor for EfficientFormer even though it’s in the deprecated folder?
- Should I move it to the main models directory, or keep working inside deprecated?
- Or, as mentioned in the documentation, is this model no longer open for contributions beyond v4.40.2?
Thank you very much for your time and support.
Wishing you a peaceful and warm day ahead!
|
{
"login": "molbap",
"id": 39954772,
"node_id": "MDQ6VXNlcjM5OTU0Nzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/molbap",
"html_url": "https://github.com/molbap",
"followers_url": "https://api.github.com/users/molbap/followers",
"following_url": "https://api.github.com/users/molbap/following{/other_user}",
"gists_url": "https://api.github.com/users/molbap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/molbap/subscriptions",
"organizations_url": "https://api.github.com/users/molbap/orgs",
"repos_url": "https://api.github.com/users/molbap/repos",
"events_url": "https://api.github.com/users/molbap/events{/privacy}",
"received_events_url": "https://api.github.com/users/molbap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37488/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37488/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37487
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37487/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37487/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37487/events
|
https://github.com/huggingface/transformers/pull/37487
| 2,992,605,774
|
PR_kwDOCUB6oc6SdpJ9
| 37,487
|
fix: :bug: Support explicitly passing callback
|
{
"login": "moyueheng",
"id": 54298540,
"node_id": "MDQ6VXNlcjU0Mjk4NTQw",
"avatar_url": "https://avatars.githubusercontent.com/u/54298540?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/moyueheng",
"html_url": "https://github.com/moyueheng",
"followers_url": "https://api.github.com/users/moyueheng/followers",
"following_url": "https://api.github.com/users/moyueheng/following{/other_user}",
"gists_url": "https://api.github.com/users/moyueheng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/moyueheng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/moyueheng/subscriptions",
"organizations_url": "https://api.github.com/users/moyueheng/orgs",
"repos_url": "https://api.github.com/users/moyueheng/repos",
"events_url": "https://api.github.com/users/moyueheng/events{/privacy}",
"received_events_url": "https://api.github.com/users/moyueheng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-14T10:19:29
| 2025-06-28T06:12:37
| null |
NONE
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37487",
"html_url": "https://github.com/huggingface/transformers/pull/37487",
"diff_url": "https://github.com/huggingface/transformers/pull/37487.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37487.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37487/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37487/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37486
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37486/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37486/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37486/events
|
https://github.com/huggingface/transformers/issues/37486
| 2,992,452,533
|
I_kwDOCUB6oc6yXTO1
| 37,486
|
Weights of BlipModel are not initialized from the model checkpoint
|
{
"login": "sumitmishra209",
"id": 36421656,
"node_id": "MDQ6VXNlcjM2NDIxNjU2",
"avatar_url": "https://avatars.githubusercontent.com/u/36421656?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sumitmishra209",
"html_url": "https://github.com/sumitmishra209",
"followers_url": "https://api.github.com/users/sumitmishra209/followers",
"following_url": "https://api.github.com/users/sumitmishra209/following{/other_user}",
"gists_url": "https://api.github.com/users/sumitmishra209/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sumitmishra209/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sumitmishra209/subscriptions",
"organizations_url": "https://api.github.com/users/sumitmishra209/orgs",
"repos_url": "https://api.github.com/users/sumitmishra209/repos",
"events_url": "https://api.github.com/users/sumitmishra209/events{/privacy}",
"received_events_url": "https://api.github.com/users/sumitmishra209/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T09:25:02
| 2025-04-14T10:17:29
| 2025-04-14T10:17:29
|
NONE
| null | null | null | null |
### System Info
The code snippet is an example from https://huggingface.co/docs/transformers/model_doc/blip#transformers.BlipModel.get_text_features.example.
The warning that I get is:
```
Some weights of BlipTextModel were not initialized from the model checkpoint at Salesforce/blip-image-captioning-base and are newly initialized: ['embeddings.LayerNorm.bias', 'embeddings.LayerNorm.weight', 'embeddings.position_embeddings.weight', 'embeddings.word_embeddings.weight', 'encoder.layer.0.attention.output.LayerNorm.bias', 'encoder.layer.0.attention.output.LayerNorm.weight', 'encoder.layer.0.attention.output.dense.bias', 'encoder.layer.0.attention.output.dense.weight',...
```
The model weights seem to be newly initialised, as there appears to be some error when loading the pre-trained weights. Please guide me in solving this issue. I am building a custom model in which I need the last hidden state of the text encoder, with text as input (`outputs = text_encoder(**inputs)`).
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```python
from transformers import AutoProcessor, BlipModel

model = BlipModel.from_pretrained("Salesforce/blip-image-captioning-base")
processor = AutoProcessor.from_pretrained("Salesforce/blip-image-captioning-base")

inputs = processor(text=["a photo of a cat", "a photo of a dog"], padding=True, return_tensors="pt")
text_features = model.get_text_features(**inputs)
```
### Expected behavior
`BlipModel.from_pretrained("Salesforce/blip-image-captioning-base")`
should get initialized with the checkpoint weights so that `model.get_text_features(**inputs)` runs without the warning.
|
{
"login": "molbap",
"id": 39954772,
"node_id": "MDQ6VXNlcjM5OTU0Nzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/molbap",
"html_url": "https://github.com/molbap",
"followers_url": "https://api.github.com/users/molbap/followers",
"following_url": "https://api.github.com/users/molbap/following{/other_user}",
"gists_url": "https://api.github.com/users/molbap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/molbap/subscriptions",
"organizations_url": "https://api.github.com/users/molbap/orgs",
"repos_url": "https://api.github.com/users/molbap/repos",
"events_url": "https://api.github.com/users/molbap/events{/privacy}",
"received_events_url": "https://api.github.com/users/molbap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37486/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37486/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37485
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37485/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37485/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37485/events
|
https://github.com/huggingface/transformers/pull/37485
| 2,992,415,381
|
PR_kwDOCUB6oc6Sc_b4
| 37,485
|
VDR task guide
|
{
"login": "merveenoyan",
"id": 53175384,
"node_id": "MDQ6VXNlcjUzMTc1Mzg0",
"avatar_url": "https://avatars.githubusercontent.com/u/53175384?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/merveenoyan",
"html_url": "https://github.com/merveenoyan",
"followers_url": "https://api.github.com/users/merveenoyan/followers",
"following_url": "https://api.github.com/users/merveenoyan/following{/other_user}",
"gists_url": "https://api.github.com/users/merveenoyan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/merveenoyan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/merveenoyan/subscriptions",
"organizations_url": "https://api.github.com/users/merveenoyan/orgs",
"repos_url": "https://api.github.com/users/merveenoyan/repos",
"events_url": "https://api.github.com/users/merveenoyan/events{/privacy}",
"received_events_url": "https://api.github.com/users/merveenoyan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T09:13:09
| 2025-04-15T15:55:15
| 2025-04-15T15:55:14
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37485",
"html_url": "https://github.com/huggingface/transformers/pull/37485",
"diff_url": "https://github.com/huggingface/transformers/pull/37485.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37485.patch",
"merged_at": "2025-04-15T15:55:14"
}
|
Added VDR task guide as I noticed the model docs are too minimal for a solid use case
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37485/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37485/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37484
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37484/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37484/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37484/events
|
https://github.com/huggingface/transformers/pull/37484
| 2,992,346,139
|
PR_kwDOCUB6oc6ScweW
| 37,484
|
Fix tests failed with gated repos.
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T08:48:54
| 2025-04-14T10:08:15
| 2025-04-14T10:08:13
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37484",
"html_url": "https://github.com/huggingface/transformers/pull/37484",
"diff_url": "https://github.com/huggingface/transformers/pull/37484.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37484.patch",
"merged_at": "2025-04-14T10:08:13"
}
|
# What does this PR do?
Fix tests that fail with gated repos by adding `require_read_token`.
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37484/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37484/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37483
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37483/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37483/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37483/events
|
https://github.com/huggingface/transformers/pull/37483
| 2,992,237,564
|
PR_kwDOCUB6oc6ScZLG
| 37,483
|
Add callback to monitor progress in whisper transcription
|
{
"login": "poke1024",
"id": 11859538,
"node_id": "MDQ6VXNlcjExODU5NTM4",
"avatar_url": "https://avatars.githubusercontent.com/u/11859538?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/poke1024",
"html_url": "https://github.com/poke1024",
"followers_url": "https://api.github.com/users/poke1024/followers",
"following_url": "https://api.github.com/users/poke1024/following{/other_user}",
"gists_url": "https://api.github.com/users/poke1024/gists{/gist_id}",
"starred_url": "https://api.github.com/users/poke1024/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/poke1024/subscriptions",
"organizations_url": "https://api.github.com/users/poke1024/orgs",
"repos_url": "https://api.github.com/users/poke1024/repos",
"events_url": "https://api.github.com/users/poke1024/events{/privacy}",
"received_events_url": "https://api.github.com/users/poke1024/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6470596964,
"node_id": "LA_kwDOCUB6oc8AAAABga15ZA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Audio",
"name": "Audio",
"color": "760453",
"default": false,
"description": ""
},
{
"id": 7377881103,
"node_id": "LA_kwDOCUB6oc8AAAABt8GIDw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Whisper",
"name": "Whisper",
"color": "83303E",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T08:05:42
| 2025-07-30T15:40:53
| 2025-07-30T15:40:53
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37483",
"html_url": "https://github.com/huggingface/transformers/pull/37483",
"diff_url": "https://github.com/huggingface/transformers/pull/37483.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37483.patch",
"merged_at": "2025-07-30T15:40:53"
}
|
This PR adds a callback in the `generate` function of `WhisperGenerationMixin` to give callers the ability to monitor progress for whisper transcriptions.
This is useful in settings where transcription happens in a notebook or UI, and callers want to provide users with a progress bar or similar feedback for long-running calls (e.g. >1 minute).
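For illustration, the callback pattern described above can be sketched as follows. This is a minimal, hypothetical sketch of the caller-facing idea only; the function name, signature, and wiring into `WhisperGenerationMixin.generate` are illustrative assumptions, not the actual transformers API.

```python
# Hypothetical sketch of a progress callback for chunked transcription.
# Names and signatures are illustrative, not the real transformers API.
from typing import Callable, List

def transcribe_chunks(chunks: List[str],
                      on_progress: Callable[[int, int], None]) -> List[str]:
    """Process chunks one by one, reporting (done, total) after each."""
    results = []
    total = len(chunks)
    for i, chunk in enumerate(chunks, start=1):
        results.append(chunk.upper())  # stand-in for the real decode step
        on_progress(i, total)          # caller can update a progress bar here
    return results

progress: List[float] = []
transcribe_chunks(["a", "b", "c", "d"],
                  lambda done, total: progress.append(done / total))
print(progress)  # [0.25, 0.5, 0.75, 1.0]
```

A UI would typically pass a closure that updates a widget instead of appending to a list.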
Reviewer suggestion: @eustlb
|
{
"login": "ebezzam",
"id": 4757445,
"node_id": "MDQ6VXNlcjQ3NTc0NDU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4757445?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ebezzam",
"html_url": "https://github.com/ebezzam",
"followers_url": "https://api.github.com/users/ebezzam/followers",
"following_url": "https://api.github.com/users/ebezzam/following{/other_user}",
"gists_url": "https://api.github.com/users/ebezzam/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ebezzam/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ebezzam/subscriptions",
"organizations_url": "https://api.github.com/users/ebezzam/orgs",
"repos_url": "https://api.github.com/users/ebezzam/repos",
"events_url": "https://api.github.com/users/ebezzam/events{/privacy}",
"received_events_url": "https://api.github.com/users/ebezzam/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37483/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37483/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37481
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37481/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37481/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37481/events
|
https://github.com/huggingface/transformers/pull/37481
| 2,992,089,467
|
PR_kwDOCUB6oc6Sb5Op
| 37,481
|
36978 | Fast image processor for DPT model
|
{
"login": "samrae7",
"id": 4126146,
"node_id": "MDQ6VXNlcjQxMjYxNDY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4126146?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/samrae7",
"html_url": "https://github.com/samrae7",
"followers_url": "https://api.github.com/users/samrae7/followers",
"following_url": "https://api.github.com/users/samrae7/following{/other_user}",
"gists_url": "https://api.github.com/users/samrae7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/samrae7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/samrae7/subscriptions",
"organizations_url": "https://api.github.com/users/samrae7/orgs",
"repos_url": "https://api.github.com/users/samrae7/repos",
"events_url": "https://api.github.com/users/samrae7/events{/privacy}",
"received_events_url": "https://api.github.com/users/samrae7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T07:03:06
| 2025-06-18T17:37:40
| 2025-06-18T17:33:29
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37481",
"html_url": "https://github.com/huggingface/transformers/pull/37481",
"diff_url": "https://github.com/huggingface/transformers/pull/37481.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37481.patch",
"merged_at": "2025-06-18T17:33:29"
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
Add Fast Image Processor for DPT model
<!-- Remove if not applicable -->
Fixes #36978
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. [Link to issue](https://github.com/huggingface/transformers/issues/36978)
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37481/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37481/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37480
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37480/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37480/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37480/events
|
https://github.com/huggingface/transformers/pull/37480
| 2,991,898,114
|
PR_kwDOCUB6oc6SbPhd
| 37,480
|
make test_snowman_image_captioning pass on XPU, by sharing same atol w/ ROCM
|
{
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T05:10:47
| 2025-04-14T23:08:25
| 2025-04-14T09:39:46
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37480",
"html_url": "https://github.com/huggingface/transformers/pull/37480",
"diff_url": "https://github.com/huggingface/transformers/pull/37480.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37480.patch",
"merged_at": "2025-04-14T09:39:45"
}
|
`pytest -rA tests/models/kosmos2/test_modeling_kosmos2.py::Kosmos2ModelIntegrationTest::test_snowman_image_captioning` PASS.
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37480/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37480/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37479
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37479/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37479/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37479/events
|
https://github.com/huggingface/transformers/issues/37479
| 2,991,868,037
|
I_kwDOCUB6oc6yVEiF
| 37,479
|
Mismatching default value of `Llama4TextConfig` `attn_temperature_tuning` between official llama code
|
{
"login": "gmlwns2000",
"id": 4879345,
"node_id": "MDQ6VXNlcjQ4NzkzNDU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4879345?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gmlwns2000",
"html_url": "https://github.com/gmlwns2000",
"followers_url": "https://api.github.com/users/gmlwns2000/followers",
"following_url": "https://api.github.com/users/gmlwns2000/following{/other_user}",
"gists_url": "https://api.github.com/users/gmlwns2000/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gmlwns2000/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gmlwns2000/subscriptions",
"organizations_url": "https://api.github.com/users/gmlwns2000/orgs",
"repos_url": "https://api.github.com/users/gmlwns2000/repos",
"events_url": "https://api.github.com/users/gmlwns2000/events{/privacy}",
"received_events_url": "https://api.github.com/users/gmlwns2000/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T04:46:46
| 2025-04-15T10:10:40
| 2025-04-15T10:10:40
|
CONTRIBUTOR
| null | null | null | null |
### System Info
In the `transformers` library, the default value of `attn_temperature_tuning` in `Llama4TextConfig` is `4`.
https://github.com/huggingface/transformers/blob/953196a43dae6a3c474165fba7d215fcbc7b7730/src/transformers/models/llama4/configuration_llama4.py#L231
However, I think this value should be a boolean, because it is used as a condition flag in the forward pass.
https://github.com/huggingface/transformers/blob/953196a43dae6a3c474165fba7d215fcbc7b7730/src/transformers/models/llama4/modeling_llama4.py#L334
Moreover, in the official implementation, the value is a boolean that defaults to `False`. [(Official Config File)](https://github.com/meta-llama/llama-models/blob/823cd8622e44d90b8e989e9f41ca364c06f5701d/models/llama4/args.py#L80)
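To illustrate the mismatch: because the modeling code only checks the flag for truthiness, the integer default `4` silently enables the branch that the official implementation disables by default. A minimal sketch (not the real `Llama4TextConfig` class):

```python
# Minimal sketch of the truthiness issue; not the real Llama4TextConfig.
class ConfigSketch:
    def __init__(self, attn_temperature_tuning=4):  # transformers default
        self.attn_temperature_tuning = attn_temperature_tuning

hf_default = ConfigSketch()                                    # 4 -> truthy
official_default = ConfigSketch(attn_temperature_tuning=False) # official default

# The forward pass effectively does: `if self.config.attn_temperature_tuning: ...`
hf_enabled = bool(hf_default.attn_temperature_tuning)              # True
official_enabled = bool(official_default.attn_temperature_tuning)  # False
print(hf_enabled, official_enabled)  # True False
```

So the two defaults take different branches in the forward pass, even though `4` was presumably never meant as a meaningful value.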
### Who can help?
@ArthurZucker
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Any `Llama4TextConfig` instances.
### Expected behavior
The default value of `config.attn_temperature_tuning` should be a boolean (`True` or `False`), since the flag toggles temperature tuning based on input sequence length.
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37479/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37479/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37478
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37478/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37478/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37478/events
|
https://github.com/huggingface/transformers/issues/37478
| 2,991,810,922
|
I_kwDOCUB6oc6yU2lq
| 37,478
|
The "force_words_ids" does not seem to be available on llama4
|
{
"login": "gzglss",
"id": 71701750,
"node_id": "MDQ6VXNlcjcxNzAxNzUw",
"avatar_url": "https://avatars.githubusercontent.com/u/71701750?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gzglss",
"html_url": "https://github.com/gzglss",
"followers_url": "https://api.github.com/users/gzglss/followers",
"following_url": "https://api.github.com/users/gzglss/following{/other_user}",
"gists_url": "https://api.github.com/users/gzglss/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gzglss/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gzglss/subscriptions",
"organizations_url": "https://api.github.com/users/gzglss/orgs",
"repos_url": "https://api.github.com/users/gzglss/repos",
"events_url": "https://api.github.com/users/gzglss/events{/privacy}",
"received_events_url": "https://api.github.com/users/gzglss/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-14T03:58:26
| 2025-05-23T08:02:27
| 2025-05-23T08:02:27
|
NONE
| null | null | null | null |
### System Info
transformers=v4.51.1
torch=2.6.0
python=3.10.0
The code snippet is as follows:
```
output = model.generate(**input_ids,do_sample=False,num_beams=2, max_new_tokens=1,force_words_ids=[tokenizer.convert_tokens_to_ids(['A', 'B', 'C', 'D'])])
```
The error message is as follows:
```
[rank0]: File "/usr/local/lib/python3.10/dist-packages/transformers/models/llama4/modeling_llama4.py", line 379, in forward
[rank0]: attn_output, attn_weights = attention_interface(
[rank0]: File "/usr/local/lib/python3.10/dist-packages/transformers/models/llama4/modeling_llama4.py", line 286, in eager_attention_forward
[rank0]: attn_weights = attn_weights + causal_mask
[rank0]: RuntimeError: The size of tensor a (8192) must match the size of tensor b (18) at non-singleton dimension 3
```
### code
```
from transformers import AutoTokenizer, Llama4ForConditionalGeneration
import torch
model_id = "...Llama-4-Scout-17B-16E-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
messages = [
{"role": "user", "content": "Which one do you choose among ABCD?"},
]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt", return_dict=True)
model = Llama4ForConditionalGeneration.from_pretrained(
model_id,
device_map="auto",
torch_dtype=torch.bfloat16,
)
output = model.generate(**inputs.to(model.device), max_new_tokens=1,num_beams=2,do_sample=False,force_words_ids=[tokenizer.convert_tokens_to_ids(['A', 'B', 'C', 'D'])])
outputs = tokenizer.batch_decode(output[:, inputs["input_ids"].shape[-1]:])
print(outputs[0])
```
### Expected behavior
Why does this problem occur and how can it be solved?
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37478/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37478/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37477
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37477/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37477/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37477/events
|
https://github.com/huggingface/transformers/issues/37477
| 2,991,560,410
|
I_kwDOCUB6oc6yT5ba
| 37,477
|
Unrecognized model in Qwen/Qwen2.5-Coder-7B-Instruct
|
{
"login": "kevinlu1248",
"id": 26889185,
"node_id": "MDQ6VXNlcjI2ODg5MTg1",
"avatar_url": "https://avatars.githubusercontent.com/u/26889185?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kevinlu1248",
"html_url": "https://github.com/kevinlu1248",
"followers_url": "https://api.github.com/users/kevinlu1248/followers",
"following_url": "https://api.github.com/users/kevinlu1248/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinlu1248/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kevinlu1248/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinlu1248/subscriptions",
"organizations_url": "https://api.github.com/users/kevinlu1248/orgs",
"repos_url": "https://api.github.com/users/kevinlu1248/repos",
"events_url": "https://api.github.com/users/kevinlu1248/events{/privacy}",
"received_events_url": "https://api.github.com/users/kevinlu1248/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-13T23:47:11
| 2025-07-08T11:31:09
| 2025-04-15T12:54:12
|
NONE
| null | null | null | null |
### System Info
I bumped my transformers versions to `4.51.2` recently and Qwen 2.5 Coder no longer loads. I downgraded back to `4.49.0` and it loads properly. I think it's a regression.
```
ValueError: Unrecognized model in Qwen/Qwen2.5-Coder-7B-Instruct. Should have a `model_type` key in its config.json, or contain one of the following strings in its name: albert, align, altclip, aria, aria_text, audio-spectrogram-transformer, autoformer, aya_vision, bamba, bark, bart, beit, bert, bert-generation, big_bird, bigbird_pegasus, biogpt, bit, blenderbot, blenderbot-small, blip, blip-2, bloom, bridgetower, bros, camembert, canine, chameleon, chinese_clip, chinese_clip_vision_model, clap, clip, clip_text_model, clip_vision_model, clipseg, clvp, code_llama, codegen, cohere, cohere2, colpali, conditional_detr, convbert, convnext, convnextv2, cpmant, ctrl, cvt, dab-detr, dac, data2vec-audio, data2vec-text, data2vec-vision, dbrx, deberta, deberta-v2, decision_transformer, deepseek_v3, deformable_detr, deit, depth_anything, depth_pro, deta, detr, diffllama, dinat, dinov2, dinov2_with_registers, distilbert, donut-swin, dpr, dpt, efficientformer, efficientnet, electra, emu3, encodec, encoder-decoder, ernie, ernie_m, esm, falcon, falcon_mamba, fastspeech2_conformer, flaubert, flava, fnet, focalnet, fsmt, funnel, fuyu, gemma, gemma2, gemma3, gemma3_text, git, glm, glpn, got_ocr2, gpt-sw3, gpt2, gpt_bigcode, gpt_neo, gpt_neox, gpt_neox_japanese, gptj, gptsan-japanese, granite, granitemoe, granitemoeshared, granitevision, graphormer, grounding-dino, groupvit, helium, hiera, hubert, ibert, idefics, idefics2, idefics3, idefics3_vision, ijepa, imagegpt, informer, instructblip, instructblipvideo, jamba, jetmoe, jukebox, kosmos-2, layoutlm, layoutlmv2, layoutlmv3, led, levit, lilt, llama, llama4, llama4_text, llava, llava_next, llava_next_video, llava_onevision, longformer, longt5, luke, lxmert, m2m_100, mamba, mamba2, marian, markuplm, mask2former, maskformer, maskformer-swin, mbart, mctct, mega, megatron-bert, mgp-str, mimi, mistral, mistral3, mixtral, mllama, mobilebert, mobilenet_v1, mobilenet_v2, mobilevit, mobilevitv2, modernbert, moonshine, moshi, mpnet, mpt, mra, 
mt5, musicgen, musicgen_melody, mvp, nat, nemotron, nezha, nllb-moe, nougat, nystromformer, olmo, olmo2, olmoe, omdet-turbo, oneformer, open-llama, openai-gpt, opt, owlv2, owlvit, paligemma, patchtsmixer, patchtst, pegasus, pegasus_x, perceiver, persimmon, phi, phi3, phi4_multimodal, phimoe, pix2struct, pixtral, plbart, poolformer, pop2piano, prompt_depth_anything, prophetnet, pvt, pvt_v2, qdqbert, qwen2, qwen2_5_vl, qwen2_audio, qwen2_audio_encoder, qwen2_moe, qwen2_vl, qwen3, qwen3_moe, rag, realm, recurrent_gemma, reformer, regnet, rembert, resnet, retribert, roberta, roberta-prelayernorm, roc_bert, roformer, rt_detr, rt_detr_resnet, rt_detr_v2, rwkv, sam, sam_vision_model, seamless_m4t, seamless_m4t_v2, segformer, seggpt, sew, sew-d, shieldgemma2, siglip, siglip2, siglip_vision_model, smolvlm, smolvlm_vision, speech-encoder-decoder, speech_to_text, speech_to_text_2, speecht5, splinter, squeezebert, stablelm, starcoder2, superglue, superpoint, swiftformer, swin, swin2sr, swinv2, switch_transformers, t5, table-transformer, tapas, textnet, time_series_transformer, timesformer, timm_backbone, timm_wrapper, trajectory_transformer, transfo-xl, trocr, tvlt, tvp, udop, umt5, unispeech, unispeech-sat, univnet, upernet, van, video_llava, videomae, vilt, vipllava, vision-encoder-decoder, vision-text-dual-encoder, visual_bert, vit, vit_hybrid, vit_mae, vit_msn, vitdet, vitmatte, vitpose, vitpose_backbone, vits, vivit, wav2vec2, wav2vec2-bert, wav2vec2-conformer, wavlm, whisper, xclip, xglm, xlm, xlm-prophetnet, xlm-roberta, xlm-roberta-xl, xlnet, xmod, yolos, yoso, zamba, zamba2, zoedepth
```
### Who can help?
@gante
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-Coder-7B-Instruct")
```
Run this within the following modal function:
```
training_image = modal.Image.from_registry("nvidia/cuda:12.4.0-devel-ubuntu22.04", add_python="3.11") \
.pip_install(
"transformers==4.51.2",
) \
.env({"HF_HUB_ENABLE_HF_TRANSFER": "1"})
```
I ran into this on an 8xH100 in accelerate + deepspeed but I don't think this error is related to GPUs, since the tokenizer doesn't load either.
### Expected behavior
It loads.
|
{
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37477/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37477/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37476
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37476/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37476/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37476/events
|
https://github.com/huggingface/transformers/issues/37476
| 2,991,407,376
|
I_kwDOCUB6oc6yTUEQ
| 37,476
|
Incorrect installation instructions
|
{
"login": "DarkTyger",
"id": 152742585,
"node_id": "U_kgDOCRqquQ",
"avatar_url": "https://avatars.githubusercontent.com/u/152742585?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DarkTyger",
"html_url": "https://github.com/DarkTyger",
"followers_url": "https://api.github.com/users/DarkTyger/followers",
"following_url": "https://api.github.com/users/DarkTyger/following{/other_user}",
"gists_url": "https://api.github.com/users/DarkTyger/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DarkTyger/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DarkTyger/subscriptions",
"organizations_url": "https://api.github.com/users/DarkTyger/orgs",
"repos_url": "https://api.github.com/users/DarkTyger/repos",
"events_url": "https://api.github.com/users/DarkTyger/events{/privacy}",
"received_events_url": "https://api.github.com/users/DarkTyger/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2934977194,
"node_id": "MDU6TGFiZWwyOTM0OTc3MTk0",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Flax",
"name": "Flax",
"color": "4862AD",
"default": false,
"description": ""
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-13T18:27:22
| 2025-05-24T08:02:27
| 2025-05-24T08:02:27
|
NONE
| null | null | null | null |
Re: https://pypi.org/project/transformers/
Is the Python version correct?
Python 3.9+
TensorFlow requires Python 3.9 to 3.12, according to:
https://www.tensorflow.org/install/pip
3.9+ implies the installation will also work with 3.13, which didn't work for me. Using 3.12.10 works. Does the Python version need to be 3.9 - 3.12.10?
Also, the instructions are a little lacking. Consider:
* a `requirements.txt` file that people can download;
* a shell script that can install and validate the requisite versions; and
* explicitly list all instructions that'll work for most people, such as:
```
python -m venv .my-env
source .my-env/bin/activate
pip install torch
pip install tensorflow
pip install flax
pip install transformers
pip install accelerate
```
Note that "accelerate" wasn't listed in the requirements.
### Who can help?
The web page at https://pypi.org/project/transformers/ is incorrect and arguably insufficient.
### Reproduction
No steps needed: just read the web page and follow the installation steps verbatim on an Arch Linux system running Python 3.13. The steps will fail.
### Expected behavior
The installation instructions are both technically correct and complete.
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37476/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37476/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37475
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37475/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37475/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37475/events
|
https://github.com/huggingface/transformers/pull/37475
| 2,991,404,566
|
PR_kwDOCUB6oc6SZpLP
| 37,475
|
trainer.py fix loss aggregation over multiple devices
|
{
"login": "wiwu2390",
"id": 140025193,
"node_id": "U_kgDOCFidaQ",
"avatar_url": "https://avatars.githubusercontent.com/u/140025193?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wiwu2390",
"html_url": "https://github.com/wiwu2390",
"followers_url": "https://api.github.com/users/wiwu2390/followers",
"following_url": "https://api.github.com/users/wiwu2390/following{/other_user}",
"gists_url": "https://api.github.com/users/wiwu2390/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wiwu2390/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wiwu2390/subscriptions",
"organizations_url": "https://api.github.com/users/wiwu2390/orgs",
"repos_url": "https://api.github.com/users/wiwu2390/repos",
"events_url": "https://api.github.com/users/wiwu2390/events{/privacy}",
"received_events_url": "https://api.github.com/users/wiwu2390/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-13T18:22:14
| 2025-04-13T18:23:11
| null |
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37475",
"html_url": "https://github.com/huggingface/transformers/pull/37475",
"diff_url": "https://github.com/huggingface/transformers/pull/37475.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37475.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes #37474
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@zach-huggingface and @SunMarc
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37475/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37475/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37474
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37474/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37474/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37474/events
|
https://github.com/huggingface/transformers/issues/37474
| 2,991,399,599
|
I_kwDOCUB6oc6yTSKv
| 37,474
|
Trainer.training_step incorrectly normalizes mean token loss when n_gpu > 1
|
{
"login": "wiwu2390",
"id": 140025193,
"node_id": "U_kgDOCFidaQ",
"avatar_url": "https://avatars.githubusercontent.com/u/140025193?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wiwu2390",
"html_url": "https://github.com/wiwu2390",
"followers_url": "https://api.github.com/users/wiwu2390/followers",
"following_url": "https://api.github.com/users/wiwu2390/following{/other_user}",
"gists_url": "https://api.github.com/users/wiwu2390/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wiwu2390/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wiwu2390/subscriptions",
"organizations_url": "https://api.github.com/users/wiwu2390/orgs",
"repos_url": "https://api.github.com/users/wiwu2390/repos",
"events_url": "https://api.github.com/users/wiwu2390/events{/privacy}",
"received_events_url": "https://api.github.com/users/wiwu2390/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 1990918270,
"node_id": "MDU6TGFiZWwxOTkwOTE4Mjcw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20First%20Issue",
"name": "Good First Issue",
"color": "bbf794",
"default": false,
"description": ""
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
open
| false
| null |
[] | null |
[] | 2025-04-13T18:12:18
| 2025-09-10T16:38:33
| null |
NONE
| null | null | null | null |
### System Info
```
- `transformers` version: 4.46.0
- Platform: Linux-5.15.0-136-generic-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.29.2
- Safetensors version: 0.5.3
- Accelerate version: 1.4.0
- Accelerate config: not found
- PyTorch version (GPU?): 2.4.1+cu121 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: yes
- Using GPU in script?: yes
- GPU type: NVIDIA RTX A5000
```
### Who can help?
@zach-huggingface @SunMarc @ArthurZucker
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Full example setup:
```
from datasets import load_dataset
from transformers import (
    AutoConfig,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    GPTNeoXForCausalLM,
    Trainer,
    TrainingArguments,
)

config = AutoConfig.from_pretrained('EleutherAI/pythia-14m')
model = GPTNeoXForCausalLM(config=config).to('cuda')
tokenizer = AutoTokenizer.from_pretrained('EleutherAI/pythia-14m')
tokenizer.pad_token = tokenizer.eos_token
train_data = load_dataset("wiwu2390/minipile-100k", split="train")
def tokenize_function(sample):
return tokenizer(sample["text"], truncation=True, max_length=512)
tokenized_dataset = train_data.map(tokenize_function, batched=True, remove_columns=["text"])
data_collator = DataCollatorForLanguageModeling(
tokenizer=tokenizer, mlm=False
)
training_args = TrainingArguments(
output_dir="../data/pythia-14m-minipile-100k",
num_train_epochs=3,
per_device_train_batch_size=16,
per_device_eval_batch_size=16,
evaluation_strategy="no",
logging_steps=1,
save_steps=100,
learning_rate=1e-3,
weight_decay=0.01,
warmup_steps=100,
fp16=True,
)
trainer = Trainer(
model=model,
args=training_args,
train_dataset=tokenized_dataset,
tokenizer=tokenizer,
data_collator=data_collator,
)
trainer.train()
```
With 4 GPUs, the training loss at step 1 is ~2.7. However, the expected value is ~10.8. Indeed, this is what we get if we set CUDA_VISIBLE_DEVICES=0.
### Expected behavior
Since the model is being trained from initialization, the training loss at the first few steps should be around ~log(vocab_size)=10.8. However, when using 4 GPUs, the reported loss is 1/4 of that (2.7).
This happens because the DataParallel-wrapped model receives `num_items_in_batch` as an input kwarg in `Trainer.compute_loss`; this is equal to the number of tokens in the batch, combined across all devices. Each device gets a 1/4-size per-device batch and returns the sum of its token losses divided by `num_items_in_batch` (see `transformers.loss.loss_utils.fixed_cross_entropy`). The correct way to aggregate these per-device losses is therefore to *sum* them. However, `Trainer.training_step` takes the mean:
https://github.com/huggingface/transformers/blob/953196a43dae6a3c474165fba7d215fcbc7b7730/src/transformers/trainer.py#L3759
A quick and dirty fix would be:
```
if self.args.n_gpu > 1:
loss = loss.mean() if num_items_in_batch is None else loss.sum()
```
I'm not sure if this is compatible with other workflows though.
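The arithmetic behind the mismatch can be checked without any GPUs. The numbers below are made up purely for illustration; they assume each per-device loss is already normalized by the *global* token count, as `fixed_cross_entropy` does when `num_items_in_batch` is passed:

```python
# Hypothetical per-device losses illustrating the aggregation issue described above.
# Each device returns sum(token_losses) / num_items_in_batch, where
# num_items_in_batch counts tokens across ALL devices (global normalization).
n_gpu = 4
token_loss_sums = [100.0, 90.0, 110.0, 100.0]  # made-up per-device sums of token losses
num_items_in_batch = 160                        # total tokens across all 4 devices

per_device = [s / num_items_in_batch for s in token_loss_sums]

# Correct aggregate: SUM of per-device losses == global mean token loss.
correct = sum(per_device)

# Taking the MEAN instead under-reports the loss by a factor of n_gpu.
reported = sum(per_device) / n_gpu

assert abs(correct - sum(token_loss_sums) / num_items_in_batch) < 1e-12
assert abs(correct / reported - n_gpu) < 1e-9
```

This matches the observed symptom: a reported loss of ~2.7 instead of ~10.8 on 4 GPUs, exactly a factor of 4 too small.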
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37474/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37474/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37473
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37473/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37473/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37473/events
|
https://github.com/huggingface/transformers/pull/37473
| 2,991,323,018
|
PR_kwDOCUB6oc6SZZOG
| 37,473
|
Modular m4t speecht5 sew
|
{
"login": "nikosanto13",
"id": 57691096,
"node_id": "MDQ6VXNlcjU3NjkxMDk2",
"avatar_url": "https://avatars.githubusercontent.com/u/57691096?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nikosanto13",
"html_url": "https://github.com/nikosanto13",
"followers_url": "https://api.github.com/users/nikosanto13/followers",
"following_url": "https://api.github.com/users/nikosanto13/following{/other_user}",
"gists_url": "https://api.github.com/users/nikosanto13/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nikosanto13/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nikosanto13/subscriptions",
"organizations_url": "https://api.github.com/users/nikosanto13/orgs",
"repos_url": "https://api.github.com/users/nikosanto13/repos",
"events_url": "https://api.github.com/users/nikosanto13/events{/privacy}",
"received_events_url": "https://api.github.com/users/nikosanto13/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-13T15:33:40
| 2025-07-02T16:01:14
| null |
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37473",
"html_url": "https://github.com/huggingface/transformers/pull/37473",
"diff_url": "https://github.com/huggingface/transformers/pull/37473.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37473.patch",
"merged_at": null
}
|
# What does this PR do?
Completes the effort of #35902, by adding modular files for ~~SEW~~, SEW-D, SeamlessM4T, SeamlessM4Tv2, SpeechT5 ([as proposed](https://github.com/huggingface/transformers/pull/35902#pullrequestreview-2705453204) by @eustlb)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@ArthurZucker @eustlb
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37473/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37473/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37472
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37472/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37472/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37472/events
|
https://github.com/huggingface/transformers/pull/37472
| 2,991,046,389
|
PR_kwDOCUB6oc6SYipW
| 37,472
|
Fix wrong argparse type in modular checker script
|
{
"login": "seven-mile",
"id": 56445491,
"node_id": "MDQ6VXNlcjU2NDQ1NDkx",
"avatar_url": "https://avatars.githubusercontent.com/u/56445491?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/seven-mile",
"html_url": "https://github.com/seven-mile",
"followers_url": "https://api.github.com/users/seven-mile/followers",
"following_url": "https://api.github.com/users/seven-mile/following{/other_user}",
"gists_url": "https://api.github.com/users/seven-mile/gists{/gist_id}",
"starred_url": "https://api.github.com/users/seven-mile/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/seven-mile/subscriptions",
"organizations_url": "https://api.github.com/users/seven-mile/orgs",
"repos_url": "https://api.github.com/users/seven-mile/repos",
"events_url": "https://api.github.com/users/seven-mile/events{/privacy}",
"received_events_url": "https://api.github.com/users/seven-mile/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-13T06:07:34
| 2025-04-14T15:11:30
| 2025-04-14T15:11:29
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37472",
"html_url": "https://github.com/huggingface/transformers/pull/37472",
"diff_url": "https://github.com/huggingface/transformers/pull/37472.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37472.patch",
"merged_at": "2025-04-14T15:11:29"
}
|
# What does this PR do?
In the `argparse` library, the `type` parameter is applied to each argument individually. When using `type=list` in combination with `nargs="+"`, the command `python utils/check_modular_conversion.py --files abc def` is parsed as `[['a', 'b', 'c'], ['d', 'e', 'f']]`. This behavior occurs because the script typically relies on no arguments being passed, defaulting to `['all']`, which has allowed it to function correctly until now.
This PR corrects this behaviour.
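The mis-parse is easy to reproduce in isolation. A minimal sketch (the flag name and default here are illustrative, not taken from the actual script):

```python
import argparse

parser = argparse.ArgumentParser()
# Combining type=list with nargs="+" applies list() to EACH argument,
# which splits every string into a list of characters.
parser.add_argument("--files", type=list, nargs="+", default=["all"])

args = parser.parse_args(["--files", "abc", "def"])
print(args.files)  # [['a', 'b', 'c'], ['d', 'e', 'f']]

# With type=str (argparse's default behavior), the arguments stay whole:
fixed = argparse.ArgumentParser()
fixed.add_argument("--files", type=str, nargs="+", default=["all"])
print(fixed.parse_args(["--files", "abc", "def"]).files)  # ['abc', 'def']
```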
CC: @ArthurZucker
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37472/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37472/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37471
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37471/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37471/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37471/events
|
https://github.com/huggingface/transformers/issues/37471
| 2,990,875,478
|
I_kwDOCUB6oc6yRSNW
| 37,471
|
Assistant Decoding for Llava-Onevision Does Not Work
|
{
"login": "Brianzhengca",
"id": 30914529,
"node_id": "MDQ6VXNlcjMwOTE0NTI5",
"avatar_url": "https://avatars.githubusercontent.com/u/30914529?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Brianzhengca",
"html_url": "https://github.com/Brianzhengca",
"followers_url": "https://api.github.com/users/Brianzhengca/followers",
"following_url": "https://api.github.com/users/Brianzhengca/following{/other_user}",
"gists_url": "https://api.github.com/users/Brianzhengca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Brianzhengca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Brianzhengca/subscriptions",
"organizations_url": "https://api.github.com/users/Brianzhengca/orgs",
"repos_url": "https://api.github.com/users/Brianzhengca/repos",
"events_url": "https://api.github.com/users/Brianzhengca/events{/privacy}",
"received_events_url": "https://api.github.com/users/Brianzhengca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
open
| false
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null |
[] | 2025-04-13T00:49:05
| 2025-04-15T04:06:16
| null |
NONE
| null | null | null | null |
### System Info
- `transformers` version: 4.51.2
- Platform: Linux-4.18.0-513.18.1.el8_9.x86_64-x86_64-with-glibc2.28
- Python version: 3.12.2
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.4.4
- Accelerate version: 1.0.1
- Accelerate config: not found
- DeepSpeed version: 0.15.3
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
- Using GPU in script?: yes
- GPU type: NVIDIA L40
### Who can help?
@zucchini-nlp
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Hi all, reproduction code is given below:
```
from transformers import AutoProcessor, AutoModelForImageTextToText
from PIL import Image
import torch
import requests
img_urls =["https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/cats.png",
"https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/bee.jpg"]
images = [Image.open(requests.get(img_urls[0], stream=True).raw),
Image.open(requests.get(img_urls[1], stream=True).raw)]
target_processor = AutoProcessor.from_pretrained("llava-hf/llava-onevision-qwen2-7b-ov-hf")
target_processor.tokenizer.padding_side = "left"
draft_processor = AutoProcessor.from_pretrained("llava-hf/llava-onevision-qwen2-0.5b-ov-hf")
draft_processor.tokenizer.padding_side = "left"
target = AutoModelForImageTextToText.from_pretrained("llava-hf/llava-onevision-qwen2-7b-ov-hf").to('cuda')
draft = AutoModelForImageTextToText.from_pretrained("llava-hf/llava-onevision-qwen2-0.5b-ov-hf").to('cuda')
messages = [
{
"role": "user",
"content": [
{"type": "image"},
{"type": "text", "text": "Describe this image in 500 words."}
]
}
]
prompt = target_processor.apply_chat_template(messages, add_generation_prompt=True)
print(prompt)
inputs = target_processor(text=prompt, images=[images[0]], return_tensors="pt").to("cuda")
with torch.no_grad():
generated_ids = target.generate(**inputs, max_new_tokens=1000, assistant_model=draft, tokenizer=target_processor.tokenizer, assistant_tokenizer=draft_processor.tokenizer)
generated_texts = target_processor.batch_decode(generated_ids, skip_special_tokens=True)
print(generated_texts)
```
The error I am getting is as follows:
```
Traceback (most recent call last):
File "my_path/Llava/main.py", line 34, in <module>
    generated_ids = target.generate(**inputs, max_new_tokens=1000, assistant_model=draft, tokenizer=draft_processor.tokenizer, assistant_tokenizer=draft_processor.tokenizer)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "my_path/miniconda3/envs/open-instruct-run/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "my_path/miniconda3/envs/open-instruct-run/lib/python3.12/site-packages/transformers/generation/utils.py", line 2409, in generate
result = self._assisted_decoding(
^^^^^^^^^^^^^^^^^^^^^^^^
File "my_path/miniconda3/envs/open-instruct-run/lib/python3.12/site-packages/transformers/generation/utils.py", line 4685, in _assisted_decoding
candidate_input_ids, candidate_logits = candidate_generator.get_candidates(input_ids)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "my_path/miniconda3/envs/open-instruct-run/lib/python3.12/site-packages/transformers/generation/candidate_generator.py", line 523, in get_candidates
assistant_output = self.assistant_model.generate(**generation_args, **self.assistant_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "my_path/miniconda3/envs/open-instruct-run/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "my_path/miniconda3/envs/open-instruct-run/lib/python3.12/site-packages/transformers/generation/utils.py", line 2465, in generate
result = self._sample(
^^^^^^^^^^^^^
File "my_path/miniconda3/envs/open-instruct-run/lib/python3.12/site-packages/transformers/generation/utils.py", line 3431, in _sample
outputs = self(**model_inputs, return_dict=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "my_path/miniconda3/envs/open-instruct-run/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "my_path/miniconda3/envs/open-instruct-run/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "my_path/miniconda3/envs/open-instruct-run/lib/python3.12/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
  File "my_path/miniconda3/envs/open-instruct-run/lib/python3.12/site-packages/transformers/models/llava_onevision/modeling_llava_onevision.py", line 721, in forward
    raise ValueError(
ValueError: Image features and image tokens do not match: tokens: 0, features 2709
```
I am really not sure where it might be going wrong, thank you so much!!
### Expected behavior
Should generate normally.
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37471/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/huggingface/transformers/issues/37471/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37470
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37470/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37470/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37470/events
|
https://github.com/huggingface/transformers/issues/37470
| 2,990,725,654
|
I_kwDOCUB6oc6yQtoW
| 37,470
|
modelling_llama -> spda_attention; ValueError: too many values to unpack (expected 4)
|
{
"login": "hpcpony",
"id": 49101811,
"node_id": "MDQ6VXNlcjQ5MTAxODEx",
"avatar_url": "https://avatars.githubusercontent.com/u/49101811?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hpcpony",
"html_url": "https://github.com/hpcpony",
"followers_url": "https://api.github.com/users/hpcpony/followers",
"following_url": "https://api.github.com/users/hpcpony/following{/other_user}",
"gists_url": "https://api.github.com/users/hpcpony/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hpcpony/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hpcpony/subscriptions",
"organizations_url": "https://api.github.com/users/hpcpony/orgs",
"repos_url": "https://api.github.com/users/hpcpony/repos",
"events_url": "https://api.github.com/users/hpcpony/events{/privacy}",
"received_events_url": "https://api.github.com/users/hpcpony/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-12T21:20:25
| 2025-04-17T13:51:14
| 2025-04-17T13:51:14
|
NONE
| null | null | null | null |
### System Info
torch = 2.6.0
transformers = 4.51.0 and 4.51.2
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
I'm poking around just trying to do a simple "hello world" (being a newbie and all). I've managed to fill in the blanks and actually get to the point where the Trainer will run. I'm sure there are plenty of problems with what I'm doing and how I'm doing it, but I'm stuck at this error. It's not obvious what I'm doing to cause it.
With the following code...
```
# Util.py
from transformers import LlamaConfig, LlamaForCausalLM
from transformers import AutoTokenizer, AutoModelForCausalLM
from datasets import Dataset
import os
import pprint  # used in load_dataset when TRACE is enabled
##
## Load an empty model
##
def load_model(model_dir, TRACE=False) :
rank = os.environ.get('RANK')
# get just the config from the pretrained directory
if TRACE : print("[{}] Loading Configuration".format(rank))
config = LlamaConfig.from_pretrained(model_dir)
if TRACE : print("Config:\n", config)
# create an empty model based on just the config (i.e. no weights)
if TRACE : print("[{}] Loading Model".format(rank))
model = LlamaForCausalLM(config)
if TRACE : print("Model:\n", model)
# get the tokenizer
print("[{}] Loading Tokenizer".format(rank))
tokenizer = AutoTokenizer.from_pretrained(model_dir)
tokenizer.pad_token = "<|finetune_right_pad_id|>" # for PreTrainedTokenizerFast and Llama 3.1
if TRACE : print("Tokenizer:\n", tokenizer)
# return model and tokenizer
return model, tokenizer
##
## Create some training data
##
def load_dataset(TRACE=False) :
rank = os.environ.get('RANK')
if TRACE : print("[{}] Loading Dataset".format(rank))
# make up data with a couple of entries
# dataset = Dataset.from_dict({
# "input_ids":["The quick brown fox jumped over the lazy dog's back"]*20
# })
dataset = Dataset.from_dict({
"text":[["The quick brown fox jumped over the lazy dog's back"*20]]
})
# synthesize a bogus dataset with 'train' and 'test' sets
datasets = {
'train': dataset,
'test' : dataset
}
if TRACE :
print("Dataset\n")
print("train\n")
pprint.pp(datasets['train'][0:5])
print("test\n")
pprint.pp(datasets['test'][0:5])
return datasets
```
and...
```
# 00_trainer_8B.py
import os
import time
import Util
from transformers import Trainer, TrainingArguments, DataCollatorWithPadding
model_dir = '/work/AI/Models/NousResearch/Meta-Llama-3.1-8B'
TRACE = True
from torch.distributed.elastic.multiprocessing.errors import record
import torch.distributed
from datetime import timedelta
@record
def main():
# setup process group to not time out 'cause our slow systems
torch.distributed.init_process_group(
timeout=timedelta(minutes=20)
)
rank = os.environ.get('RANK')
# get empty model and tokenizer
model, tokenizer = Util.load_model(model_dir, TRACE=TRACE)
# get bogus dataset
dataset = Util.load_dataset(TRACE=False)
# setup the model to the GPU
if TRACE : print("[{}] Moving model to GPU".format(rank))
model.to('cuda')
##
## Do the training
## from: https://huggingface.co/docs/transformers/en/trainer
##
if TRACE : print("[{}] Setting up training".format(rank))
# setup the parameters for the training
training_args = TrainingArguments(
output_dir="outputs/"+model_dir.split('/')[-1],
learning_rate=2e-5,
per_device_train_batch_size=4,
per_device_eval_batch_size=4,
num_train_epochs=4,
weight_decay=0.01,
eval_strategy="epoch",
save_strategy="no", # no, epoch, steps, best
bf16=True,
#label_names=['text'], # huh?
#local_rank= ,
#load_best_model_at_end=True,
push_to_hub=False,
)
# Tokenize the datasets
dataset['train'] = dataset['train'].map(lambda x: tokenizer(x['text'])) # tokenize explicitly
dataset['test'] = dataset['train'].map(lambda x: tokenizer(x['text']))
print("Train> ", dataset['train'])
print("Test> ", dataset['test'])
# instantiate the Trainer
data_collator = DataCollatorWithPadding(tokenizer) # default
trainer = Trainer(
model=model,
args=training_args,
train_dataset=dataset["train"],
eval_dataset=dataset["test"],
processing_class=tokenizer,
data_collator=data_collator,
#compute_metrics=compute_metrics,
)
# Train...
if TRACE : print("[{}] Training Start...".format(rank))
trainer.train()
if TRACE : print("[{}] ...Training Done".format(rank))
import torch
if __name__ == "__main__" :
print("CUDA is available ",torch.cuda.is_available())
print("Devices ", torch.cuda.device_count())
main()
```
Here's execution....
```
[gpu:trl] torchrun --standalone --nnodes 1 00_trainer_8B.py
CUDA is available True
Devices 8
[0] Loading Configuration
Config:
LlamaConfig {
"architectures": [
"LlamaForCausalLM"
],
"attention_bias": false,
"attention_dropout": 0.0,
"bos_token_id": 128000,
"eos_token_id": 128001,
"head_dim": 128,
"hidden_act": "silu",
"hidden_size": 4096,
"initializer_range": 0.02,
"intermediate_size": 14336,
"max_position_embeddings": 131072,
"mlp_bias": false,
"model_type": "llama",
"num_attention_heads": 32,
"num_hidden_layers": 32,
"num_key_value_heads": 8,
"pretraining_tp": 1,
"rms_norm_eps": 1e-05,
"rope_scaling": {
"factor": 8.0,
"high_freq_factor": 4.0,
"low_freq_factor": 1.0,
"original_max_position_embeddings": 8192,
"rope_type": "llama3"
},
"rope_theta": 500000.0,
"tie_word_embeddings": false,
"torch_dtype": "bfloat16",
"transformers_version": "4.51.2",
"use_cache": true,
"vocab_size": 128256
}
[0] Loading Model
Model:
LlamaForCausalLM(
(model): LlamaModel(
(embed_tokens): Embedding(128256, 4096)
(layers): ModuleList(
(0-31): 32 x LlamaDecoderLayer(
(self_attn): LlamaAttention(
(q_proj): Linear(in_features=4096, out_features=4096, bias=False)
(k_proj): Linear(in_features=4096, out_features=1024, bias=False)
(v_proj): Linear(in_features=4096, out_features=1024, bias=False)
(o_proj): Linear(in_features=4096, out_features=4096, bias=False)
)
(mlp): LlamaMLP(
(gate_proj): Linear(in_features=4096, out_features=14336, bias=False)
(up_proj): Linear(in_features=4096, out_features=14336, bias=False)
(down_proj): Linear(in_features=14336, out_features=4096, bias=False)
(act_fn): SiLU()
)
(input_layernorm): LlamaRMSNorm((4096,), eps=1e-05)
(post_attention_layernorm): LlamaRMSNorm((4096,), eps=1e-05)
)
)
(norm): LlamaRMSNorm((4096,), eps=1e-05)
(rotary_emb): LlamaRotaryEmbedding()
)
(lm_head): Linear(in_features=4096, out_features=128256, bias=False)
)
[0] Loading Tokenizer
Tokenizer:
PreTrainedTokenizerFast(name_or_path='/work/AI/Models/NousResearch/Meta-Llama-3.1-8B', vocab_size=128000, model_max_length=131072, is_fast=True, padding_side='right', truncation_side='right', special_tokens={'bos_token': '<|begin_of_text|>', 'eos_token': '<|end_of_text|>', 'pad_token': '<|finetune_right_pad_id|>'}, clean_up_tokenization_spaces=True, added_tokens_decoder={
128000: AddedToken("<|begin_of_text|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128001: AddedToken("<|end_of_text|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128002: AddedToken("<|reserved_special_token_0|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128003: AddedToken("<|reserved_special_token_1|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128004: AddedToken("<|finetune_right_pad_id|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128005: AddedToken("<|reserved_special_token_2|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128006: AddedToken("<|start_header_id|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128007: AddedToken("<|end_header_id|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128008: AddedToken("<|eom_id|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128009: AddedToken("<|eot_id|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128010: AddedToken("<|python_tag|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128011: AddedToken("<|reserved_special_token_3|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128012: AddedToken("<|reserved_special_token_4|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128013: AddedToken("<|reserved_special_token_5|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128014: AddedToken("<|reserved_special_token_6|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128015: AddedToken("<|reserved_special_token_7|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128016: AddedToken("<|reserved_special_token_8|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128017: AddedToken("<|reserved_special_token_9|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128018: AddedToken("<|reserved_special_token_10|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128019: AddedToken("<|reserved_special_token_11|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128020: AddedToken("<|reserved_special_token_12|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128021: AddedToken("<|reserved_special_token_13|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128022: AddedToken("<|reserved_special_token_14|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128023: AddedToken("<|reserved_special_token_15|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128024: AddedToken("<|reserved_special_token_16|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128025: AddedToken("<|reserved_special_token_17|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128026: AddedToken("<|reserved_special_token_18|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128027: AddedToken("<|reserved_special_token_19|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128028: AddedToken("<|reserved_special_token_20|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128029: AddedToken("<|reserved_special_token_21|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128030: AddedToken("<|reserved_special_token_22|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128031: AddedToken("<|reserved_special_token_23|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128032: AddedToken("<|reserved_special_token_24|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128033: AddedToken("<|reserved_special_token_25|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128034: AddedToken("<|reserved_special_token_26|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128035: AddedToken("<|reserved_special_token_27|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128036: AddedToken("<|reserved_special_token_28|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128037: AddedToken("<|reserved_special_token_29|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128038: AddedToken("<|reserved_special_token_30|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128039: AddedToken("<|reserved_special_token_31|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128040: AddedToken("<|reserved_special_token_32|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128041: AddedToken("<|reserved_special_token_33|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128042: AddedToken("<|reserved_special_token_34|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128043: AddedToken("<|reserved_special_token_35|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128044: AddedToken("<|reserved_special_token_36|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128045: AddedToken("<|reserved_special_token_37|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128046: AddedToken("<|reserved_special_token_38|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128047: AddedToken("<|reserved_special_token_39|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128048: AddedToken("<|reserved_special_token_40|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128049: AddedToken("<|reserved_special_token_41|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128050: AddedToken("<|reserved_special_token_42|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128051: AddedToken("<|reserved_special_token_43|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128052: AddedToken("<|reserved_special_token_44|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128053: AddedToken("<|reserved_special_token_45|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128054: AddedToken("<|reserved_special_token_46|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128055: AddedToken("<|reserved_special_token_47|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128056: AddedToken("<|reserved_special_token_48|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128057: AddedToken("<|reserved_special_token_49|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128058: AddedToken("<|reserved_special_token_50|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128059: AddedToken("<|reserved_special_token_51|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128060: AddedToken("<|reserved_special_token_52|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128061: AddedToken("<|reserved_special_token_53|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128062: AddedToken("<|reserved_special_token_54|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128063: AddedToken("<|reserved_special_token_55|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128064: AddedToken("<|reserved_special_token_56|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128065: AddedToken("<|reserved_special_token_57|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128066: AddedToken("<|reserved_special_token_58|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128067: AddedToken("<|reserved_special_token_59|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128068: AddedToken("<|reserved_special_token_60|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128069: AddedToken("<|reserved_special_token_61|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128070: AddedToken("<|reserved_special_token_62|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128071: AddedToken("<|reserved_special_token_63|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128072: AddedToken("<|reserved_special_token_64|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128073: AddedToken("<|reserved_special_token_65|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128074: AddedToken("<|reserved_special_token_66|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128075: AddedToken("<|reserved_special_token_67|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128076: AddedToken("<|reserved_special_token_68|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128077: AddedToken("<|reserved_special_token_69|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128078: AddedToken("<|reserved_special_token_70|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128079: AddedToken("<|reserved_special_token_71|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128080: AddedToken("<|reserved_special_token_72|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128081: AddedToken("<|reserved_special_token_73|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128082: AddedToken("<|reserved_special_token_74|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128083: AddedToken("<|reserved_special_token_75|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128084: AddedToken("<|reserved_special_token_76|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128085: AddedToken("<|reserved_special_token_77|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128086: AddedToken("<|reserved_special_token_78|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128087: AddedToken("<|reserved_special_token_79|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128088: AddedToken("<|reserved_special_token_80|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128089: AddedToken("<|reserved_special_token_81|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128090: AddedToken("<|reserved_special_token_82|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128091: AddedToken("<|reserved_special_token_83|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128092: AddedToken("<|reserved_special_token_84|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128093: AddedToken("<|reserved_special_token_85|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128094: AddedToken("<|reserved_special_token_86|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128095: AddedToken("<|reserved_special_token_87|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128096: AddedToken("<|reserved_special_token_88|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128097: AddedToken("<|reserved_special_token_89|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128098: AddedToken("<|reserved_special_token_90|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128099: AddedToken("<|reserved_special_token_91|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128100: AddedToken("<|reserved_special_token_92|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128101: AddedToken("<|reserved_special_token_93|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128102: AddedToken("<|reserved_special_token_94|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128103: AddedToken("<|reserved_special_token_95|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128104: AddedToken("<|reserved_special_token_96|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128105: AddedToken("<|reserved_special_token_97|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128106: AddedToken("<|reserved_special_token_98|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128107: AddedToken("<|reserved_special_token_99|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128108: AddedToken("<|reserved_special_token_100|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128109: AddedToken("<|reserved_special_token_101|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128110: AddedToken("<|reserved_special_token_102|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128111: AddedToken("<|reserved_special_token_103|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128112: AddedToken("<|reserved_special_token_104|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128113: AddedToken("<|reserved_special_token_105|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128114: AddedToken("<|reserved_special_token_106|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128115: AddedToken("<|reserved_special_token_107|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128116: AddedToken("<|reserved_special_token_108|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128117: AddedToken("<|reserved_special_token_109|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128118: AddedToken("<|reserved_special_token_110|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128119: AddedToken("<|reserved_special_token_111|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128120: AddedToken("<|reserved_special_token_112|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128121: AddedToken("<|reserved_special_token_113|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128122: AddedToken("<|reserved_special_token_114|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128123: AddedToken("<|reserved_special_token_115|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128124: AddedToken("<|reserved_special_token_116|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128125: AddedToken("<|reserved_special_token_117|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128126: AddedToken("<|reserved_special_token_118|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128127: AddedToken("<|reserved_special_token_119|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128128: AddedToken("<|reserved_special_token_120|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128129: AddedToken("<|reserved_special_token_121|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128130: AddedToken("<|reserved_special_token_122|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128131: AddedToken("<|reserved_special_token_123|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128132: AddedToken("<|reserved_special_token_124|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128133: AddedToken("<|reserved_special_token_125|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128134: AddedToken("<|reserved_special_token_126|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128135: AddedToken("<|reserved_special_token_127|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128136: AddedToken("<|reserved_special_token_128|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128137: AddedToken("<|reserved_special_token_129|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128138: AddedToken("<|reserved_special_token_130|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128139: AddedToken("<|reserved_special_token_131|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128140: AddedToken("<|reserved_special_token_132|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128141: AddedToken("<|reserved_special_token_133|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128142: AddedToken("<|reserved_special_token_134|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128143: AddedToken("<|reserved_special_token_135|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128144: AddedToken("<|reserved_special_token_136|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128145: AddedToken("<|reserved_special_token_137|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128146: AddedToken("<|reserved_special_token_138|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128147: AddedToken("<|reserved_special_token_139|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128148: AddedToken("<|reserved_special_token_140|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128149: AddedToken("<|reserved_special_token_141|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128150: AddedToken("<|reserved_special_token_142|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128151: AddedToken("<|reserved_special_token_143|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128152: AddedToken("<|reserved_special_token_144|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128153: AddedToken("<|reserved_special_token_145|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128154: AddedToken("<|reserved_special_token_146|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128155: AddedToken("<|reserved_special_token_147|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128156: AddedToken("<|reserved_special_token_148|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128157: AddedToken("<|reserved_special_token_149|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128158: AddedToken("<|reserved_special_token_150|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128159: AddedToken("<|reserved_special_token_151|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128160: AddedToken("<|reserved_special_token_152|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128161: AddedToken("<|reserved_special_token_153|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128162: AddedToken("<|reserved_special_token_154|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128163: AddedToken("<|reserved_special_token_155|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128164: AddedToken("<|reserved_special_token_156|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128165: AddedToken("<|reserved_special_token_157|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128166: AddedToken("<|reserved_special_token_158|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128167: AddedToken("<|reserved_special_token_159|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128168: AddedToken("<|reserved_special_token_160|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128169: AddedToken("<|reserved_special_token_161|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128170: AddedToken("<|reserved_special_token_162|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128171: AddedToken("<|reserved_special_token_163|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128172: AddedToken("<|reserved_special_token_164|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128173: AddedToken("<|reserved_special_token_165|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128174: AddedToken("<|reserved_special_token_166|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128175: AddedToken("<|reserved_special_token_167|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128176: AddedToken("<|reserved_special_token_168|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128177: AddedToken("<|reserved_special_token_169|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128178: AddedToken("<|reserved_special_token_170|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128179: AddedToken("<|reserved_special_token_171|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128180: AddedToken("<|reserved_special_token_172|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128181: AddedToken("<|reserved_special_token_173|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128182: AddedToken("<|reserved_special_token_174|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128183: AddedToken("<|reserved_special_token_175|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128184: AddedToken("<|reserved_special_token_176|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128185: AddedToken("<|reserved_special_token_177|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128186: AddedToken("<|reserved_special_token_178|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128187: AddedToken("<|reserved_special_token_179|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128188: AddedToken("<|reserved_special_token_180|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128189: AddedToken("<|reserved_special_token_181|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128190: AddedToken("<|reserved_special_token_182|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128191: AddedToken("<|reserved_special_token_183|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128192: AddedToken("<|reserved_special_token_184|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128193: AddedToken("<|reserved_special_token_185|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128194: AddedToken("<|reserved_special_token_186|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128195: AddedToken("<|reserved_special_token_187|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128196: AddedToken("<|reserved_special_token_188|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128197: AddedToken("<|reserved_special_token_189|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128198: AddedToken("<|reserved_special_token_190|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128199: AddedToken("<|reserved_special_token_191|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128200: AddedToken("<|reserved_special_token_192|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128201: AddedToken("<|reserved_special_token_193|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128202: AddedToken("<|reserved_special_token_194|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128203: AddedToken("<|reserved_special_token_195|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128204: AddedToken("<|reserved_special_token_196|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128205: AddedToken("<|reserved_special_token_197|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128206: AddedToken("<|reserved_special_token_198|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128207: AddedToken("<|reserved_special_token_199|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128208: AddedToken("<|reserved_special_token_200|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128209: AddedToken("<|reserved_special_token_201|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128210: AddedToken("<|reserved_special_token_202|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128211: AddedToken("<|reserved_special_token_203|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128212: AddedToken("<|reserved_special_token_204|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128213: AddedToken("<|reserved_special_token_205|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128214: AddedToken("<|reserved_special_token_206|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128215: AddedToken("<|reserved_special_token_207|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128216: AddedToken("<|reserved_special_token_208|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128217: AddedToken("<|reserved_special_token_209|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128218: AddedToken("<|reserved_special_token_210|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128219: AddedToken("<|reserved_special_token_211|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128220: AddedToken("<|reserved_special_token_212|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128221: AddedToken("<|reserved_special_token_213|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128222: AddedToken("<|reserved_special_token_214|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128223: AddedToken("<|reserved_special_token_215|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128224: AddedToken("<|reserved_special_token_216|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128225: AddedToken("<|reserved_special_token_217|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128226: AddedToken("<|reserved_special_token_218|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128227: AddedToken("<|reserved_special_token_219|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128228: AddedToken("<|reserved_special_token_220|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128229: AddedToken("<|reserved_special_token_221|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128230: AddedToken("<|reserved_special_token_222|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128231: AddedToken("<|reserved_special_token_223|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128232: AddedToken("<|reserved_special_token_224|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128233: AddedToken("<|reserved_special_token_225|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128234: AddedToken("<|reserved_special_token_226|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128235: AddedToken("<|reserved_special_token_227|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128236: AddedToken("<|reserved_special_token_228|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128237: AddedToken("<|reserved_special_token_229|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128238: AddedToken("<|reserved_special_token_230|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128239: AddedToken("<|reserved_special_token_231|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128240: AddedToken("<|reserved_special_token_232|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128241: AddedToken("<|reserved_special_token_233|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128242: AddedToken("<|reserved_special_token_234|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128243: AddedToken("<|reserved_special_token_235|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128244: AddedToken("<|reserved_special_token_236|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128245: AddedToken("<|reserved_special_token_237|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128246: AddedToken("<|reserved_special_token_238|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128247: AddedToken("<|reserved_special_token_239|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128248: AddedToken("<|reserved_special_token_240|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128249: AddedToken("<|reserved_special_token_241|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128250: AddedToken("<|reserved_special_token_242|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128251: AddedToken("<|reserved_special_token_243|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128252: AddedToken("<|reserved_special_token_244|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128253: AddedToken("<|reserved_special_token_245|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128254: AddedToken("<|reserved_special_token_246|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
128255: AddedToken("<|reserved_special_token_247|>", rstrip=False, lstrip=False, single_word=False, normalized=False, special=True),
}
)
[0] Moving model to GPU
[0] Setting up training
Map: 100%|█████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 20.59 examples/s]
Map: 100%|████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 261.38 examples/s]
Train> Dataset({
features: ['text', 'input_ids', 'attention_mask'],
num_rows: 1
})
Test> Dataset({
features: ['text', 'input_ids', 'attention_mask'],
num_rows: 1
})
[0] Training Start...
0%| | 0/4 [00:00<?, ?it/s][rank0]: Traceback (most recent call last):
[rank0]: File "/work/trl/00_trainer_8B.py", line 75, in <module>
[rank0]: main()
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
[rank0]: return f(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^
[rank0]: File "/work/trl/00_trainer_8B.py", line 68, in main
[rank0]: trainer.train()
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/trainer.py", line 2245, in train
[rank0]: return inner_training_loop(
[rank0]: ^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/trainer.py", line 2560, in _inner_training_loop
[rank0]: tr_loss_step = self.training_step(model, inputs, num_items_in_batch)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/trainer.py", line 3736, in training_step
[rank0]: loss = self.compute_loss(model, inputs, num_items_in_batch=num_items_in_batch)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/trainer.py", line 3801, in compute_loss
[rank0]: outputs = model(**inputs)
[rank0]: ^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
[rank0]: return self._call_impl(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
[rank0]: return forward_call(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/parallel/distributed.py", line 1643, in forward
[rank0]: else self._run_ddp_forward(*inputs, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/parallel/distributed.py", line 1459, in _run_ddp_forward
[rank0]: return self.module(*inputs, **kwargs) # type: ignore[index]
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
[rank0]: return self._call_impl(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
[rank0]: return forward_call(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/accelerate/utils/operations.py", line 814, in forward
[rank0]: return model_forward(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/accelerate/utils/operations.py", line 802, in __call__
[rank0]: return convert_to_fp32(self.model_forward(*args, **kwargs))
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/amp/autocast_mode.py", line 44, in decorate_autocast
[rank0]: return func(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/utils/generic.py", line 965, in wrapper
[rank0]: output = func(self, *args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
[rank0]: return func(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/models/llama/modeling_llama.py", line 821, in forward
[rank0]: outputs: BaseModelOutputWithPast = self.model(
[rank0]: ^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
[rank0]: return self._call_impl(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
[rank0]: return forward_call(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/utils/generic.py", line 965, in wrapper
[rank0]: output = func(self, *args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/models/llama/modeling_llama.py", line 571, in forward
[rank0]: layer_outputs = decoder_layer(
[rank0]: ^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
[rank0]: return self._call_impl(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
[rank0]: return forward_call(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/models/llama/modeling_llama.py", line 318, in forward
[rank0]: hidden_states, self_attn_weights = self.self_attn(
[rank0]: ^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
[rank0]: return self._call_impl(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
[rank0]: return forward_call(*args, **kwargs)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/models/llama/modeling_llama.py", line 274, in forward
[rank0]: attn_output, attn_weights = attention_interface(
[rank0]: ^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/integrations/sdpa_attention.py", line 30, in sdpa_attention_forward
[rank0]: key = repeat_kv(key, module.num_key_value_groups)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/integrations/sdpa_attention.py", line 11, in repeat_kv
[rank0]: batch, num_key_value_heads, slen, head_dim = hidden_states.shape
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: ValueError: too many values to unpack (expected 4)
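
For reference, the failing line in `transformers/integrations/sdpa_attention.py` unpacks the key/value tensor into exactly four dimensions `(batch, num_key_value_heads, seq_len, head_dim)`, so any extra leading dimension on the hidden states triggers this exact `ValueError`. A minimal sketch (paraphrasing the `repeat_kv` helper, not a verbatim copy, and using made-up shapes) that reproduces the failure:

```python
import torch

def repeat_kv(hidden_states: torch.Tensor, n_rep: int) -> torch.Tensor:
    # Mirrors the transformers helper: expects a 4-D
    # (batch, num_key_value_heads, seq_len, head_dim) tensor.
    batch, num_key_value_heads, slen, head_dim = hidden_states.shape
    if n_rep == 1:
        return hidden_states
    hidden_states = hidden_states[:, :, None, :, :].expand(
        batch, num_key_value_heads, n_rep, slen, head_dim
    )
    return hidden_states.reshape(batch, num_key_value_heads * n_rep, slen, head_dim)

# A well-formed 4-D key tensor works (shapes here are illustrative):
key = torch.randn(1, 8, 16, 64)
out = repeat_kv(key, 4)
print(out.shape)  # torch.Size([1, 32, 16, 64])

# A tensor with a stray extra dimension reproduces the traceback's error:
bad_key = torch.randn(1, 1, 8, 16, 64)
try:
    repeat_kv(bad_key, 4)
except ValueError as e:
    print(e)  # too many values to unpack (expected 4)
```

This suggests the model forward received hidden states with an unexpected extra dimension (e.g. an input batch that was nested one level too deep), rather than a bug inside `repeat_kv` itself.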
0%| | 0/4 [00:00<?, ?it/s]
[rank0]:[W412 16:57:36.231382427 ProcessGroupNCCL.cpp:1496] Warning: WARNING: destroy_process_group() was not called before program exit, which can leak resources. For more info, please see https://pytorch.org/docs/stable/distributed.html#shutdown (function operator())
E0412 16:57:44.945000 263707 /opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/api.py:869] failed (exitcode: 1) local_rank: 0 (pid: 263778) of binary: /opt/AI/trl-0.16.1/bin/python
E0412 16:57:44.963000 263707 /opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/error_handler.py:141] no error file defined for parent, to copy child error file (/tmp/torchelastic_4cfdgcr3/2cd61db4-dfff-458a-9ec7-954de88c8652_ms05csxo/attempt_0/0/error.json)
Traceback (most recent call last):
File "/opt/AI/trl-0.16.1/bin/torchrun", line 8, in <module>
sys.exit(main())
^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/distributed/run.py", line 918, in main
run(args)
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/distributed/run.py", line 909, in run
elastic_launch(
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 138, in __call__
return launch_agent(self._config, self._entrypoint, list(args))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/distributed/launcher/api.py", line 269, in launch_agent
raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
00_trainer_8B.py FAILED
------------------------------------------------------------
Failures:
<NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
time : 2025-04-12_16:57:35
host : gpu.super.org
rank : 0 (local_rank: 0)
exitcode : 1 (pid: 263778)
error_file: /tmp/torchelastic_4cfdgcr3/2cd61db4-dfff-458a-9ec7-954de88c8652_ms05csxo/attempt_0/0/error.json
traceback : Traceback (most recent call last):
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/work/trl/00_trainer_8B.py", line 68, in main
trainer.train()
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/trainer.py", line 2245, in train
return inner_training_loop(
^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/trainer.py", line 2560, in _inner_training_loop
tr_loss_step = self.training_step(model, inputs, num_items_in_batch)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/trainer.py", line 3736, in training_step
loss = self.compute_loss(model, inputs, num_items_in_batch=num_items_in_batch)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/trainer.py", line 3801, in compute_loss
outputs = model(**inputs)
^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/parallel/distributed.py", line 1643, in forward
else self._run_ddp_forward(*inputs, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/parallel/distributed.py", line 1459, in _run_ddp_forward
return self.module(*inputs, **kwargs) # type: ignore[index]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/accelerate/utils/operations.py", line 814, in forward
return model_forward(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/accelerate/utils/operations.py", line 802, in __call__
return convert_to_fp32(self.model_forward(*args, **kwargs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/amp/autocast_mode.py", line 44, in decorate_autocast
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/utils/generic.py", line 965, in wrapper
output = func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/models/llama/modeling_llama.py", line 821, in forward
outputs: BaseModelOutputWithPast = self.model(
^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/utils/generic.py", line 965, in wrapper
output = func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/models/llama/modeling_llama.py", line 571, in forward
layer_outputs = decoder_layer(
^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/models/llama/modeling_llama.py", line 318, in forward
hidden_states, self_attn_weights = self.self_attn(
^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/models/llama/modeling_llama.py", line 274, in forward
attn_output, attn_weights = attention_interface(
^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/integrations/sdpa_attention.py", line 30, in sdpa_attention_forward
key = repeat_kv(key, module.num_key_value_groups)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/AI/trl-0.16.1/lib/python3.12/site-packages/transformers/integrations/sdpa_attention.py", line 11, in repeat_kv
batch, num_key_value_heads, slen, head_dim = hidden_states.shape
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ValueError: too many values to unpack (expected 4)
============================================================
```
I poked around, and the value of `hidden_states.shape` is `torch.Size([1, 221, 1, 8, 128])`.
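The failure can be reproduced in isolation: `repeat_kv` unpacks the shape into exactly four values, so the extra dimension raises this error. A minimal, torch-free sketch using the 5-D shape observed above (the real object is a tensor, not a tuple):

```python
# Minimal reproduction of the unpack failure in repeat_kv.
# repeat_kv expects a 4-D (batch, num_key_value_heads, seq_len, head_dim)
# shape; the shape seen in the debugger has one extra dimension.
observed_shape = (1, 221, 1, 8, 128)  # from the debugger output above

try:
    batch, num_key_value_heads, slen, head_dim = observed_shape
except ValueError as err:
    print(err)  # too many values to unpack (expected 4)
```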
### Expected behavior
An informative/useful error message, or no ValueError at all
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37470/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37470/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37469
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37469/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37469/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37469/events
|
https://github.com/huggingface/transformers/issues/37469
| 2,990,700,876
|
I_kwDOCUB6oc6yQnlM
| 37,469
|
apply_chat_template() function, in particular with the chat_template = "rag"
|
{
"login": "willxxy",
"id": 90741489,
"node_id": "MDQ6VXNlcjkwNzQxNDg5",
"avatar_url": "https://avatars.githubusercontent.com/u/90741489?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/willxxy",
"html_url": "https://github.com/willxxy",
"followers_url": "https://api.github.com/users/willxxy/followers",
"following_url": "https://api.github.com/users/willxxy/following{/other_user}",
"gists_url": "https://api.github.com/users/willxxy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/willxxy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/willxxy/subscriptions",
"organizations_url": "https://api.github.com/users/willxxy/orgs",
"repos_url": "https://api.github.com/users/willxxy/repos",
"events_url": "https://api.github.com/users/willxxy/events{/privacy}",
"received_events_url": "https://api.github.com/users/willxxy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-12T20:15:50
| 2025-04-14T14:33:47
| 2025-04-14T14:33:45
|
NONE
| null | null | null | null |
I am trying to figure out the best way to format the chat template for RAG. I am following the tutorial [here](https://huggingface.co/docs/transformers/main/en/chat_extras). The only difference is that the tutorial uses the model card `CohereForAI/c4ai-command-r-v01-4bit`. When I follow the tutorial with a different model card, in this example `meta-llama/Llama-3.2-1B-Instruct`, the printed output of `print(tokenizer.decode(input_ids))` is just "rag". Is this expected behavior? I am guessing this may be because RAG is not supported for Llama 3.2 1B Instruct per [the documentation in this code](https://github.com/huggingface/transformers/blob/main/src/transformers/tokenization_utils_base.py#L1530). I read this and looked at the [linked website](https://huggingface.co/docs/transformers/main/en/chat_templating#automated-function-conversion-for-tool-use) in the documentation, but I am still confused. Right now I am constructing the prompt myself, but I wanted to know the best practice for injecting the RAG content (e.g., in the system prompt? before the user query? etc.)
```
documents = [
{
"title": "The Moon: Our Age-Old Foe",
"text": "Man has always dreamed of destroying the moon. In this essay, I shall..."
},
{
"title": "The Sun: Our Age-Old Friend",
"text": "Although often underappreciated, the sun provides several notable benefits..."
}
]
from transformers import AutoTokenizer, AutoModelForCausalLM
# Load the model and tokenizer
model_card = "meta-llama/Llama-3.2-1B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_card)
model = AutoModelForCausalLM.from_pretrained(model_card, device_map="auto")
device = model.device # Get the device the model is loaded on
# Define conversation input
conversation = [
{"role": "user", "content": "What has Man always dreamed of?"}
]
input_ids = tokenizer.apply_chat_template(
conversation=conversation,
documents=documents,
chat_template="rag",
tokenize=True,
add_generation_prompt=True,
return_tensors="pt").to(device)
print(tokenizer.decode(input_ids))
# Generate a response
generated_tokens = model.generate(
input_ids,
max_new_tokens=100,
do_sample=True,
temperature=0.3,
)
# Decode and print the generated text along with generation prompt
generated_text = tokenizer.decode(generated_tokens[0])
print(generated_text)
```
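For context on why the decoded output is literally "rag": when the string passed as `chat_template` does not match a named template shipped with the tokenizer, the string itself ends up being treated as the Jinja template source. A rough, hypothetical simplification of that lookup logic (not the actual library code):

```python
def resolve_chat_template(chat_template, named_templates):
    # Hypothetical simplification: a known template name resolves to its
    # template body; an unknown string is used verbatim as template source.
    if chat_template in named_templates:
        return named_templates[chat_template]
    return chat_template

# Llama-3.2-1B-Instruct ships no "rag" template, so the literal string
# "rag" becomes the template and renders as the output "rag".
print(resolve_chat_template("rag", {"default": "<default template>"}))  # → rag
```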
|
{
"login": "willxxy",
"id": 90741489,
"node_id": "MDQ6VXNlcjkwNzQxNDg5",
"avatar_url": "https://avatars.githubusercontent.com/u/90741489?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/willxxy",
"html_url": "https://github.com/willxxy",
"followers_url": "https://api.github.com/users/willxxy/followers",
"following_url": "https://api.github.com/users/willxxy/following{/other_user}",
"gists_url": "https://api.github.com/users/willxxy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/willxxy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/willxxy/subscriptions",
"organizations_url": "https://api.github.com/users/willxxy/orgs",
"repos_url": "https://api.github.com/users/willxxy/repos",
"events_url": "https://api.github.com/users/willxxy/events{/privacy}",
"received_events_url": "https://api.github.com/users/willxxy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37469/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37469/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37468
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37468/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37468/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37468/events
|
https://github.com/huggingface/transformers/pull/37468
| 2,990,578,803
|
PR_kwDOCUB6oc6SXAhb
| 37,468
|
Llama4: remove redundant transpose of router_logits
|
{
"login": "pbelevich",
"id": 1160355,
"node_id": "MDQ6VXNlcjExNjAzNTU=",
"avatar_url": "https://avatars.githubusercontent.com/u/1160355?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pbelevich",
"html_url": "https://github.com/pbelevich",
"followers_url": "https://api.github.com/users/pbelevich/followers",
"following_url": "https://api.github.com/users/pbelevich/following{/other_user}",
"gists_url": "https://api.github.com/users/pbelevich/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pbelevich/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pbelevich/subscriptions",
"organizations_url": "https://api.github.com/users/pbelevich/orgs",
"repos_url": "https://api.github.com/users/pbelevich/repos",
"events_url": "https://api.github.com/users/pbelevich/events{/privacy}",
"received_events_url": "https://api.github.com/users/pbelevich/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-12T17:08:20
| 2025-04-15T11:29:27
| 2025-04-15T11:29:27
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37468",
"html_url": "https://github.com/huggingface/transformers/pull/37468",
"diff_url": "https://github.com/huggingface/transformers/pull/37468.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37468.patch",
"merged_at": "2025-04-15T11:29:27"
}
|
# What does this PR do?
This PR removes a redundant transpose of `router_logits`.
## Who can review?
cc @ArthurZucker
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37468/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37468/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37467
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37467/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37467/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37467/events
|
https://github.com/huggingface/transformers/pull/37467
| 2,990,548,581
|
PR_kwDOCUB6oc6SW6vA
| 37,467
|
Update README.md
|
{
"login": "ykoseali",
"id": 110221212,
"node_id": "U_kgDOBpHXnA",
"avatar_url": "https://avatars.githubusercontent.com/u/110221212?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ykoseali",
"html_url": "https://github.com/ykoseali",
"followers_url": "https://api.github.com/users/ykoseali/followers",
"following_url": "https://api.github.com/users/ykoseali/following{/other_user}",
"gists_url": "https://api.github.com/users/ykoseali/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ykoseali/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ykoseali/subscriptions",
"organizations_url": "https://api.github.com/users/ykoseali/orgs",
"repos_url": "https://api.github.com/users/ykoseali/repos",
"events_url": "https://api.github.com/users/ykoseali/events{/privacy}",
"received_events_url": "https://api.github.com/users/ykoseali/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-12T16:07:35
| 2025-04-12T16:12:06
| 2025-04-12T16:12:06
|
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37467",
"html_url": "https://github.com/huggingface/transformers/pull/37467",
"diff_url": "https://github.com/huggingface/transformers/pull/37467.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37467.patch",
"merged_at": null
}
|
This PR fixes a grammatical error at the start of the README.md, improving its readability.
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "ykoseali",
"id": 110221212,
"node_id": "U_kgDOBpHXnA",
"avatar_url": "https://avatars.githubusercontent.com/u/110221212?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ykoseali",
"html_url": "https://github.com/ykoseali",
"followers_url": "https://api.github.com/users/ykoseali/followers",
"following_url": "https://api.github.com/users/ykoseali/following{/other_user}",
"gists_url": "https://api.github.com/users/ykoseali/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ykoseali/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ykoseali/subscriptions",
"organizations_url": "https://api.github.com/users/ykoseali/orgs",
"repos_url": "https://api.github.com/users/ykoseali/repos",
"events_url": "https://api.github.com/users/ykoseali/events{/privacy}",
"received_events_url": "https://api.github.com/users/ykoseali/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37467/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37467/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37466
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37466/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37466/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37466/events
|
https://github.com/huggingface/transformers/pull/37466
| 2,990,537,491
|
PR_kwDOCUB6oc6SW4k8
| 37,466
|
Fixed broken links
|
{
"login": "cypherpepe",
"id": 125112044,
"node_id": "U_kgDOB3UO7A",
"avatar_url": "https://avatars.githubusercontent.com/u/125112044?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cypherpepe",
"html_url": "https://github.com/cypherpepe",
"followers_url": "https://api.github.com/users/cypherpepe/followers",
"following_url": "https://api.github.com/users/cypherpepe/following{/other_user}",
"gists_url": "https://api.github.com/users/cypherpepe/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cypherpepe/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cypherpepe/subscriptions",
"organizations_url": "https://api.github.com/users/cypherpepe/orgs",
"repos_url": "https://api.github.com/users/cypherpepe/repos",
"events_url": "https://api.github.com/users/cypherpepe/events{/privacy}",
"received_events_url": "https://api.github.com/users/cypherpepe/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-12T15:43:27
| 2025-04-14T13:16:07
| 2025-04-14T13:16:07
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37466",
"html_url": "https://github.com/huggingface/transformers/pull/37466",
"diff_url": "https://github.com/huggingface/transformers/pull/37466.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37466.patch",
"merged_at": "2025-04-14T13:16:07"
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Hi! I fixed broken links to the `convert-hf-to-gguf.py` script in the Arabic and Korean documentation files. The script was renamed from `convert-hf-to-gguf.py` to `convert_hf_to_gguf.py` in the llama.cpp repository.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
Documentation: @stevhliu
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37466/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37466/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37465
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37465/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37465/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37465/events
|
https://github.com/huggingface/transformers/issues/37465
| 2,990,502,092
|
I_kwDOCUB6oc6yP3DM
| 37,465
|
support flash-attn feature in llama4
|
{
"login": "gxm651182644",
"id": 7248859,
"node_id": "MDQ6VXNlcjcyNDg4NTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/7248859?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gxm651182644",
"html_url": "https://github.com/gxm651182644",
"followers_url": "https://api.github.com/users/gxm651182644/followers",
"following_url": "https://api.github.com/users/gxm651182644/following{/other_user}",
"gists_url": "https://api.github.com/users/gxm651182644/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gxm651182644/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gxm651182644/subscriptions",
"organizations_url": "https://api.github.com/users/gxm651182644/orgs",
"repos_url": "https://api.github.com/users/gxm651182644/repos",
"events_url": "https://api.github.com/users/gxm651182644/events{/privacy}",
"received_events_url": "https://api.github.com/users/gxm651182644/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] |
open
| false
| null |
[] | null |
[] | 2025-04-12T14:28:25
| 2025-06-24T12:51:50
| null |
NONE
| null | null | null | null |
### Feature request
Llama4 does not support flash_attn, leading to huge GPU memory consumption.
### Motivation
Requested for higher training efficiency.
### Your contribution
No
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37465/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37465/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37464
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37464/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37464/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37464/events
|
https://github.com/huggingface/transformers/issues/37464
| 2,990,455,184
|
I_kwDOCUB6oc6yPrmQ
| 37,464
|
Broken phi4 model
|
{
"login": "JohnConnor123",
"id": 106041597,
"node_id": "U_kgDOBlIQ_Q",
"avatar_url": "https://avatars.githubusercontent.com/u/106041597?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JohnConnor123",
"html_url": "https://github.com/JohnConnor123",
"followers_url": "https://api.github.com/users/JohnConnor123/followers",
"following_url": "https://api.github.com/users/JohnConnor123/following{/other_user}",
"gists_url": "https://api.github.com/users/JohnConnor123/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JohnConnor123/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JohnConnor123/subscriptions",
"organizations_url": "https://api.github.com/users/JohnConnor123/orgs",
"repos_url": "https://api.github.com/users/JohnConnor123/repos",
"events_url": "https://api.github.com/users/JohnConnor123/events{/privacy}",
"received_events_url": "https://api.github.com/users/JohnConnor123/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-12T12:44:05
| 2025-05-21T08:02:39
| 2025-05-21T08:02:38
|
NONE
| null | null | null | null |
### System Info
The Phi-4 tokenizer conversion may be broken on the transformers side.
More info with images here: https://github.com/vllm-project/vllm/issues/16510
### Who can help?
@ArthurZucker @itazap
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
All info here: https://github.com/vllm-project/vllm/issues/16510
### Expected behavior
Anything but that:

|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37464/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37464/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37463
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37463/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37463/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37463/events
|
https://github.com/huggingface/transformers/pull/37463
| 2,990,398,347
|
PR_kwDOCUB6oc6SWdfz
| 37,463
|
Fixing gated repo issues
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-12T10:36:09
| 2025-04-14T15:19:13
| 2025-04-14T15:19:11
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37463",
"html_url": "https://github.com/huggingface/transformers/pull/37463",
"diff_url": "https://github.com/huggingface/transformers/pull/37463.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37463.patch",
"merged_at": "2025-04-14T15:19:11"
}
|
# What does this PR do?
Use an unsloth model as an alternative for gated repos in quark quantization, instead of `require_read_token`.
https://github.com/huggingface/transformers/actions/runs/14415368554/job/40431139879
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37463/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37463/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37462
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37462/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37462/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37462/events
|
https://github.com/huggingface/transformers/pull/37462
| 2,990,231,401
|
PR_kwDOCUB6oc6SV9k4
| 37,462
|
fix: (llama4) fix no_split_modules to be picked up for fsdpv1 and v2 sharding
|
{
"login": "kmehant",
"id": 15800200,
"node_id": "MDQ6VXNlcjE1ODAwMjAw",
"avatar_url": "https://avatars.githubusercontent.com/u/15800200?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kmehant",
"html_url": "https://github.com/kmehant",
"followers_url": "https://api.github.com/users/kmehant/followers",
"following_url": "https://api.github.com/users/kmehant/following{/other_user}",
"gists_url": "https://api.github.com/users/kmehant/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kmehant/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kmehant/subscriptions",
"organizations_url": "https://api.github.com/users/kmehant/orgs",
"repos_url": "https://api.github.com/users/kmehant/repos",
"events_url": "https://api.github.com/users/kmehant/events{/privacy}",
"received_events_url": "https://api.github.com/users/kmehant/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-12T05:27:22
| 2025-04-14T08:44:33
| 2025-04-14T08:44:33
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37462",
"html_url": "https://github.com/huggingface/transformers/pull/37462",
"diff_url": "https://github.com/huggingface/transformers/pull/37462.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37462.patch",
"merged_at": "2025-04-14T08:44:33"
}
|
# What does this PR do?
Currently, FSDP sharding (both v1 and v2) fails to automatically infer which modules to wrap, since the current placement of `no_split_modules` for llama4 makes it inaccessible. We should move `_no_split_modules` to the common base class, i.e. `Llama4PreTrainedModel`.
Though we can get around this by explicitly passing the classes to wrap, downstream users who rely on automatic wrapping inference (traditionally a well-working feature of transformers 😉) would experience high training memory costs for llama4 + FSDP and won't realize it.
When using FSDP activation checkpointing, there is no workaround.
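Conceptually, FSDP's class-based auto-wrap decides per submodule whether it becomes its own FSDP unit; a plain-Python sketch of that decision follows (the class name is illustrative, not taken from the llama4 code):

```python
# Plain-Python sketch of FSDP's class-based auto-wrap decision: a submodule
# is wrapped as its own FSDP unit when its class is in the no-split set.
# The class name below is illustrative.
NO_SPLIT_MODULES = {"Llama4TextDecoderLayer"}

def should_wrap(module_cls_name: str) -> bool:
    return module_cls_name in NO_SPLIT_MODULES

print(should_wrap("Llama4TextDecoderLayer"))  # a decoder layer gets wrapped
print(should_wrap("Linear"))  # a plain linear layer does not
```

If `_no_split_modules` is unreachable from the base class, this set is effectively empty, so nothing gets wrapped automatically.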
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
- text models: @ArthurZucker
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- trainer: @zach-huggingface and @SunMarc
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37462/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37462/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37461
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37461/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37461/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37461/events
|
https://github.com/huggingface/transformers/issues/37461
| 2,990,102,231
|
I_kwDOCUB6oc6yOVbX
| 37,461
|
Convnext image preprocessor raises an AssertionError when comparing logits
|
{
"login": "chandrusuresh",
"id": 6626250,
"node_id": "MDQ6VXNlcjY2MjYyNTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/6626250?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chandrusuresh",
"html_url": "https://github.com/chandrusuresh",
"followers_url": "https://api.github.com/users/chandrusuresh/followers",
"following_url": "https://api.github.com/users/chandrusuresh/following{/other_user}",
"gists_url": "https://api.github.com/users/chandrusuresh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chandrusuresh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chandrusuresh/subscriptions",
"organizations_url": "https://api.github.com/users/chandrusuresh/orgs",
"repos_url": "https://api.github.com/users/chandrusuresh/repos",
"events_url": "https://api.github.com/users/chandrusuresh/events{/privacy}",
"received_events_url": "https://api.github.com/users/chandrusuresh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-12T02:14:43
| 2025-06-19T08:03:22
| 2025-06-19T08:03:22
|
NONE
| null | null | null | null |
### System Info
Convnext image preprocessor raises an AssertionError when comparing the logits computed from the image preprocessor against the expected logits [here](https://github.com/huggingface/transformers/blob/953196a43dae6a3c474165fba7d215fcbc7b7730/src/transformers/models/convnext/convert_convnext_to_pytorch.py#L187).
Are the expected logits correct? What's the source of these values?
Note: I found the bug in the context of working on a PR for #28180 in #37460.
### Who can help?
@NielsRogge
### Reproduction
To reproduce this, simply run: `python3.12 src/transformers/models/convnext/convert_convnext_to_pytorch.py` with appropriate arguments and this will raise an AssertionError as described above.
### Expected behavior
All asserts should pass and the code should run without issues.
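For illustration, the kind of check the conversion script performs can be sketched in plain Python (the real script compares torch tensors; the values below are made up for illustration):

```python
# Plain-Python analogue of the logit comparison in the conversion script;
# the expected/computed values here are made up, not the script's.
def allclose(a, b, atol=1e-3):
    return all(abs(x - y) <= atol for x, y in zip(a, b))

expected = [-0.1210, -0.6605, 0.1918]
computed = [-0.1211, -0.6604, 0.1917]
print(allclose(expected, computed))
```

The AssertionError in the report means this comparison fails, i.e. the hard-coded expected logits no longer match what the converted model produces.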
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37461/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37461/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37460
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37460/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37460/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37460/events
|
https://github.com/huggingface/transformers/pull/37460
| 2,989,682,021
|
PR_kwDOCUB6oc6SUEI7
| 37,460
|
Fix interpolation of convnext image processor
|
{
"login": "chandrusuresh",
"id": 6626250,
"node_id": "MDQ6VXNlcjY2MjYyNTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/6626250?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chandrusuresh",
"html_url": "https://github.com/chandrusuresh",
"followers_url": "https://api.github.com/users/chandrusuresh/followers",
"following_url": "https://api.github.com/users/chandrusuresh/following{/other_user}",
"gists_url": "https://api.github.com/users/chandrusuresh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chandrusuresh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chandrusuresh/subscriptions",
"organizations_url": "https://api.github.com/users/chandrusuresh/orgs",
"repos_url": "https://api.github.com/users/chandrusuresh/repos",
"events_url": "https://api.github.com/users/chandrusuresh/events{/privacy}",
"received_events_url": "https://api.github.com/users/chandrusuresh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-11T20:37:44
| 2025-07-07T12:08:22
| null |
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37460",
"html_url": "https://github.com/huggingface/transformers/pull/37460",
"diff_url": "https://github.com/huggingface/transformers/pull/37460.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37460.patch",
"merged_at": null
}
|
# What does this PR do?
Image processors do not use the same interpolation method as the actual models (reference for the actual models: timm). This PR fixes:
- the interpolation method for ConvNext
- updates the conversion script to use timm models instead of downloading checkpoints from URLs. This was necessary to overcome the AssertionError thrown from comparing the logits to the hard-coded logits from the checkpoint URL, and fixes the bug in #37461
- the ConvNext model (in timm) seems to use `IMAGENET_DEFAULT_MEAN/STD` to normalize images, but the default values set in `ConvNextImageProcessor` and `ConvNextImageProcessorFast` are `IMAGENET_STANDARD_MEAN/STD`. This has been fixed in this PR.
For the 2nd point above, the following timm models were tested to pass all assertions in the script. The following table summarizes the equivalence between the timm models and the checkpoint urls in the original script.
### Some noteworthy changes
(!) The default interpolation was also updated on `ConvNextImageProcessorFast`, but `timm_pixel_values` still do not match and raise an AssertionError. This is outside the scope of this PR.
| Checkpoint URL | TIMM model name |
| ----------- | ----------- |
| [convnext_tiny_1k_224_ema.pth](https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth) | [convnext_tiny.fb_in1k](https://huggingface.co/timm/convnext_tiny.fb_in1k) |
| [convnext_small_1k_224_ema.pth](https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224_ema.pth) | [convnext_small.fb_in1k](https://huggingface.co/timm/convnext_small.fb_in1k) |
| [convnext_base_1k_224_ema.pth](https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224_ema.pth) | [convnext_base.fb_in1k](https://huggingface.co/timm/convnext_base.fb_in1k) |
| [convnext_base_1k_384.pth](https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_384.pth) | Not available |
| [convnext_large_1k_224_ema.pth](https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_224_ema.pth) | [convnext_large.fb_in1k](https://huggingface.co/timm/convnext_large.fb_in1k) |
| [convnext_large_1k_384.pth](https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_384.pth) | Not available |
| [convnext_base_22k_224.pth](https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_224.pth) | [convnext_base.fb_in22k](https://huggingface.co/timm/convnext_base.fb_in22k) |
| [convnext_large_22k_224.pth](https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_224.pth) | [convnext_large.fb_in22k](https://huggingface.co/timm/convnext_large.fb_in22k) |
| [convnext_xlarge_22k_224.pth](https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_224.pth) | [convnext_xlarge.fb_in22k](https://huggingface.co/timm/convnext_xlarge.fb_in22k) |
| [convnext_base_22k_1k_224.pth](https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_224.pth) | [convnext_base.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_base.fb_in22k_ft_in1k) |
| [convnext_base_22k_1k_384.pth](https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_384.pth) | [convnext_base.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_base.fb_in22k_ft_in1k_384) |
| [convnext_large_22k_1k_224.pth](https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_224.pth) | [convnext_large.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_large.fb_in22k_ft_in1k) |
| [convnext_large_22k_1k_384.pth](https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_384.pth) | [convnext_large.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_large.fb_in22k_ft_in1k_384) |
| [convnext_xlarge_22k_1k_224_ema.pth](https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_224_ema.pth) | [convnext_xlarge.fb_in22k_ft_in1k](https://huggingface.co/timm/convnext_xlarge.fb_in22k_ft_in1k) |
| [convnext_xlarge_22k_1k_384_ema.pth](https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_384_ema.pth) | [convnext_xlarge.fb_in22k_ft_in1k_384](https://huggingface.co/timm/convnext_xlarge.fb_in22k_ft_in1k_384) |
The following were not tested:
- `model.save_pretrained` and `image_processor.save_pretrained`
- `model.push_to_hub`
Fixes #28180 , #37461
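As a side note on the normalization change, the two conventions mentioned above differ as follows (the values match the constants of the same names in `transformers.image_utils`):

```python
# The two ImageNet normalization conventions referenced in this PR.
IMAGENET_DEFAULT_MEAN = [0.485, 0.456, 0.406]
IMAGENET_DEFAULT_STD = [0.229, 0.224, 0.225]
IMAGENET_STANDARD_MEAN = [0.5, 0.5, 0.5]
IMAGENET_STANDARD_STD = [0.5, 0.5, 0.5]

def normalize(pixel: float, mean: float, std: float) -> float:
    return (pixel - mean) / std

# The same pixel value normalizes differently under the two conventions:
print(normalize(0.5, IMAGENET_DEFAULT_MEAN[0], IMAGENET_DEFAULT_STD[0]))
print(normalize(0.5, IMAGENET_STANDARD_MEAN[0], IMAGENET_STANDARD_STD[0]))
```

Using the wrong constants shifts every input channel, which is enough to change logits measurably.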
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@NielsRogge @amyeroberts
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37460/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37460/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37459
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37459/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37459/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37459/events
|
https://github.com/huggingface/transformers/issues/37459
| 2,989,516,220
|
I_kwDOCUB6oc6yMGW8
| 37,459
|
RuntimeError: Failed to import transformers.models.bert.modeling_bert
|
{
"login": "JaehyunsLee",
"id": 51414823,
"node_id": "MDQ6VXNlcjUxNDE0ODIz",
"avatar_url": "https://avatars.githubusercontent.com/u/51414823?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JaehyunsLee",
"html_url": "https://github.com/JaehyunsLee",
"followers_url": "https://api.github.com/users/JaehyunsLee/followers",
"following_url": "https://api.github.com/users/JaehyunsLee/following{/other_user}",
"gists_url": "https://api.github.com/users/JaehyunsLee/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JaehyunsLee/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JaehyunsLee/subscriptions",
"organizations_url": "https://api.github.com/users/JaehyunsLee/orgs",
"repos_url": "https://api.github.com/users/JaehyunsLee/repos",
"events_url": "https://api.github.com/users/JaehyunsLee/events{/privacy}",
"received_events_url": "https://api.github.com/users/JaehyunsLee/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T19:00:59
| 2025-04-14T12:59:23
| 2025-04-14T12:59:22
|
NONE
| null | null | null | null |
### System Info
sentence-transformers==3.4.1
spacy-transformers==1.2.4
transformers==4.51.2
numpy==2.0.2
pandas==2.2.3
polars==0.19.19
Python 3.9.14
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
1. Type `python` on the terminal
2. Type `from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig`
### Expected behavior
Produces the following error:
```
RuntimeError: Failed to import transformers.models.auto.modeling_auto because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
numpy.dtype size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject
```
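This class of error usually means a compiled extension was built against a different numpy C-ABI than the installed numpy. A minimal sketch for listing the relevant installed versions as a first diagnostic step (the package list is illustrative):

```python
# Sketch: list installed versions to diagnose a numpy ABI mismatch.
# The "numpy.dtype size changed" error means some compiled extension was
# built against a different numpy C-API than the one installed.
import importlib.metadata as md

def installed_version(pkg: str) -> str:
    try:
        return md.version(pkg)
    except md.PackageNotFoundError:
        return "not installed"

for pkg in ("numpy", "pandas", "spacy"):
    print(pkg, installed_version(pkg))
```

The usual fix is reinstalling the compiled packages so they are built against the numpy version actually present.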
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37459/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37459/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37458
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37458/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37458/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37458/events
|
https://github.com/huggingface/transformers/issues/37458
| 2,989,427,803
|
I_kwDOCUB6oc6yLwxb
| 37,458
|
Segmentation Fault
|
{
"login": "lordsoffallen",
"id": 20232088,
"node_id": "MDQ6VXNlcjIwMjMyMDg4",
"avatar_url": "https://avatars.githubusercontent.com/u/20232088?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lordsoffallen",
"html_url": "https://github.com/lordsoffallen",
"followers_url": "https://api.github.com/users/lordsoffallen/followers",
"following_url": "https://api.github.com/users/lordsoffallen/following{/other_user}",
"gists_url": "https://api.github.com/users/lordsoffallen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lordsoffallen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lordsoffallen/subscriptions",
"organizations_url": "https://api.github.com/users/lordsoffallen/orgs",
"repos_url": "https://api.github.com/users/lordsoffallen/repos",
"events_url": "https://api.github.com/users/lordsoffallen/events{/privacy}",
"received_events_url": "https://api.github.com/users/lordsoffallen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T18:26:41
| 2025-04-14T15:27:00
| 2025-04-14T15:26:59
|
NONE
| null | null | null | null |
### System Info
- `transformers` version: 4.48.3
- Platform: Linux-5.15.0-76-generic-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.4.5
- Accelerate version: 1.2.1
- Accelerate config: not found
- PyTorch version (GPU?): 2.5.1+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
- Using GPU in script?: yes
- GPU type: NVIDIA A100 80GB PCIe
### Who can help?
@Rocketknight1
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
```python
from transformers import pipeline

depth = pipeline(
    task="depth-estimation",
    model="depth-anything/Depth-Anything-V2-Large-hf",
)
```
Returns a segmentation fault (core dumped) error.
### Expected behavior
Expected to load the model
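As a general diagnostic for hard crashes like this (not a fix), Python's `faulthandler` can be enabled before the failing call so a traceback is printed even when the process dies with a segfault:

```python
# Sketch: enable faulthandler before the crashing call so that a Python
# traceback is dumped even on a segmentation fault.
import faulthandler

faulthandler.enable()
# ... run the failing pipeline(...) call after this point.
print(faulthandler.is_enabled())
```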
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37458/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37458/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37457
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37457/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37457/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37457/events
|
https://github.com/huggingface/transformers/pull/37457
| 2,989,157,838
|
PR_kwDOCUB6oc6SSTQd
| 37,457
|
Remove torchvision requirement from AutoImageProcessor
|
{
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T16:34:44
| 2025-04-21T12:59:35
| 2025-04-21T12:59:33
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37457",
"html_url": "https://github.com/huggingface/transformers/pull/37457",
"diff_url": "https://github.com/huggingface/transformers/pull/37457.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37457.patch",
"merged_at": "2025-04-21T12:59:33"
}
| null |
{
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37457/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37457/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37456
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37456/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37456/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37456/events
|
https://github.com/huggingface/transformers/pull/37456
| 2,988,797,976
|
PR_kwDOCUB6oc6SREJm
| 37,456
|
Update check_modular_conversion
|
{
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 7553455074,
"node_id": "LA_kwDOCUB6oc8AAAABwjiT4g",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Modular",
"name": "Modular",
"color": "3DEDD2",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T14:14:32
| 2025-07-10T18:07:59
| 2025-07-10T18:07:59
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37456",
"html_url": "https://github.com/huggingface/transformers/pull/37456",
"diff_url": "https://github.com/huggingface/transformers/pull/37456.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37456.patch",
"merged_at": "2025-07-10T18:07:59"
}
|
# What does this PR do?
- Adds multiprocessing for processing modular files (with and without the `fix_and_overwrite` flag)
- While checking, we should always overwrite files to be sure we did not miss any conversion

cc @ydshieh
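The parallel per-file check described above can be sketched roughly as follows; a thread pool is used here for simplicity, and `check_file`/`check_all` are illustrative placeholders, not the script's actual functions:

```python
from multiprocessing.pool import ThreadPool

def check_file(path):
    # Placeholder for the real per-file modular conversion check;
    # returns (path, needs_overwrite).
    return path, path.endswith("modular_foo.py")

def check_all(paths, num_workers=4):
    # Fan the checks out over a pool of workers, as the PR does
    # for the repository's modular files.
    with ThreadPool(num_workers) as pool:
        return dict(pool.map(check_file, paths))

results = check_all(["modular_foo.py", "modeling_bar.py"])
print(results)
```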
|
{
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37456/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37456/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37455
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37455/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37455/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37455/events
|
https://github.com/huggingface/transformers/pull/37455
| 2,988,788,495
|
PR_kwDOCUB6oc6SRCEM
| 37,455
|
Delete hubconf.py
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T14:11:22
| 2025-04-11T17:12:47
| 2025-04-11T17:12:45
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37455",
"html_url": "https://github.com/huggingface/transformers/pull/37455",
"diff_url": "https://github.com/huggingface/transformers/pull/37455.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37455.patch",
"merged_at": "2025-04-11T17:12:45"
}
|
Remove an old file that's no longer in use
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37455/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37455/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37454
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37454/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37454/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37454/events
|
https://github.com/huggingface/transformers/pull/37454
| 2,988,381,046
|
PR_kwDOCUB6oc6SPoR2
| 37,454
|
Add print_stack() in isin_mps_friendly() of pytorch_utils.py(#37423)
|
{
"login": "f2janyway",
"id": 55625423,
"node_id": "MDQ6VXNlcjU1NjI1NDIz",
"avatar_url": "https://avatars.githubusercontent.com/u/55625423?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/f2janyway",
"html_url": "https://github.com/f2janyway",
"followers_url": "https://api.github.com/users/f2janyway/followers",
"following_url": "https://api.github.com/users/f2janyway/following{/other_user}",
"gists_url": "https://api.github.com/users/f2janyway/gists{/gist_id}",
"starred_url": "https://api.github.com/users/f2janyway/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/f2janyway/subscriptions",
"organizations_url": "https://api.github.com/users/f2janyway/orgs",
"repos_url": "https://api.github.com/users/f2janyway/repos",
"events_url": "https://api.github.com/users/f2janyway/events{/privacy}",
"received_events_url": "https://api.github.com/users/f2janyway/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T11:24:23
| 2025-04-11T11:29:07
| 2025-04-11T11:29:07
|
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37454",
"html_url": "https://github.com/huggingface/transformers/pull/37454",
"diff_url": "https://github.com/huggingface/transformers/pull/37454.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37454.patch",
"merged_at": null
}
|
@manueldeprada
|
{
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37454/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37454/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37453
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37453/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37453/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37453/events
|
https://github.com/huggingface/transformers/pull/37453
| 2,988,367,688
|
PR_kwDOCUB6oc6SPlQS
| 37,453
|
feat: Add gradient testing for Flash Attention 2
|
{
"login": "crStiv",
"id": 189026468,
"node_id": "U_kgDOC0RQpA",
"avatar_url": "https://avatars.githubusercontent.com/u/189026468?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/crStiv",
"html_url": "https://github.com/crStiv",
"followers_url": "https://api.github.com/users/crStiv/followers",
"following_url": "https://api.github.com/users/crStiv/following{/other_user}",
"gists_url": "https://api.github.com/users/crStiv/gists{/gist_id}",
"starred_url": "https://api.github.com/users/crStiv/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/crStiv/subscriptions",
"organizations_url": "https://api.github.com/users/crStiv/orgs",
"repos_url": "https://api.github.com/users/crStiv/repos",
"events_url": "https://api.github.com/users/crStiv/events{/privacy}",
"received_events_url": "https://api.github.com/users/crStiv/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-11T11:19:27
| 2025-06-24T12:53:24
| null |
NONE
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37453",
"html_url": "https://github.com/huggingface/transformers/pull/37453",
"diff_url": "https://github.com/huggingface/transformers/pull/37453.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37453.patch",
"merged_at": null
}
|
reopen #35780
Add a test to ensure gradients computed with Flash Attention 2 match the eager implementation within specified tolerances. Key changes:
- Add test_flash_attention_2_gradients method to ModelTesterMixin
- Compare gradients between eager and FA2 implementations
- Use same model weights and seeds for reproducible results
- Test in train mode with proper gradient computation
- Follow review feedback to use config._attn_implementation
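Flash Attention 2 itself needs supported GPU hardware, but the gradient-comparison pattern can be sketched on CPU with PyTorch's `scaled_dot_product_attention` standing in for the fused kernel; the function and tensor names below are illustrative, not the PR's actual test code:

```python
import torch
import torch.nn.functional as F

def eager_attention(q, k, v):
    # Reference "eager" path: explicit scaled softmax attention.
    scale = q.shape[-1] ** -0.5
    weights = torch.softmax((q @ k.transpose(-2, -1)) * scale, dim=-1)
    return weights @ v

def fused_attention(q, k, v):
    # Fused kernel path, standing in for Flash Attention 2 in this sketch.
    return F.scaled_dot_product_attention(q, k, v)

def grads_match(atol=1e-5, rtol=1e-4):
    # Same inputs and seed for both paths, mirroring the PR's setup
    # of identical model weights for reproducible results.
    torch.manual_seed(0)
    base = torch.randn(1, 2, 8, 16)
    grads = []
    for attn_fn in (eager_attention, fused_attention):
        q = base.clone().requires_grad_(True)
        attn_fn(q, base.clone(), base.clone()).sum().backward()
        grads.append(q.grad)
    return torch.allclose(grads[0], grads[1], atol=atol, rtol=rtol)

print(grads_match())
```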
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37453/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37453/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37452
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37452/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37452/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37452/events
|
https://github.com/huggingface/transformers/pull/37452
| 2,988,220,759
|
PR_kwDOCUB6oc6SPE7m
| 37,452
|
Fix the test fetcher
|
{
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T10:13:57
| 2025-04-11T10:40:40
| 2025-04-11T10:19:27
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37452",
"html_url": "https://github.com/huggingface/transformers/pull/37452",
"diff_url": "https://github.com/huggingface/transformers/pull/37452.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37452.patch",
"merged_at": "2025-04-11T10:19:27"
}
|
With the new init approach, the list of tests to fetch grew without bound; this PR converts it to a dict and completes the list of importable objects for each key in the dict.
If a key wasn't there previously, it is added with its list of objects; otherwise, the existing list is checked for completeness and completed.
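The add-or-complete logic can be sketched as below; `merge_importables` and the module names are hypothetical, not the fetcher's actual API:

```python
def merge_importables(registry, module, objects):
    """Record `objects` under `module`, completing any partial entry."""
    if module not in registry:
        # Key not seen before: add it with its list of objects.
        registry[module] = list(objects)
    else:
        # Key already present: complete the list without duplicating entries.
        for obj in objects:
            if obj not in registry[module]:
                registry[module].append(obj)
    return registry

registry = {}
merge_importables(registry, "models.bert", ["BertModel"])
merge_importables(registry, "models.bert", ["BertModel", "BertConfig"])
print(registry)  # {'models.bert': ['BertModel', 'BertConfig']}
```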
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37452/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37452/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37451
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37451/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37451/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37451/events
|
https://github.com/huggingface/transformers/issues/37451
| 2,988,214,004
|
I_kwDOCUB6oc6yHIb0
| 37,451
|
[Llama 4] `offloaded_hybrid` fails on main w/ `torch._dynamo.exc.BackendCompilerFailed`
|
{
"login": "Vaibhavs10",
"id": 18682411,
"node_id": "MDQ6VXNlcjE4NjgyNDEx",
"avatar_url": "https://avatars.githubusercontent.com/u/18682411?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Vaibhavs10",
"html_url": "https://github.com/Vaibhavs10",
"followers_url": "https://api.github.com/users/Vaibhavs10/followers",
"following_url": "https://api.github.com/users/Vaibhavs10/following{/other_user}",
"gists_url": "https://api.github.com/users/Vaibhavs10/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Vaibhavs10/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Vaibhavs10/subscriptions",
"organizations_url": "https://api.github.com/users/Vaibhavs10/orgs",
"repos_url": "https://api.github.com/users/Vaibhavs10/repos",
"events_url": "https://api.github.com/users/Vaibhavs10/events{/privacy}",
"received_events_url": "https://api.github.com/users/Vaibhavs10/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T10:11:03
| 2025-04-14T10:00:38
| 2025-04-14T10:00:36
|
MEMBER
| null | null | null | null |
### System Info
- `transformers` version: 4.52.0.dev0
- Platform: Linux-6.8.0-1024-aws-x86_64-with-glibc2.39
- Python version: 3.11.11
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: yes
- Using GPU in script?: yes
- GPU type: NVIDIA H200
### Who can help?
@Cyrilvallez @ArthurZucker
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Here's the test script: https://gist.github.com/Vaibhavs10/f7d5bcc45587daad2645870d739a73b1
Here's the test prompt: https://huggingface.co/reach-vb/random-files/blob/main/very_long_context_prompt.txt
### Expected behavior
It should work!
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37451/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37451/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37450
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37450/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37450/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37450/events
|
https://github.com/huggingface/transformers/pull/37450
| 2,988,146,249
|
PR_kwDOCUB6oc6SO0Yz
| 37,450
|
Set should_evaluate=eval_on_start on train start
|
{
"login": "I-l-l-I",
"id": 56996119,
"node_id": "MDQ6VXNlcjU2OTk2MTE5",
"avatar_url": "https://avatars.githubusercontent.com/u/56996119?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/I-l-l-I",
"html_url": "https://github.com/I-l-l-I",
"followers_url": "https://api.github.com/users/I-l-l-I/followers",
"following_url": "https://api.github.com/users/I-l-l-I/following{/other_user}",
"gists_url": "https://api.github.com/users/I-l-l-I/gists{/gist_id}",
"starred_url": "https://api.github.com/users/I-l-l-I/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/I-l-l-I/subscriptions",
"organizations_url": "https://api.github.com/users/I-l-l-I/orgs",
"repos_url": "https://api.github.com/users/I-l-l-I/repos",
"events_url": "https://api.github.com/users/I-l-l-I/events{/privacy}",
"received_events_url": "https://api.github.com/users/I-l-l-I/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T09:47:32
| 2025-04-18T16:17:54
| 2025-04-18T16:16:58
|
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37450",
"html_url": "https://github.com/huggingface/transformers/pull/37450",
"diff_url": "https://github.com/huggingface/transformers/pull/37450.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37450.patch",
"merged_at": null
}
|
# What does this PR do?
Set `control.should_evaluate=True` when `eval_on_start=True` on train start.
### Motivation
Currently, if `eval_on_start=True`, `control.should_evaluate` stays `False` during the first evaluation, which doesn't make sense.
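The intended behavior can be sketched with stand-in dataclasses (simplified here, not the real `TrainerControl`/`TrainingArguments`):

```python
from dataclasses import dataclass

@dataclass
class TrainerControl:
    should_evaluate: bool = False

@dataclass
class TrainingArguments:
    eval_on_start: bool = False

def on_train_begin(args, control):
    # Mirror the eval_on_start flag into the control object so that
    # callbacks see should_evaluate=True during the initial evaluation.
    control.should_evaluate = args.eval_on_start
    return control

control = on_train_begin(TrainingArguments(eval_on_start=True), TrainerControl())
print(control.should_evaluate)  # True
```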
## Who can review?
@zach-huggingface and @SunMarc
|
{
"login": "I-l-l-I",
"id": 56996119,
"node_id": "MDQ6VXNlcjU2OTk2MTE5",
"avatar_url": "https://avatars.githubusercontent.com/u/56996119?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/I-l-l-I",
"html_url": "https://github.com/I-l-l-I",
"followers_url": "https://api.github.com/users/I-l-l-I/followers",
"following_url": "https://api.github.com/users/I-l-l-I/following{/other_user}",
"gists_url": "https://api.github.com/users/I-l-l-I/gists{/gist_id}",
"starred_url": "https://api.github.com/users/I-l-l-I/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/I-l-l-I/subscriptions",
"organizations_url": "https://api.github.com/users/I-l-l-I/orgs",
"repos_url": "https://api.github.com/users/I-l-l-I/repos",
"events_url": "https://api.github.com/users/I-l-l-I/events{/privacy}",
"received_events_url": "https://api.github.com/users/I-l-l-I/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37450/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37450/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37449
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37449/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37449/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37449/events
|
https://github.com/huggingface/transformers/pull/37449
| 2,988,112,009
|
PR_kwDOCUB6oc6SOtJx
| 37,449
|
Remove triton mlp kernel, not compiling for some models
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T09:33:18
| 2025-09-23T08:35:32
| 2025-04-11T10:47:13
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37449",
"html_url": "https://github.com/huggingface/transformers/pull/37449",
"diff_url": "https://github.com/huggingface/transformers/pull/37449.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37449.patch",
"merged_at": "2025-04-11T10:47:13"
}
|
# What does this PR do?
Removes the MLP Triton kernel because of some failures on the CI, and disables the use of kernels on the CI.
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37449/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37449/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37448
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37448/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37448/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37448/events
|
https://github.com/huggingface/transformers/pull/37448
| 2,988,074,326
|
PR_kwDOCUB6oc6SOlQR
| 37,448
|
Update-kernel-pin
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T09:17:37
| 2025-04-11T09:44:10
| 2025-04-11T09:19:21
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37448",
"html_url": "https://github.com/huggingface/transformers/pull/37448",
"diff_url": "https://github.com/huggingface/transformers/pull/37448.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37448.patch",
"merged_at": "2025-04-11T09:19:21"
}
|
# What does this PR do?
Make sure backward passes are fine.
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37448/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37448/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37447
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37447/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37447/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37447/events
|
https://github.com/huggingface/transformers/pull/37447
| 2,988,073,162
|
PR_kwDOCUB6oc6SOlA3
| 37,447
|
[Gemma3] compile ✨
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T09:17:07
| 2025-04-25T01:14:47
| 2025-04-18T13:55:43
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37447",
"html_url": "https://github.com/huggingface/transformers/pull/37447",
"diff_url": "https://github.com/huggingface/transformers/pull/37447.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37447.patch",
"merged_at": "2025-04-18T13:55:43"
}
|
# What does this PR do?
Enables compilation on Gemma3 (and re-enables it on Gemma2 / Cohere2).
Reverts #36620
Supersedes #37433 (solves the same problem, but this PR is much cleaner)
### Performance
Measured on an RTX4090, excluding compile warmup time:
- [Example script from gemma 3 4b](https://huggingface.co/google/gemma-3-4b-it#running-the-model-on-a-singlemulti-gpu): 3.87s on `main` (not compiled) -> 2.39s this PR
- [Example script from gemma 2 9b](https://huggingface.co/google/gemma-2-9b-it#running-the-model-on-a-single--multi-gpu): 3.67s on `main` (not compiled) -> 2.18s this PR
### Tests
- [x] slow gemma 2 tests (9 failing tests from `main` -> need to be revisited)
- [x] slow gemma 3 tests (2 failing tests from `main`, `tests/models/gemma3/test_modeling_gemma3.py::Gemma3Vision2TextModelTest::test_eager_matches_sdpa_generate` gets fixed in this PR)
### Post-mortem: How did we break compile on Gemma 2?
1. Doing `git bisect`, compilation first "breaks" in the PR where the cache is initialized in the `meta` device (https://github.com/huggingface/transformers/pull/35164). "break" here doesn't mean "crash", but rather "becomes very slow". Curiously, this change doesn't slow down `StaticCache` + `llama` (why?), so it flew under the radar when we benchmarked before merging. Nevertheless, this specific PR has been reverted (https://github.com/huggingface/transformers/pull/36543).
2. Along the way, we corrected how the sliding window attention works, by slicing the attention mask correctly (https://github.com/huggingface/transformers/pull/35681). However, the solution here is not `torch.compile` friendly: `forward` now has an `int` argument that is different at each forward pass at generation time, causing recompilation ([reference](https://pytorch.org/docs/stable/torch.compiler_troubleshooting.html#wrapping-constants-with-tensors)). The changes in this PR work around this issue.
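The recompilation trap described in item 2 can be illustrated with a hypothetical pure-Python sketch (this is not `torch.compile` itself; the cache, the `fake_compile` helper, and the keying strategy are all assumptions made for illustration). A compiler that guards on the values of plain-`int` arguments specializes once per distinct value, so an offset that changes every decoding step triggers a "recompile" every step:

```python
# Hypothetical sketch of value-specialization on Python ints, mimicking how a
# tracing compiler may guard on scalar arguments. Not the torch.compile API.
compiled_cache = {}

def fake_compile(fn, *args):
    # Key the cache on the values of plain-int arguments: each new int value
    # looks like a new "program" and forces a fresh compilation.
    key = tuple(a for a in args if isinstance(a, int))
    if key not in compiled_cache:
        compiled_cache[key] = fn  # "compile" once per distinct int signature
    return compiled_cache[key](*args)

# A sliding-window offset that changes at every generation step:
for step in range(5):
    fake_compile(lambda offset: offset * 2, step)

print(len(compiled_cache))  # -> 5: one recompilation per step
```

Wrapping such a per-step scalar in a tensor (so the guard is on shape/dtype rather than value) is the standard way around this, which is what the linked PyTorch troubleshooting reference recommends.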
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37447/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37447/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37446
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37446/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37446/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37446/events
|
https://github.com/huggingface/transformers/pull/37446
| 2,988,016,217
|
PR_kwDOCUB6oc6SOY_o
| 37,446
|
Disable kernels for quantization
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T08:54:26
| 2025-04-11T14:35:40
| 2025-04-11T14:35:39
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37446",
"html_url": "https://github.com/huggingface/transformers/pull/37446",
"diff_url": "https://github.com/huggingface/transformers/pull/37446.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37446.patch",
"merged_at": "2025-04-11T14:35:38"
}
|
# What does this PR do?
Disables kernels for quantization tests using the flag from https://github.com/huggingface/kernels/pull/70. The linear layers are different under quantization, so different kernels are needed in that case.
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37446/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37446/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37445
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37445/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37445/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37445/events
|
https://github.com/huggingface/transformers/pull/37445
| 2,987,752,077
|
PR_kwDOCUB6oc6SNgQH
| 37,445
|
remove _run_third_party_device_tests
|
{
"login": "jiqing-feng",
"id": 107918818,
"node_id": "U_kgDOBm614g",
"avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jiqing-feng",
"html_url": "https://github.com/jiqing-feng",
"followers_url": "https://api.github.com/users/jiqing-feng/followers",
"following_url": "https://api.github.com/users/jiqing-feng/following{/other_user}",
"gists_url": "https://api.github.com/users/jiqing-feng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jiqing-feng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jiqing-feng/subscriptions",
"organizations_url": "https://api.github.com/users/jiqing-feng/orgs",
"repos_url": "https://api.github.com/users/jiqing-feng/repos",
"events_url": "https://api.github.com/users/jiqing-feng/events{/privacy}",
"received_events_url": "https://api.github.com/users/jiqing-feng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T06:58:51
| 2025-07-02T05:22:21
| 2025-04-18T09:19:56
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37445",
"html_url": "https://github.com/huggingface/transformers/pull/37445",
"diff_url": "https://github.com/huggingface/transformers/pull/37445.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37445.patch",
"merged_at": "2025-04-18T09:19:56"
}
|
Hi @SunMarc. I ran the transformers tests for XPU and found that the `_run_third_party_device_tests` flag blocks me from setting XPU as the device unless I set the parameter. I cannot see any reason to keep this parameter: there is no such limitation for CUDA, so I suppose we should not have limitations for other devices like XPU.
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37445/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37445/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37444
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37444/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37444/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37444/events
|
https://github.com/huggingface/transformers/pull/37444
| 2,987,730,588
|
PR_kwDOCUB6oc6SNbn5
| 37,444
|
Fix Aria tests
|
{
"login": "jiqing-feng",
"id": 107918818,
"node_id": "U_kgDOBm614g",
"avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jiqing-feng",
"html_url": "https://github.com/jiqing-feng",
"followers_url": "https://api.github.com/users/jiqing-feng/followers",
"following_url": "https://api.github.com/users/jiqing-feng/following{/other_user}",
"gists_url": "https://api.github.com/users/jiqing-feng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jiqing-feng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jiqing-feng/subscriptions",
"organizations_url": "https://api.github.com/users/jiqing-feng/orgs",
"repos_url": "https://api.github.com/users/jiqing-feng/repos",
"events_url": "https://api.github.com/users/jiqing-feng/events{/privacy}",
"received_events_url": "https://api.github.com/users/jiqing-feng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T06:49:45
| 2025-07-02T05:22:24
| 2025-04-24T08:51:30
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37444",
"html_url": "https://github.com/huggingface/transformers/pull/37444",
"diff_url": "https://github.com/huggingface/transformers/pull/37444.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37444.patch",
"merged_at": "2025-04-24T08:51:30"
}
|
Reproduce:
`TRANSFORMERS_TEST_DEVICE=cuda RUN_SLOW=1 pytest -rA tests/models/aria/test_modeling_aria.py::AriaForConditionalGenerationIntegrationTest::test_batched_generation`
Error log:
```
FAILED tests/models/aria/test_modeling_aria.py::AriaForConditionalGenerationIntegrationTest::test_batched_generation - RuntimeError: INDICES element is out of DATA bounds, id=100352 axis_dim=100352
```
I found 4 issues in this case:
1. The pad_token_id is the same as the vocab_size, which causes an out-of-bounds index. It comes from both [config.json](https://huggingface.co/rhymes-ai/Aria/blob/main/config.json) and [added_tokens.json](https://huggingface.co/rhymes-ai/Aria/blob/main/added_tokens.json) defining a pad token.
2. The input images and input prompts should have the same batch size.
3. The input pixel values should be cast to `model.dtype`, see the [model hub usage](https://huggingface.co/rhymes-ai/Aria#inference).
4. This model uses torch MHA, which will read the weight directly and apply torch.matmul without dequantizing the weights. We should skip quantizing the MHA weight.
After fixing the 4 issues, the test can correctly run.
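The out-of-bounds failure in issue 1 can be sketched with a minimal check (the `check_special_token_ids` helper is hypothetical, written only to illustrate the indexing rule; the concrete numbers come from the error log above):

```python
def check_special_token_ids(vocab_size, special_ids):
    # An embedding table of size vocab_size accepts indices 0 .. vocab_size - 1,
    # so any special-token id >= vocab_size is out of bounds.
    return [tid for tid in special_ids if tid >= vocab_size]

# A pad_token_id of 100352 with vocab_size 100352 reproduces the
# "INDICES element is out of DATA bounds, id=100352 axis_dim=100352" condition.
print(check_special_token_ids(100352, [100352]))  # -> [100352]
```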
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37444/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37444/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37443
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37443/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37443/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37443/events
|
https://github.com/huggingface/transformers/pull/37443
| 2,987,591,784
|
PR_kwDOCUB6oc6SM9tq
| 37,443
|
[bug] deprecated deta load_cuda_kernel, MultiScaleDeformableAttention
|
{
"login": "chagmgang",
"id": 37325825,
"node_id": "MDQ6VXNlcjM3MzI1ODI1",
"avatar_url": "https://avatars.githubusercontent.com/u/37325825?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chagmgang",
"html_url": "https://github.com/chagmgang",
"followers_url": "https://api.github.com/users/chagmgang/followers",
"following_url": "https://api.github.com/users/chagmgang/following{/other_user}",
"gists_url": "https://api.github.com/users/chagmgang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chagmgang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chagmgang/subscriptions",
"organizations_url": "https://api.github.com/users/chagmgang/orgs",
"repos_url": "https://api.github.com/users/chagmgang/repos",
"events_url": "https://api.github.com/users/chagmgang/events{/privacy}",
"received_events_url": "https://api.github.com/users/chagmgang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T05:33:00
| 2025-04-14T14:44:30
| 2025-04-14T14:44:30
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37443",
"html_url": "https://github.com/huggingface/transformers/pull/37443",
"diff_url": "https://github.com/huggingface/transformers/pull/37443.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37443.patch",
"merged_at": "2025-04-14T14:44:30"
}
|
# What does this PR do?
* `load_cuda_kernel` uses an unavailable path:
https://github.com/huggingface/transformers/blob/main/src/transformers/models/deprecated/deta/modeling_deta.py#L60
* `MultiScaleDeformableAttention` is not initialized:
https://github.com/huggingface/transformers/blob/main/src/transformers/models/deprecated/deta/modeling_deta.py#L70
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37443/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37443/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37442
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37442/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37442/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37442/events
|
https://github.com/huggingface/transformers/pull/37442
| 2,987,500,791
|
PR_kwDOCUB6oc6SMqbX
| 37,442
|
Implemented update function in cache_utils.py, with a test file test_cache_utils.py
|
{
"login": "ailunc",
"id": 131329865,
"node_id": "U_kgDOB9PvSQ",
"avatar_url": "https://avatars.githubusercontent.com/u/131329865?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ailunc",
"html_url": "https://github.com/ailunc",
"followers_url": "https://api.github.com/users/ailunc/followers",
"following_url": "https://api.github.com/users/ailunc/following{/other_user}",
"gists_url": "https://api.github.com/users/ailunc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ailunc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ailunc/subscriptions",
"organizations_url": "https://api.github.com/users/ailunc/orgs",
"repos_url": "https://api.github.com/users/ailunc/repos",
"events_url": "https://api.github.com/users/ailunc/events{/privacy}",
"received_events_url": "https://api.github.com/users/ailunc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-11T04:14:44
| 2025-04-22T16:41:26
| null |
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37442",
"html_url": "https://github.com/huggingface/transformers/pull/37442",
"diff_url": "https://github.com/huggingface/transformers/pull/37442.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37442.patch",
"merged_at": null
}
|
# What does this PR do?
Implemented update function in cache_utils.py, with a test file test_cache_utils.py
Fixes https://github.com/huggingface/transformers/issues/37078
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. https://github.com/huggingface/transformers/issues/37078
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37442/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37442/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37441
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37441/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37441/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37441/events
|
https://github.com/huggingface/transformers/pull/37441
| 2,987,491,341
|
PR_kwDOCUB6oc6SMoaw
| 37,441
|
Implemented update function in cache_utils.py, with a test file test_cache_utils.py
|
{
"login": "ailunc",
"id": 131329865,
"node_id": "U_kgDOB9PvSQ",
"avatar_url": "https://avatars.githubusercontent.com/u/131329865?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ailunc",
"html_url": "https://github.com/ailunc",
"followers_url": "https://api.github.com/users/ailunc/followers",
"following_url": "https://api.github.com/users/ailunc/following{/other_user}",
"gists_url": "https://api.github.com/users/ailunc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ailunc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ailunc/subscriptions",
"organizations_url": "https://api.github.com/users/ailunc/orgs",
"repos_url": "https://api.github.com/users/ailunc/repos",
"events_url": "https://api.github.com/users/ailunc/events{/privacy}",
"received_events_url": "https://api.github.com/users/ailunc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T04:06:08
| 2025-04-11T04:13:40
| 2025-04-11T04:13:40
|
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37441",
"html_url": "https://github.com/huggingface/transformers/pull/37441",
"diff_url": "https://github.com/huggingface/transformers/pull/37441.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37441.patch",
"merged_at": null
}
|
# What does this PR do?
Implemented update function in cache_utils.py, with a test file test_cache_utils.py
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes https://github.com/huggingface/transformers/issues/37078
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. https://github.com/huggingface/transformers/issues/37078
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "ailunc",
"id": 131329865,
"node_id": "U_kgDOB9PvSQ",
"avatar_url": "https://avatars.githubusercontent.com/u/131329865?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ailunc",
"html_url": "https://github.com/ailunc",
"followers_url": "https://api.github.com/users/ailunc/followers",
"following_url": "https://api.github.com/users/ailunc/following{/other_user}",
"gists_url": "https://api.github.com/users/ailunc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ailunc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ailunc/subscriptions",
"organizations_url": "https://api.github.com/users/ailunc/orgs",
"repos_url": "https://api.github.com/users/ailunc/repos",
"events_url": "https://api.github.com/users/ailunc/events{/privacy}",
"received_events_url": "https://api.github.com/users/ailunc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37441/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37441/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37440
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37440/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37440/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37440/events
|
https://github.com/huggingface/transformers/pull/37440
| 2,987,490,336
|
PR_kwDOCUB6oc6SMoM4
| 37,440
|
Update modeling_deta.py
|
{
"login": "chagmgang",
"id": 37325825,
"node_id": "MDQ6VXNlcjM3MzI1ODI1",
"avatar_url": "https://avatars.githubusercontent.com/u/37325825?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chagmgang",
"html_url": "https://github.com/chagmgang",
"followers_url": "https://api.github.com/users/chagmgang/followers",
"following_url": "https://api.github.com/users/chagmgang/following{/other_user}",
"gists_url": "https://api.github.com/users/chagmgang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chagmgang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chagmgang/subscriptions",
"organizations_url": "https://api.github.com/users/chagmgang/orgs",
"repos_url": "https://api.github.com/users/chagmgang/repos",
"events_url": "https://api.github.com/users/chagmgang/events{/privacy}",
"received_events_url": "https://api.github.com/users/chagmgang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T04:05:18
| 2025-04-11T04:25:03
| 2025-04-11T04:25:03
|
CONTRIBUTOR
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37440",
"html_url": "https://github.com/huggingface/transformers/pull/37440",
"diff_url": "https://github.com/huggingface/transformers/pull/37440.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37440.patch",
"merged_at": null
}
|
# What does this PR do?
* The deformable DETR kernel is deprecated.
* The kernel is built in the correct location; however, `load_cuda_kernel` looks in another location.
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "chagmgang",
"id": 37325825,
"node_id": "MDQ6VXNlcjM3MzI1ODI1",
"avatar_url": "https://avatars.githubusercontent.com/u/37325825?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chagmgang",
"html_url": "https://github.com/chagmgang",
"followers_url": "https://api.github.com/users/chagmgang/followers",
"following_url": "https://api.github.com/users/chagmgang/following{/other_user}",
"gists_url": "https://api.github.com/users/chagmgang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chagmgang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chagmgang/subscriptions",
"organizations_url": "https://api.github.com/users/chagmgang/orgs",
"repos_url": "https://api.github.com/users/chagmgang/repos",
"events_url": "https://api.github.com/users/chagmgang/events{/privacy}",
"received_events_url": "https://api.github.com/users/chagmgang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37440/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37440/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37439
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37439/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37439/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37439/events
|
https://github.com/huggingface/transformers/pull/37439
| 2,987,449,632
|
PR_kwDOCUB6oc6SMf7o
| 37,439
|
Update quantization docs
|
{
"login": "DerekLiu35",
"id": 91234588,
"node_id": "MDQ6VXNlcjkxMjM0NTg4",
"avatar_url": "https://avatars.githubusercontent.com/u/91234588?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DerekLiu35",
"html_url": "https://github.com/DerekLiu35",
"followers_url": "https://api.github.com/users/DerekLiu35/followers",
"following_url": "https://api.github.com/users/DerekLiu35/following{/other_user}",
"gists_url": "https://api.github.com/users/DerekLiu35/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DerekLiu35/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DerekLiu35/subscriptions",
"organizations_url": "https://api.github.com/users/DerekLiu35/orgs",
"repos_url": "https://api.github.com/users/DerekLiu35/repos",
"events_url": "https://api.github.com/users/DerekLiu35/events{/privacy}",
"received_events_url": "https://api.github.com/users/DerekLiu35/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T03:26:02
| 2025-04-16T13:44:54
| 2025-04-16T13:44:54
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37439",
"html_url": "https://github.com/huggingface/transformers/pull/37439",
"diff_url": "https://github.com/huggingface/transformers/pull/37439.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37439.patch",
"merged_at": "2025-04-16T13:44:54"
}
|
Created draft of quantization concept guide
@SunMarc
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37439/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37439/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37438
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37438/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37438/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37438/events
|
https://github.com/huggingface/transformers/pull/37438
| 2,987,445,951
|
PR_kwDOCUB6oc6SMfKZ
| 37,438
|
Fixes: Corrects file path for CUDA kernels
|
{
"login": "DonggeunYu",
"id": 17740653,
"node_id": "MDQ6VXNlcjE3NzQwNjUz",
"avatar_url": "https://avatars.githubusercontent.com/u/17740653?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DonggeunYu",
"html_url": "https://github.com/DonggeunYu",
"followers_url": "https://api.github.com/users/DonggeunYu/followers",
"following_url": "https://api.github.com/users/DonggeunYu/following{/other_user}",
"gists_url": "https://api.github.com/users/DonggeunYu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DonggeunYu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DonggeunYu/subscriptions",
"organizations_url": "https://api.github.com/users/DonggeunYu/orgs",
"repos_url": "https://api.github.com/users/DonggeunYu/repos",
"events_url": "https://api.github.com/users/DonggeunYu/events{/privacy}",
"received_events_url": "https://api.github.com/users/DonggeunYu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T03:22:13
| 2025-04-11T08:41:47
| 2025-04-11T08:41:47
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37438",
"html_url": "https://github.com/huggingface/transformers/pull/37438",
"diff_url": "https://github.com/huggingface/transformers/pull/37438.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37438.patch",
"merged_at": "2025-04-11T08:41:47"
}
|
# What does this PR do?
Corrects the file path used to locate the CUDA kernels for the Deformable Attention module. This ensures that the kernels are loaded correctly, resolving potential errors during module initialization and usage.
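The kind of fix described here, resolving the kernel directory relative to the module file instead of relying on a hard-coded path, might be sketched as follows (the helper name and directory layout are illustrative assumptions, not the actual transformers code):

```python
from pathlib import Path


def kernel_source_dir(module_file: str) -> Path:
    """Resolve the CUDA kernel sources relative to the given module file,
    so the path stays correct wherever the package is installed.
    (Illustrative helper; not the actual transformers implementation.)"""
    return Path(module_file).resolve().parent / "custom_kernel" / "cuda"
```

Deriving the path from the module's own location avoids breakage when the package is installed in a different prefix than where it was built.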
## Who can review?
- vision models: @amyeroberts, @qubvel
|
{
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37438/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37438/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37437
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37437/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37437/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37437/events
|
https://github.com/huggingface/transformers/pull/37437
| 2,987,409,982
|
PR_kwDOCUB6oc6SMXmb
| 37,437
|
enhance require_deterministic_for_xpu
|
{
"login": "yao-matrix",
"id": 7245027,
"node_id": "MDQ6VXNlcjcyNDUwMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7245027?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yao-matrix",
"html_url": "https://github.com/yao-matrix",
"followers_url": "https://api.github.com/users/yao-matrix/followers",
"following_url": "https://api.github.com/users/yao-matrix/following{/other_user}",
"gists_url": "https://api.github.com/users/yao-matrix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yao-matrix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yao-matrix/subscriptions",
"organizations_url": "https://api.github.com/users/yao-matrix/orgs",
"repos_url": "https://api.github.com/users/yao-matrix/repos",
"events_url": "https://api.github.com/users/yao-matrix/events{/privacy}",
"received_events_url": "https://api.github.com/users/yao-matrix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T02:45:53
| 2025-04-11T06:22:22
| 2025-04-11T06:06:08
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37437",
"html_url": "https://github.com/huggingface/transformers/pull/37437",
"diff_url": "https://github.com/huggingface/transformers/pull/37437.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37437.patch",
"merged_at": "2025-04-11T06:06:08"
}
|
While revisiting PR https://github.com/huggingface/transformers/pull/30774, we opted for @ydshieh's suggestion to "still try to set it to deterministic for XPU (in the decorator body) and use it (despite you might get some failures at some points as I mentioned). You might get lucky". XPU is seeking the same level of usability and coverage as CUDA, so we need to enable these numerical cases with determinism turned on; if some still fail, we at Intel will investigate.
I enhanced the `require_deterministic_for_xpu` decorator: on XPU, entering the body saves the current deterministic state and sets it to True; on leaving, it restores the prior state. This way we don't affect the global setting, only the specific test case.
@ydshieh, please help review, thanks.
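A minimal sketch of that save/set/restore pattern, using a stand-in flag object rather than the real torch XPU state (all names here are illustrative, not the actual transformers decorator):

```python
import functools


class _FakeBackend:
    """Stand-in for the framework's global deterministic flag (illustration only)."""

    def __init__(self):
        self.deterministic = False


backend = _FakeBackend()


def require_deterministic(fn):
    """Save the deterministic flag, force it on for the wrapped call,
    and restore the prior value afterwards, even if the call raises."""

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        prior = backend.deterministic
        backend.deterministic = True
        try:
            return fn(*args, **kwargs)
        finally:
            backend.deterministic = prior  # restore the saved state
    return wrapper


@require_deterministic
def run_case():
    # Inside the decorated body, determinism is forced on.
    assert backend.deterministic is True
    return "ok"
```

The `try`/`finally` is what keeps the change case-specific: the global flag is restored no matter how the wrapped test exits.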
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37437/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37437/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37436
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37436/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37436/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37436/events
|
https://github.com/huggingface/transformers/issues/37436
| 2,987,312,512
|
I_kwDOCUB6oc6yDsWA
| 37,436
|
facebook/opt-30b Cuda Allocation Error with version >= 4.50.0 code
|
{
"login": "inf3rnus",
"id": 5959983,
"node_id": "MDQ6VXNlcjU5NTk5ODM=",
"avatar_url": "https://avatars.githubusercontent.com/u/5959983?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/inf3rnus",
"html_url": "https://github.com/inf3rnus",
"followers_url": "https://api.github.com/users/inf3rnus/followers",
"following_url": "https://api.github.com/users/inf3rnus/following{/other_user}",
"gists_url": "https://api.github.com/users/inf3rnus/gists{/gist_id}",
"starred_url": "https://api.github.com/users/inf3rnus/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/inf3rnus/subscriptions",
"organizations_url": "https://api.github.com/users/inf3rnus/orgs",
"repos_url": "https://api.github.com/users/inf3rnus/repos",
"events_url": "https://api.github.com/users/inf3rnus/events{/privacy}",
"received_events_url": "https://api.github.com/users/inf3rnus/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-11T01:21:22
| 2025-05-20T08:02:50
| 2025-05-20T08:02:50
|
CONTRIBUTOR
| null | null | null | null |
### System Info
Ubuntu 24.04 LTS
torch==2.6.0
transformers==4.51.2
AWS EC2 g4dn.metal
### General Description
Hey all,
Figure these are some growing pains with the latest release.
On an aws ec2 g4dn.metal the model loads fine, no issues with version 4.49.0
Something has broken with loading sharded models for versions >= 4.50.0 (I've tested up to the latest release, which as of this post is 4.51.2).
Note, I was observing memory utilization with `watch -n 0.1 nvidia-smi`.
If it's helpful to know, it looks like there's an allocation step performed before weights are loaded that fills each GPU with the same amount of data; I don't remember if it's 1 GB or 10 GB, but it's the same for each GPU.
The old version loads weights one by one and appears to clear the GPUs of any preallocated memory before loading all the weights.
My theory is that something is not being properly released before the weights are actually loaded.
This is with `torch==2.6.0`; I've not tested other torch versions, but I'm inclined to believe the issue is with model_utils.py and not with underlying code.
Final note, I tried accelerate versions 0.32 and 1.6.0, but that had zero impact.
Best,
Aaron
### Who can help?
@Rocketknight1
### Reproduction
To replicate, just run this code on a g4dn.metal instance.
First with `transformers==4.49.0` which should work and then with `transformers==4.51.2` which should fail.
```py
from transformers import pipeline
# on a g6.48xlarge we got 33s in serial
if __name__ == "__main__":
task = "text-generation"
model_id = "facebook/opt-30b"
pipe = pipeline(task=task, model=model_id, device_map="auto")
```
You should see that it loads with 4.49.0, and that you get a cuda OOM on 4.51.2
### Expected behavior
No OOM
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37436/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37436/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37435
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37435/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37435/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37435/events
|
https://github.com/huggingface/transformers/pull/37435
| 2,986,699,189
|
PR_kwDOCUB6oc6SJ7Ee
| 37,435
|
fix issue that some example with no trainer use accelerator.end_train…
|
{
"login": "we1559",
"id": 5505793,
"node_id": "MDQ6VXNlcjU1MDU3OTM=",
"avatar_url": "https://avatars.githubusercontent.com/u/5505793?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/we1559",
"html_url": "https://github.com/we1559",
"followers_url": "https://api.github.com/users/we1559/followers",
"following_url": "https://api.github.com/users/we1559/following{/other_user}",
"gists_url": "https://api.github.com/users/we1559/gists{/gist_id}",
"starred_url": "https://api.github.com/users/we1559/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/we1559/subscriptions",
"organizations_url": "https://api.github.com/users/we1559/orgs",
"repos_url": "https://api.github.com/users/we1559/repos",
"events_url": "https://api.github.com/users/we1559/events{/privacy}",
"received_events_url": "https://api.github.com/users/we1559/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T19:38:16
| 2025-04-18T15:59:42
| 2025-04-18T15:59:42
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37435",
"html_url": "https://github.com/huggingface/transformers/pull/37435",
"diff_url": "https://github.com/huggingface/transformers/pull/37435.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37435.patch",
"merged_at": "2025-04-18T15:59:42"
}
|
…ing in a wrong way
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes [issue #37434](https://github.com/huggingface/transformers/issues/37434#issue-2986683937)
Hi @ArthurZucker, I think you should review this PR, as it fixes an issue with the text-model examples.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37435/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37435/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37434
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37434/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37434/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37434/events
|
https://github.com/huggingface/transformers/issues/37434
| 2,986,683,937
|
I_kwDOCUB6oc6yBS4h
| 37,434
|
example with no trainer use accelerator.end_training() in a wrong way
|
{
"login": "we1559",
"id": 5505793,
"node_id": "MDQ6VXNlcjU1MDU3OTM=",
"avatar_url": "https://avatars.githubusercontent.com/u/5505793?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/we1559",
"html_url": "https://github.com/we1559",
"followers_url": "https://api.github.com/users/we1559/followers",
"following_url": "https://api.github.com/users/we1559/following{/other_user}",
"gists_url": "https://api.github.com/users/we1559/gists{/gist_id}",
"starred_url": "https://api.github.com/users/we1559/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/we1559/subscriptions",
"organizations_url": "https://api.github.com/users/we1559/orgs",
"repos_url": "https://api.github.com/users/we1559/repos",
"events_url": "https://api.github.com/users/we1559/events{/privacy}",
"received_events_url": "https://api.github.com/users/we1559/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T19:32:11
| 2025-04-18T15:59:43
| 2025-04-18T15:59:43
|
CONTRIBUTOR
| null | null | null | null |
### System Info
the main branch could reproduce
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Run an example with no trainer, such as:
```shell
export TASK_NAME=mrpc
accelerate launch --multi_gpu --num_processes 4 run_glue_no_trainer.py \
  --with_tracking \
  --model_name_or_path google-bert/bert-base-cased \
  --task_name $TASK_NAME \
  --max_length 128 \
  --per_device_train_batch_size 32 \
  --learning_rate 2e-5 \
  --num_train_epochs 3 \
  --output_dir /tmp/$TASK_NAME/
```
It will fail here:
https://github.com/huggingface/transformers/blob/9cda4265d61b0ecc276b705bd9b361a452106128/examples/pytorch/text-classification/run_glue_no_trainer.py#L641
The error message shows the process group is no longer initialized.
### Expected behavior
Actually, I found that all of the no-trainer examples use `accelerator.end_training()` in the wrong way. There are two bugs here. For example:
https://github.com/huggingface/transformers/blob/9cda4265d61b0ecc276b705bd9b361a452106128/examples/pytorch/text-classification/run_glue_no_trainer.py#L637-L646
The first bug: the example calls `end_training()` first, then tries to `wait_for_everyone()` and save the model. But as we can see here:
https://github.com/huggingface/accelerate/blob/3b89987710e4edf1e55b23d5ac208525a06a70c4/src/accelerate/accelerator.py#L2982-L3001
`end_training()` does two things:
1. It finishes all the trackers. This is what the example code intends, since it checks `with_tracking` before calling it.
2. It destroys the process group.
After the process group is destroyed, we cannot call `wait_for_everyone()` or any other code that relies on the distributed module. So `accelerator.end_training()` should be called at the end of `main()`, to ensure the process group is destroyed only after all possible distributed operations.
The second bug: `end_training()` is only called when `with_tracking` is set, but the process group should always be destroyed, with or without tracking.
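To make the intended fix concrete, here is a minimal, self-contained sketch of the corrected ordering. It uses a stub class in place of the real `accelerate.Accelerator` (the method names mirror the real API, but the bodies only record call order), so it only illustrates the point that `end_training()` must come last and be called unconditionally:

```python
# Stub standing in for accelerate.Accelerator, to illustrate call ordering only.
class StubAccelerator:
    def __init__(self):
        self.calls = []
        self.process_group_alive = True

    def wait_for_everyone(self):
        # Any collective op must happen while the process group is alive.
        assert self.process_group_alive, "process group already destroyed"
        self.calls.append("wait_for_everyone")

    def save_model(self):
        assert self.process_group_alive, "process group already destroyed"
        self.calls.append("save_model")

    def end_training(self):
        # Like the real end_training(): finish trackers AND destroy the group.
        self.calls.append("end_training")
        self.process_group_alive = False


def main(accelerator):
    # ... training loop would go here ...
    accelerator.wait_for_everyone()
    accelerator.save_model()
    # end_training() comes last, unconditionally (not only when with_tracking):
    accelerator.end_training()
    return accelerator.calls


print(main(StubAccelerator()))
# → ['wait_for_everyone', 'save_model', 'end_training']
```

With this ordering, every collective call happens while the process group is still alive, and the group is torn down exactly once, whether or not tracking is enabled.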
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37434/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37434/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37433
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37433/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37433/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37433/events
|
https://github.com/huggingface/transformers/pull/37433
| 2,986,682,231
|
PR_kwDOCUB6oc6SJ3QE
| 37,433
|
[Gemma3] compile at generation time ✨
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T19:31:11
| 2025-04-11T08:30:51
| 2025-04-11T08:30:50
|
MEMBER
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37433",
"html_url": "https://github.com/huggingface/transformers/pull/37433",
"diff_url": "https://github.com/huggingface/transformers/pull/37433.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37433.patch",
"merged_at": null
}
|
# What does this PR do?
WIP; needs code cleanup, some history, and a PR description.
Enables `torch.compile` for Gemma3 (and fixes it for other models using `HybridCache`, such as Gemma2).
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37433/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37433/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37432
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37432/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37432/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37432/events
|
https://github.com/huggingface/transformers/pull/37432
| 2,986,674,843
|
PR_kwDOCUB6oc6SJ1pg
| 37,432
|
allow accelerate prepare for torch compiled models
|
{
"login": "winglian",
"id": 381258,
"node_id": "MDQ6VXNlcjM4MTI1OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/winglian",
"html_url": "https://github.com/winglian",
"followers_url": "https://api.github.com/users/winglian/followers",
"following_url": "https://api.github.com/users/winglian/following{/other_user}",
"gists_url": "https://api.github.com/users/winglian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/winglian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/winglian/subscriptions",
"organizations_url": "https://api.github.com/users/winglian/orgs",
"repos_url": "https://api.github.com/users/winglian/repos",
"events_url": "https://api.github.com/users/winglian/events{/privacy}",
"received_events_url": "https://api.github.com/users/winglian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-10T19:27:22
| 2025-04-10T19:27:37
| null |
CONTRIBUTOR
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37432",
"html_url": "https://github.com/huggingface/transformers/pull/37432",
"diff_url": "https://github.com/huggingface/transformers/pull/37432.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37432.patch",
"merged_at": null
}
|
# What does this PR do?
FSDP2 doesn't work with torch-compiled models, since `torch.compile` leaves an `OptimizedModule` wrapper around the model.
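As a rough illustration of the wrapper issue (a hypothetical helper, not the actual PR code): `torch.compile` on a module returns an `OptimizedModule` that keeps the original module on its `_orig_mod` attribute, so code that needs the plain module, for example before FSDP wrapping, can unwrap it first:

```python
import torch

def unwrap_compiled(model: torch.nn.Module) -> torch.nn.Module:
    # torch.compile wraps modules in OptimizedModule; the original module
    # is kept on the private _orig_mod attribute. Plain modules pass through.
    return getattr(model, "_orig_mod", model)

base = torch.nn.Linear(4, 4)
compiled = torch.compile(base)  # lazy: nothing compiles until the first call
assert unwrap_compiled(compiled) is base
assert unwrap_compiled(base) is base
```

Note that `_orig_mod` is a private attribute, so this is best treated as a workaround sketch rather than a stable API.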
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@S1ro1
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37432/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37432/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37431
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37431/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37431/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37431/events
|
https://github.com/huggingface/transformers/issues/37431
| 2,986,656,024
|
I_kwDOCUB6oc6yBMEY
| 37,431
|
Trainer skipping to end of epoch
|
{
"login": "harryjulian",
"id": 103032375,
"node_id": "U_kgDOBiQmNw",
"avatar_url": "https://avatars.githubusercontent.com/u/103032375?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/harryjulian",
"html_url": "https://github.com/harryjulian",
"followers_url": "https://api.github.com/users/harryjulian/followers",
"following_url": "https://api.github.com/users/harryjulian/following{/other_user}",
"gists_url": "https://api.github.com/users/harryjulian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/harryjulian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/harryjulian/subscriptions",
"organizations_url": "https://api.github.com/users/harryjulian/orgs",
"repos_url": "https://api.github.com/users/harryjulian/repos",
"events_url": "https://api.github.com/users/harryjulian/events{/privacy}",
"received_events_url": "https://api.github.com/users/harryjulian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T19:19:16
| 2025-04-11T14:30:49
| 2025-04-11T14:28:38
|
NONE
| null | null | null | null |
### System Info
Hi there,
I'm using transformers==4.49. I'm using the Trainer with a webdataset, where I'm setting an approximate length on the webdataset for compatibility with the Trainer.
Regardless of what I do, even if I use `max_steps` rather than `num_train_epochs`, my wandb dashboard consistently shows the epoch value shooting up to the 'end of the epoch' (obviously an 'epoch' doesn't really exist in wds) very prematurely.
Is this something to be worried about? I'm concerned a lot of my data is being skipped for no reason and the dataloader is being reinitialized, starting from the beginning.
I may be able to share some example code if the answer isn't obvious.
### Who can help?
@zach-huggingface @SunMarc
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Initialise a training run with a webdataset.
### Expected behavior
Epochs should not be skipped prematurely.
|
{
"login": "harryjulian",
"id": 103032375,
"node_id": "U_kgDOBiQmNw",
"avatar_url": "https://avatars.githubusercontent.com/u/103032375?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/harryjulian",
"html_url": "https://github.com/harryjulian",
"followers_url": "https://api.github.com/users/harryjulian/followers",
"following_url": "https://api.github.com/users/harryjulian/following{/other_user}",
"gists_url": "https://api.github.com/users/harryjulian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/harryjulian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/harryjulian/subscriptions",
"organizations_url": "https://api.github.com/users/harryjulian/orgs",
"repos_url": "https://api.github.com/users/harryjulian/repos",
"events_url": "https://api.github.com/users/harryjulian/events{/privacy}",
"received_events_url": "https://api.github.com/users/harryjulian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37431/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37431/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37430
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37430/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37430/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37430/events
|
https://github.com/huggingface/transformers/pull/37430
| 2,986,569,591
|
PR_kwDOCUB6oc6SJfjJ
| 37,430
|
nit: typing use Llama4TextConfig instead of Llama4Config
|
{
"login": "kmehant",
"id": 15800200,
"node_id": "MDQ6VXNlcjE1ODAwMjAw",
"avatar_url": "https://avatars.githubusercontent.com/u/15800200?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kmehant",
"html_url": "https://github.com/kmehant",
"followers_url": "https://api.github.com/users/kmehant/followers",
"following_url": "https://api.github.com/users/kmehant/following{/other_user}",
"gists_url": "https://api.github.com/users/kmehant/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kmehant/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kmehant/subscriptions",
"organizations_url": "https://api.github.com/users/kmehant/orgs",
"repos_url": "https://api.github.com/users/kmehant/repos",
"events_url": "https://api.github.com/users/kmehant/events{/privacy}",
"received_events_url": "https://api.github.com/users/kmehant/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T18:44:22
| 2025-04-11T16:29:35
| 2025-04-11T16:29:35
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37430",
"html_url": "https://github.com/huggingface/transformers/pull/37430",
"diff_url": "https://github.com/huggingface/transformers/pull/37430.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37430.patch",
"merged_at": "2025-04-11T16:29:34"
}
|
# What does this PR do?
Nit fix: use the text config type for the annotation instead.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
- text models: @ArthurZucker
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37430/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37430/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37429
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37429/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37429/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37429/events
|
https://github.com/huggingface/transformers/pull/37429
| 2,986,545,634
|
PR_kwDOCUB6oc6SJaRN
| 37,429
|
fix document masking for chunked attention
|
{
"login": "winglian",
"id": 381258,
"node_id": "MDQ6VXNlcjM4MTI1OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/winglian",
"html_url": "https://github.com/winglian",
"followers_url": "https://api.github.com/users/winglian/followers",
"following_url": "https://api.github.com/users/winglian/following{/other_user}",
"gists_url": "https://api.github.com/users/winglian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/winglian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/winglian/subscriptions",
"organizations_url": "https://api.github.com/users/winglian/orgs",
"repos_url": "https://api.github.com/users/winglian/repos",
"events_url": "https://api.github.com/users/winglian/events{/privacy}",
"received_events_url": "https://api.github.com/users/winglian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T18:35:40
| 2025-05-09T06:22:01
| 2025-05-09T06:22:01
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37429",
"html_url": "https://github.com/huggingface/transformers/pull/37429",
"diff_url": "https://github.com/huggingface/transformers/pull/37429.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37429.patch",
"merged_at": "2025-05-09T06:22:00"
}
|
# What does this PR do?
The chunked attention was clobbering the `document_ids`, meaning the information needed to distinguish documents within the same chunk was lost. The proper way to do this is to generate a chunk mask and combine it with the existing causal mask.
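A rough, framework-agnostic sketch of the intended masking (hypothetical helper name, not the PR's actual code): build the causal, same-chunk, and same-document masks separately and AND them together, rather than overwriting the document ids:

```python
import numpy as np

def make_attention_mask(document_ids: np.ndarray, chunk_size: int) -> np.ndarray:
    """Boolean [seq, seq] mask: query i may attend key j iff j <= i (causal),
    both positions fall in the same attention chunk, and both belong to the
    same packed document."""
    seq_len = document_ids.shape[0]
    pos = np.arange(seq_len)
    causal = pos[None, :] <= pos[:, None]                      # j <= i
    same_chunk = (pos[None, :] // chunk_size) == (pos[:, None] // chunk_size)
    same_doc = document_ids[None, :] == document_ids[:, None]  # preserved, not clobbered
    return causal & same_chunk & same_doc

# Two packed documents of length 3 each, chunk size 4:
doc_ids = np.array([0, 0, 0, 1, 1, 1])
mask = make_attention_mask(doc_ids, chunk_size=4)
# Position 3 (first token of doc 1) shares chunk 0 with doc 0 tokens,
# but the same_doc term blocks attention across the document boundary:
assert not mask[3, 2]   # no attention into the previous document
assert mask[3, 3]       # can attend to itself
```

The key point is that the document mask is an independent term in the conjunction, so chunking can never erase document boundaries within a chunk.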
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@ArthurZucker @SunMarc
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37429/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37429/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37428
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37428/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37428/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37428/events
|
https://github.com/huggingface/transformers/issues/37428
| 2,986,242,010
|
I_kwDOCUB6oc6x_m_a
| 37,428
|
ImportError: cannot import name '_flash_supports_window_size' from 'transformers.modeling_flash_attention_utils'
|
{
"login": "mv2731",
"id": 60421398,
"node_id": "MDQ6VXNlcjYwNDIxMzk4",
"avatar_url": "https://avatars.githubusercontent.com/u/60421398?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mv2731",
"html_url": "https://github.com/mv2731",
"followers_url": "https://api.github.com/users/mv2731/followers",
"following_url": "https://api.github.com/users/mv2731/following{/other_user}",
"gists_url": "https://api.github.com/users/mv2731/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mv2731/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mv2731/subscriptions",
"organizations_url": "https://api.github.com/users/mv2731/orgs",
"repos_url": "https://api.github.com/users/mv2731/repos",
"events_url": "https://api.github.com/users/mv2731/events{/privacy}",
"received_events_url": "https://api.github.com/users/mv2731/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T16:30:59
| 2025-09-10T10:15:31
| 2025-05-20T08:02:52
|
NONE
| null | null | null | null |
### System Info
Hi there,
I'm using Tri Dao's flash attention and I'm running into an import error with the `transformers` library:
```
File "/g/g14/venkatraman2/glm/glm/train/training.py", line 34, in <module>
from glm.train.train_wrapper_registry import train_wrapper_registry
File "/g/g14/venkatraman2/glm/glm/train/train_wrapper_registry.py", line 1, in <module>
from .baseformer_train import BaseFormerWrapper
File "/g/g14/venkatraman2/glm/glm/train/baseformer_train.py", line 9, in <module>
from ..model.model_registry import model_registry
File "/g/g14/venkatraman2/glm/glm/model/model_registry.py", line 1, in <module>
from .esm3s import ESM3s
File "/g/g14/venkatraman2/glm/glm/model/esm3s.py", line 36, in <module>
from ring_flash_attn.ring_flash_attn_varlen import (
File "/p/vast1/OpenFoldCollab/genome_lm/envs/glm_rocm6_3_1_re/lib/python3.12/site-packages/ring_flash_attn/__init__.py", line 37, in <module>
from .adapters import (
File "/p/vast1/OpenFoldCollab/genome_lm/envs/glm_rocm6_3_1_re/lib/python3.12/site-packages/ring_flash_attn/adapters/__init__.py", line 1, in <module>
from .hf_adapter import (
File "/p/vast1/OpenFoldCollab/genome_lm/envs/glm_rocm6_3_1_re/lib/python3.12/site-packages/ring_flash_attn/adapters/hf_adapter.py", line 9, in <module>
from transformers.modeling_flash_attention_utils import (
ImportError: cannot import name '_flash_supports_window_size' from 'transformers.modeling_flash_attention_utils' (/p/vast1/OpenFoldCollab/genome_lm/envs/glm_rocm6_3_1_re/lib/python3.12/site-packages/transformers/modeling_flash_attention_utils.py)
```
Do you have any suggestions regarding how to resolve this?
Others have encountered this as well: https://github.com/Dao-AILab/flash-attention/issues/1491
Thank you!
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Steps to reproduce the behavior:
1. Import the following, after having cloned `ring_flash_attn` from the tridao repository.
```python
from ring_flash_attn.ring_flash_attn_varlen import (
    ring_flash_attn_varlen_kvpacked_func,
)
```
### Expected behavior
I would expect `_flash_supports_window_size` to be importable from `transformers.modeling_flash_attention_utils`.
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37428/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37428/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37427
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37427/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37427/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37427/events
|
https://github.com/huggingface/transformers/pull/37427
| 2,986,223,412
|
PR_kwDOCUB6oc6SIUaw
| 37,427
|
Refixing require_read_token
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T16:24:44
| 2025-04-15T08:30:28
| 2025-04-15T08:30:28
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37427",
"html_url": "https://github.com/huggingface/transformers/pull/37427",
"diff_url": "https://github.com/huggingface/transformers/pull/37427.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37427.patch",
"merged_at": null
}
|
# What does this PR do?
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37427/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37427/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37426
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37426/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37426/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37426/events
|
https://github.com/huggingface/transformers/pull/37426
| 2,986,104,804
|
PR_kwDOCUB6oc6SH6Np
| 37,426
|
Adding to self_comment_ci.yml
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T15:40:46
| 2025-04-10T16:07:03
| 2025-04-10T15:46:56
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37426",
"html_url": "https://github.com/huggingface/transformers/pull/37426",
"diff_url": "https://github.com/huggingface/transformers/pull/37426.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37426.patch",
"merged_at": "2025-04-10T15:46:56"
}
|
# What does this PR do?
As stated above
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37426/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37426/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37425
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37425/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37425/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37425/events
|
https://github.com/huggingface/transformers/pull/37425
| 2,986,007,788
|
PR_kwDOCUB6oc6SHk55
| 37,425
|
convert scale and zero to cuda when using HQQ backend
|
{
"login": "phymhan",
"id": 6815830,
"node_id": "MDQ6VXNlcjY4MTU4MzA=",
"avatar_url": "https://avatars.githubusercontent.com/u/6815830?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/phymhan",
"html_url": "https://github.com/phymhan",
"followers_url": "https://api.github.com/users/phymhan/followers",
"following_url": "https://api.github.com/users/phymhan/following{/other_user}",
"gists_url": "https://api.github.com/users/phymhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/phymhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/phymhan/subscriptions",
"organizations_url": "https://api.github.com/users/phymhan/orgs",
"repos_url": "https://api.github.com/users/phymhan/repos",
"events_url": "https://api.github.com/users/phymhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/phymhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T15:05:31
| 2025-04-16T12:13:20
| 2025-04-16T12:13:20
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37425",
"html_url": "https://github.com/huggingface/transformers/pull/37425",
"diff_url": "https://github.com/huggingface/transformers/pull/37425.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37425.patch",
"merged_at": "2025-04-16T12:13:20"
}
|
# What does this PR do?
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37425/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37425/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37424
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37424/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37424/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37424/events
|
https://github.com/huggingface/transformers/pull/37424
| 2,985,879,774
|
PR_kwDOCUB6oc6SHIrt
| 37,424
|
Add GGUF support to Gemma3 Text backbone
|
{
"login": "Isotr0py",
"id": 41363108,
"node_id": "MDQ6VXNlcjQxMzYzMTA4",
"avatar_url": "https://avatars.githubusercontent.com/u/41363108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Isotr0py",
"html_url": "https://github.com/Isotr0py",
"followers_url": "https://api.github.com/users/Isotr0py/followers",
"following_url": "https://api.github.com/users/Isotr0py/following{/other_user}",
"gists_url": "https://api.github.com/users/Isotr0py/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Isotr0py/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Isotr0py/subscriptions",
"organizations_url": "https://api.github.com/users/Isotr0py/orgs",
"repos_url": "https://api.github.com/users/Isotr0py/repos",
"events_url": "https://api.github.com/users/Isotr0py/events{/privacy}",
"received_events_url": "https://api.github.com/users/Isotr0py/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T14:25:14
| 2025-04-20T09:52:58
| 2025-04-10T15:15:44
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37424",
"html_url": "https://github.com/huggingface/transformers/pull/37424",
"diff_url": "https://github.com/huggingface/transformers/pull/37424.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37424.patch",
"merged_at": "2025-04-10T15:15:44"
}
|
# What does this PR do?
Fixes #37002
- This PR adds GGUF support to the Gemma3 text backbone.
- The ViT part is not included because the `gguf` package does not yet have interoperability with mm_proj GGUF checkpoints.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
@SunMarc @MekkCyber
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37424/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 2,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37424/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37423
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37423/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37423/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37423/events
|
https://github.com/huggingface/transformers/issues/37423
| 2,985,717,776
|
I_kwDOCUB6oc6x9nAQ
| 37,423
|
pytorch_utils.py > isin_mps_friendly > RuntimeError: Expected elements.dtype() == test_elements.dtype() to be true, but got false.
|
{
"login": "f2janyway",
"id": 55625423,
"node_id": "MDQ6VXNlcjU1NjI1NDIz",
"avatar_url": "https://avatars.githubusercontent.com/u/55625423?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/f2janyway",
"html_url": "https://github.com/f2janyway",
"followers_url": "https://api.github.com/users/f2janyway/followers",
"following_url": "https://api.github.com/users/f2janyway/following{/other_user}",
"gists_url": "https://api.github.com/users/f2janyway/gists{/gist_id}",
"starred_url": "https://api.github.com/users/f2janyway/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/f2janyway/subscriptions",
"organizations_url": "https://api.github.com/users/f2janyway/orgs",
"repos_url": "https://api.github.com/users/f2janyway/repos",
"events_url": "https://api.github.com/users/f2janyway/events{/privacy}",
"received_events_url": "https://api.github.com/users/f2janyway/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T13:33:46
| 2025-08-05T08:03:41
| 2025-08-05T08:03:41
|
NONE
| null | null | null | null |
I just followed this tutorial but got an error:
https://huggingface.co/learn/audio-course/en/chapter2/tts_pipeline
```python
from transformers import pipeline
pipe = pipeline("text-to-speech", model="suno/bark-small")
text = "Ladybugs have had important roles in culture and religion, being associated with luck, love, fertility and prophecy. "
output = pipe(text)
```
```
RuntimeError: Expected elements.dtype() == test_elements.dtype() to be true, but got false.
```
https://github.com/huggingface/transformers/blob/7ecc5b88c0328aea91a3c9f8763f56b3b1e26767/src/transformers/pytorch_utils.py#L334
So I added the code below, and it works fine. When I logged the values with the buggy code, `elements.dtype` was `int32` while `test_elements.dtype` was `int64`.
```python
if elements.dtype != test_elements.dtype:
    elements = elements.to(dtype=test_elements.dtype)
# Note: don't use named arguments in `torch.isin`, see https://github.com/pytorch/pytorch/issues/126045
return torch.isin(elements, test_elements)
```
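The same cast-before-compare idea can be illustrated with a pure-Python analogue (a sketch only; stdlib `array` typecodes stand in for tensor dtypes, and `align_and_isin` is a hypothetical name, not a transformers API):

```python
import array


def align_and_isin(elements, test_elements):
    """Pure-Python analogue of the patched isin_mps_friendly: when the two
    inputs use different element types, cast `elements` to the type of
    `test_elements` before membership testing, mirroring how torch.isin on
    MPS requires matching dtypes."""
    if elements.typecode != test_elements.typecode:
        elements = array.array(test_elements.typecode, elements)
    test_set = set(test_elements)
    return [e in test_set for e in elements]


small = array.array("i", [1, 2, 3])  # analogue of an int32 tensor
large = array.array("q", [2, 4])     # analogue of an int64 tensor
print(align_and_isin(small, large))  # [False, True, False]
```

Without the cast, a strict backend would reject the mixed-type comparison; aligning the element type first is exactly what the one-line patch above does for torch tensors.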
### my env:
- MacBook Pro M1
- conda

Output of `transformers-cli env`:
- `transformers` version: 4.51.1
- Platform: macOS-15.3.1-arm64-arm-64bit
- Python version: 3.10.12
- Huggingface_hub version: 0.30.2
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: No. In fact. I don't know. I didn't set about these.
### Who can help?
@gante
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Just run the code above.
### Expected behavior
The pipeline should run without a `RuntimeError` and return the generated audio.
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37423/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37423/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37422
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37422/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37422/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37422/events
|
https://github.com/huggingface/transformers/pull/37422
| 2,985,714,410
|
PR_kwDOCUB6oc6SGkGb
| 37,422
|
Fix require_read_token
|
{
"login": "MekkCyber",
"id": 93391238,
"node_id": "U_kgDOBZEJhg",
"avatar_url": "https://avatars.githubusercontent.com/u/93391238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MekkCyber",
"html_url": "https://github.com/MekkCyber",
"followers_url": "https://api.github.com/users/MekkCyber/followers",
"following_url": "https://api.github.com/users/MekkCyber/following{/other_user}",
"gists_url": "https://api.github.com/users/MekkCyber/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MekkCyber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MekkCyber/subscriptions",
"organizations_url": "https://api.github.com/users/MekkCyber/orgs",
"repos_url": "https://api.github.com/users/MekkCyber/repos",
"events_url": "https://api.github.com/users/MekkCyber/events{/privacy}",
"received_events_url": "https://api.github.com/users/MekkCyber/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T13:32:46
| 2025-04-10T15:01:42
| 2025-04-10T15:01:41
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37422",
"html_url": "https://github.com/huggingface/transformers/pull/37422",
"diff_url": "https://github.com/huggingface/transformers/pull/37422.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37422.patch",
"merged_at": "2025-04-10T15:01:41"
}
|
# What does this PR do?
Adds `require_read_token` to functions instead of classes.
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37422/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37422/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37420
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37420/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37420/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37420/events
|
https://github.com/huggingface/transformers/issues/37420
| 2,985,244,963
|
I_kwDOCUB6oc6x7zkj
| 37,420
|
Missing SpeechT5ForVoiceConversion class - ImportError after installation
|
{
"login": "ChanduBora",
"id": 17783003,
"node_id": "MDQ6VXNlcjE3NzgzMDAz",
"avatar_url": "https://avatars.githubusercontent.com/u/17783003?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ChanduBora",
"html_url": "https://github.com/ChanduBora",
"followers_url": "https://api.github.com/users/ChanduBora/followers",
"following_url": "https://api.github.com/users/ChanduBora/following{/other_user}",
"gists_url": "https://api.github.com/users/ChanduBora/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ChanduBora/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ChanduBora/subscriptions",
"organizations_url": "https://api.github.com/users/ChanduBora/orgs",
"repos_url": "https://api.github.com/users/ChanduBora/repos",
"events_url": "https://api.github.com/users/ChanduBora/events{/privacy}",
"received_events_url": "https://api.github.com/users/ChanduBora/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T10:43:35
| 2025-04-10T14:02:51
| 2025-04-10T14:02:49
|
NONE
| null | null | null | null |
### Feature request
from transformers import SpeechT5Processor, SpeechT5ForVoiceConversion
ImportError: cannot import name 'SpeechT5ForVoiceConversion' from 'transformers'
### Motivation
pip show transformers
Name: transformers
Version: 4.51.1
Summary: State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
Home-page: https://github.com/huggingface/transformers
Author: The Hugging Face team (past and future) with the help of all our contributors (https://github.com/huggingface/transformers/graphs/contributors)
Author-email: transformers@huggingface.co
License: Apache 2.0 License
Location: /home/cbora/Gemini_Field_310_env/lib/python3.10/site-packages
Requires: filelock, huggingface-hub, numpy, packaging, pyyaml, regex, requests, safetensors, tokenizers, tqdm
Required-by:
pip install --upgrade transformers (and the resulting version, e.g., 4.51.1)
pip install --force-reinstall transformers==4.35.0
pip install --force-reinstall --pre transformers (and the resulting pre-release version, if you noted it)
Installation from source (pip install . and the resulting dev version, e.g., 4.52.0.dev0)
### Your contribution
Upgrading/reinstalling transformers (various versions).
Installing from source.
Inspecting __init__.py files (and the fact that grep found no definition).
Checking sys.path and PYTHONPATH.
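As a quick way to confirm which names a package actually exports before filing an import error, a small stdlib-only helper can be used (a sketch; `has_export` is a hypothetical name, and `transformers` is just one module it could be pointed at):

```python
import importlib

def has_export(module_name: str, attr: str) -> bool:
    """Return True if `module_name` imports cleanly and exposes `attr`."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(module, attr)

# Example against the stdlib; with transformers installed one could check
# has_export("transformers", "SpeechT5ForVoiceConversion")
print(has_export("json", "dumps"))
```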
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37420/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37420/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37419
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37419/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37419/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37419/events
|
https://github.com/huggingface/transformers/pull/37419
| 2,985,144,824
|
PR_kwDOCUB6oc6SEloB
| 37,419
|
update `kernels` to 0.4.3
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T10:02:54
| 2025-04-10T10:30:02
| 2025-04-10T10:14:22
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37419",
"html_url": "https://github.com/huggingface/transformers/pull/37419",
"diff_url": "https://github.com/huggingface/transformers/pull/37419.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37419.patch",
"merged_at": "2025-04-10T10:14:22"
}
|
# What does this PR do?
This makes sure we can disable the kernels if needed!
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37419/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37419/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37418
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37418/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37418/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37418/events
|
https://github.com/huggingface/transformers/pull/37418
| 2,985,119,487
|
PR_kwDOCUB6oc6SEgC7
| 37,418
|
use `rms_norm_eps` for the L2Norm for Llama4
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-10T09:53:15
| 2025-04-10T11:33:52
| 2025-04-10T11:33:51
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37418",
"html_url": "https://github.com/huggingface/transformers/pull/37418",
"diff_url": "https://github.com/huggingface/transformers/pull/37418.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37418.patch",
"merged_at": "2025-04-10T11:33:51"
}
|
# What does this PR do?
It seems like this changed throughout the port; updating and will test!
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37418/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37418/timeline
| null | null | null | null | true
| true
|