| url (string) | repository_url (string) | labels_url (string) | comments_url (string) | events_url (string) | html_url (string) | id (int64) | node_id (string) | number (int64) | title (string) | user (dict) | labels (list) | state (string) | locked (bool) | assignee (dict) | assignees (list) | milestone (null) | comments (list) | created_at (timestamp[ms]) | updated_at (timestamp[ms]) | closed_at (timestamp[ms]) | author_association (string) | type (dict) | active_lock_reason (null) | draft (bool) | pull_request (dict) | body (string) | closed_by (dict) | reactions (dict) | timeline_url (string) | performed_via_github_app (null) | state_reason (string) | sub_issues_summary (dict) | issue_dependencies_summary (dict) | is_pull_request (bool) | is_closed (bool) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/huggingface/transformers/issues/37317
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37317/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37317/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37317/events
|
https://github.com/huggingface/transformers/issues/37317
| 2,974,679,850
|
I_kwDOCUB6oc6xTgMq
| 37,317
|
Llama 4 failing with mps not supported
|
{
"login": "SethBurkart123",
"id": 108050083,
"node_id": "U_kgDOBnC2ow",
"avatar_url": "https://avatars.githubusercontent.com/u/108050083?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SethBurkart123",
"html_url": "https://github.com/SethBurkart123",
"followers_url": "https://api.github.com/users/SethBurkart123/followers",
"following_url": "https://api.github.com/users/SethBurkart123/following{/other_user}",
"gists_url": "https://api.github.com/users/SethBurkart123/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SethBurkart123/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SethBurkart123/subscriptions",
"organizations_url": "https://api.github.com/users/SethBurkart123/orgs",
"repos_url": "https://api.github.com/users/SethBurkart123/repos",
"events_url": "https://api.github.com/users/SethBurkart123/events{/privacy}",
"received_events_url": "https://api.github.com/users/SethBurkart123/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-06T04:27:50
| 2025-04-06T05:59:52
| 2025-04-06T05:59:52
|
NONE
| null | null | null | null |
### System Info
I'm trying to run the new `meta-llama/Llama-4-Maverick-17B-128E-Instruct` model with transformers version 4.51.0 on my Mac M4 Max. However, I'm running into:
```
Loading checkpoint shards: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████| 55/55 [00:07<00:00, 7.65it/s]
Some parameters are on the meta device because they were offloaded to the disk.
Traceback (most recent call last):
File "/Users/admin/Documents/AI/llama4-distil/main.py", line 38, in <module>
outputs = model.generate(
^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/transformers/generation/utils.py", line 2460, in generate
result = self._sample(
^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/transformers/generation/utils.py", line 3426, in _sample
outputs = self(**model_inputs, return_dict=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/accelerate/hooks.py", line 176, in new_forward
output = module._old_forward(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/transformers/models/llama4/modeling_llama4.py", line 1761, in forward
outputs = self.language_model(
^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/transformers/models/llama4/modeling_llama4.py", line 1015, in forward
outputs = self.model(
^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/transformers/models/llama4/modeling_llama4.py", line 669, in forward
causal_mask, chunk_causal_mask = self._update_causal_mask(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/transformers/models/llama4/modeling_llama4.py", line 776, in _update_causal_mask
chunked_attention_mask = make_flex_block_causal_mask(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/transformers/integrations/flex_attention.py", line 140, in make_flex_block_causal_mask
return create_block_causal_mask_flex(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/nn/attention/flex_attention.py", line 882, in create_block_mask
return torch.compile(create_block_mask)(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/eval_frame.py", line 574, in _fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/nn/attention/flex_attention.py", line 823, in create_block_mask
def create_block_mask(
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/convert_frame.py", line 1380, in __call__
return self._torchdynamo_orig_callable(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/convert_frame.py", line 1164, in __call__
result = self._inner_convert(
^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/convert_frame.py", line 547, in __call__
return _compile(
^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/convert_frame.py", line 986, in _compile
guarded_code = compile_inner(code, one_graph, hooks, transform)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/convert_frame.py", line 715, in compile_inner
return _compile_inner(code, one_graph, hooks, transform)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_utils_internal.py", line 95, in wrapper_function
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/convert_frame.py", line 750, in _compile_inner
out_code = transform_code_object(code, transform)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/bytecode_transformation.py", line 1361, in transform_code_object
transformations(instructions, code_options)
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/convert_frame.py", line 231, in _fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/convert_frame.py", line 662, in transform
tracer.run()
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/symbolic_convert.py", line 2868, in run
super().run()
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/symbolic_convert.py", line 1052, in run
while self.step():
^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/symbolic_convert.py", line 962, in step
self.dispatch_table[inst.opcode](self, inst)
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/symbolic_convert.py", line 657, in wrapper
return handle_graph_break(self, inst, speculation.reason)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/symbolic_convert.py", line 698, in handle_graph_break
self.output.compile_subgraph(self, reason=reason)
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/output_graph.py", line 1136, in compile_subgraph
self.compile_and_call_fx_graph(
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/output_graph.py", line 1382, in compile_and_call_fx_graph
compiled_fn = self.call_user_compiler(gm)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/output_graph.py", line 1432, in call_user_compiler
return self._call_user_compiler(gm)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/output_graph.py", line 1483, in _call_user_compiler
raise BackendCompilerFailed(self.compiler_fn, e).with_traceback(
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/output_graph.py", line 1462, in _call_user_compiler
compiled_fn = compiler_fn(gm, self.example_inputs())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/repro/after_dynamo.py", line 130, in __call__
compiled_gm = compiler_fn(gm, example_inputs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/__init__.py", line 2340, in __call__
return compile_fx(model_, inputs_, config_patches=self.config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_inductor/compile_fx.py", line 1863, in compile_fx
return aot_autograd(
^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/backends/common.py", line 83, in __call__
cg = aot_module_simplified(gm, example_inputs, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_functorch/aot_autograd.py", line 1155, in aot_module_simplified
compiled_fn = dispatch_and_compile()
^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_functorch/aot_autograd.py", line 1131, in dispatch_and_compile
compiled_fn, _ = create_aot_dispatcher_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_functorch/aot_autograd.py", line 580, in create_aot_dispatcher_function
return _create_aot_dispatcher_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_functorch/aot_autograd.py", line 830, in _create_aot_dispatcher_function
compiled_fn, fw_metadata = compiler_fn(
^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_functorch/_aot_autograd/jit_compile_runtime_wrappers.py", line 203, in aot_dispatch_base
compiled_fw = compiler(fw_module, updated_flat_args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_functorch/aot_autograd.py", line 489, in __call__
return self.compiler_fn(gm, example_inputs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_inductor/compile_fx.py", line 1741, in fw_compiler_base
return inner_compile(
^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_inductor/compile_fx.py", line 569, in compile_fx_inner
return wrap_compiler_debug(_compile_fx_inner, compiler_name="inductor")(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_dynamo/repro/after_aot.py", line 102, in debug_wrapper
inner_compiled_fn = compiler_fn(gm, example_inputs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_inductor/compile_fx.py", line 685, in _compile_fx_inner
mb_compiled_graph = fx_codegen_and_compile(
^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_inductor/compile_fx.py", line 1129, in fx_codegen_and_compile
return scheme.codegen_and_compile(gm, example_inputs, inputs_to_check, graph_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_inductor/compile_fx.py", line 1044, in codegen_and_compile
compiled_fn = graph.compile_to_module().call
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_inductor/graph.py", line 2027, in compile_to_module
return self._compile_to_module()
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_inductor/graph.py", line 2033, in _compile_to_module
self.codegen_with_cpp_wrapper() if self.cpp_wrapper else self.codegen()
^^^^^^^^^^^^^^
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_inductor/graph.py", line 1962, in codegen
self.init_wrapper_code()
File "/Users/admin/Documents/AI/llama4-distil/.venv/lib/python3.11/site-packages/torch/_inductor/graph.py", line 1845, in init_wrapper_code
wrapper_code_gen_cls is not None
torch._dynamo.exc.BackendCompilerFailed: backend='inductor' raised:
AssertionError: Device mps not supported
Set TORCH_LOGS="+dynamo" and TORCHDYNAMO_VERBOSE=1 for more information
You can suppress this exception and fall back to eager by setting:
import torch._dynamo
torch._dynamo.config.suppress_errors = True
```
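The tail of the traceback shows where this fails: `flex_attention` builds its block-causal mask through `torch.compile`, and the Inductor backend then asserts `Device mps not supported`. As a hypothetical workaround sketch (the helper and the set of Inductor-supported devices below are my assumptions, not a transformers API), one could select the attention implementation based on the device before calling `from_pretrained`:

```python
# Hypothetical helper, not part of transformers: avoid the
# torch.compile/Inductor path on devices Inductor cannot lower
# (as of torch 2.6 this includes "mps").
def pick_attn_implementation(device_type: str, preferred: str = "flex_attention") -> str:
    # Assumption: Inductor handles these device types in this setup.
    inductor_devices = {"cuda", "cpu"}
    if preferred == "flex_attention" and device_type not in inductor_devices:
        # sdpa runs eagerly and does not require compilation
        return "sdpa"
    return preferred

print(pick_attn_implementation("mps"))   # -> sdpa
print(pick_attn_implementation("cuda"))  # -> flex_attention
```

The returned string would then be passed as `attn_implementation=` to `from_pretrained`. Alternatively, the error message's own suggestion (`torch._dynamo.config.suppress_errors = True`) falls back to eager execution instead of failing.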
### Who can help?
_No response_
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
I'm just running it with the sample code here:
```python
from transformers import AutoProcessor, Llama4ForConditionalGeneration
import torch
model_id = "meta-llama/Llama-4-Maverick-17B-128E-Instruct"
processor = AutoProcessor.from_pretrained(model_id)
model = Llama4ForConditionalGeneration.from_pretrained(
model_id,
attn_implementation="flex_attention",
device_map="auto",
torch_dtype=torch.bfloat16,
)
url1 = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/0052a70beed5bf71b92610a43a52df6d286cd5f3/diffusers/rabbit.jpg"
url2 = "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/datasets/cat_style_layout.png"
messages = [
{
"role": "user",
"content": [
{"type": "image", "url": url1},
{"type": "image", "url": url2},
{"type": "text", "text": "Can you describe how these two images are similar, and how they differ?"},
]
},
]
inputs = processor.apply_chat_template(
messages,
add_generation_prompt=True,
tokenize=True,
return_dict=True,
return_tensors="pt",
).to(model.device)
outputs = model.generate(
**inputs,
max_new_tokens=256,
)
response = processor.batch_decode(outputs[:, inputs["input_ids"].shape[-1]:])[0]
print(response)
print(outputs[0])
```
Here's my installed dependencies with uv:
```
Package Version
------------------ ---------
accelerate 1.6.0
certifi 2025.1.31
charset-normalizer 3.4.1
filelock 3.18.0
fsspec 2025.3.2
huggingface-hub 0.30.1
idna 3.10
jinja2 3.1.6
markupsafe 3.0.2
mpmath 1.3.0
networkx 3.4.2
numpy 2.2.4
packaging 24.2
pillow 11.1.0
psutil 7.0.0
pyyaml 6.0.2
regex 2024.11.6
requests 2.32.3
safetensors 0.5.3
sympy 1.13.1
tokenizers 0.21.1
torch 2.6.0
torchvision 0.21.0
tqdm 4.67.1
transformers 4.51.0
typing-extensions 4.13.1
urllib3 2.3.0
```
### Expected behavior
The model should run and generate a response.
|
{
"login": "SethBurkart123",
"id": 108050083,
"node_id": "U_kgDOBnC2ow",
"avatar_url": "https://avatars.githubusercontent.com/u/108050083?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SethBurkart123",
"html_url": "https://github.com/SethBurkart123",
"followers_url": "https://api.github.com/users/SethBurkart123/followers",
"following_url": "https://api.github.com/users/SethBurkart123/following{/other_user}",
"gists_url": "https://api.github.com/users/SethBurkart123/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SethBurkart123/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SethBurkart123/subscriptions",
"organizations_url": "https://api.github.com/users/SethBurkart123/orgs",
"repos_url": "https://api.github.com/users/SethBurkart123/repos",
"events_url": "https://api.github.com/users/SethBurkart123/events{/privacy}",
"received_events_url": "https://api.github.com/users/SethBurkart123/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37317/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37317/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37316
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37316/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37316/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37316/events
|
https://github.com/huggingface/transformers/issues/37316
| 2,974,622,499
|
I_kwDOCUB6oc6xTSMj
| 37,316
|
Lama4 scout. Any chance it could ever be in the browser?
|
{
"login": "hpssjellis",
"id": 5605614,
"node_id": "MDQ6VXNlcjU2MDU2MTQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/5605614?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hpssjellis",
"html_url": "https://github.com/hpssjellis",
"followers_url": "https://api.github.com/users/hpssjellis/followers",
"following_url": "https://api.github.com/users/hpssjellis/following{/other_user}",
"gists_url": "https://api.github.com/users/hpssjellis/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hpssjellis/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hpssjellis/subscriptions",
"organizations_url": "https://api.github.com/users/hpssjellis/orgs",
"repos_url": "https://api.github.com/users/hpssjellis/repos",
"events_url": "https://api.github.com/users/hpssjellis/events{/privacy}",
"received_events_url": "https://api.github.com/users/hpssjellis/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] |
open
| false
| null |
[] | null |
[] | 2025-04-06T01:58:14
| 2025-04-06T01:58:14
| null |
NONE
| null | null | null | null |
### Feature request
@xenova can the new 17B Llama 4 Scout be loaded in the browser, or is it much too big?
Here is the Hugging Face collection:
https://huggingface.co/collections/meta-llama/llama-4-67f0c30d9fe03840bc9d0164
### Motivation
Just a new model that might be able to load in the browser, but probably not
### Your contribution
I would make it more user-friendly if it works.
I've already done a bunch:
https://hpssjellis.github.io/my-examples-of-transformersJS/public/index.html
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37316/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37316/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37315
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37315/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37315/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37315/events
|
https://github.com/huggingface/transformers/pull/37315
| 2,974,587,225
|
PR_kwDOCUB6oc6RhMRZ
| 37,315
|
Revert deepspeed z3 regressions
|
{
"login": "winglian",
"id": 381258,
"node_id": "MDQ6VXNlcjM4MTI1OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/winglian",
"html_url": "https://github.com/winglian",
"followers_url": "https://api.github.com/users/winglian/followers",
"following_url": "https://api.github.com/users/winglian/following{/other_user}",
"gists_url": "https://api.github.com/users/winglian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/winglian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/winglian/subscriptions",
"organizations_url": "https://api.github.com/users/winglian/orgs",
"repos_url": "https://api.github.com/users/winglian/repos",
"events_url": "https://api.github.com/users/winglian/events{/privacy}",
"received_events_url": "https://api.github.com/users/winglian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-06T00:28:30
| 2025-04-06T19:04:30
| 2025-04-06T19:04:30
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37315",
"html_url": "https://github.com/huggingface/transformers/pull/37315",
"diff_url": "https://github.com/huggingface/transformers/pull/37315.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37315.patch",
"merged_at": null
}
|
# What does this PR do?
#36963, #37281, and #37306 cause regressions when training with DeepSpeed ZeRO-3. See our Axolotl integration tests on the latest 4.51.0 release, which includes these commits/PRs; they all fail with ZeRO-3: https://github.com/axolotl-ai-cloud/axolotl/actions/runs/14286223137/job/40041643515
@LysandreJik [seems to agree that these deepspeed changes should be reverted](https://github.com/huggingface/transformers/pull/37281#issuecomment-2780872520)
/cc @stas00
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@LysandreJik @SunMarc @zach-huggingface
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "winglian",
"id": 381258,
"node_id": "MDQ6VXNlcjM4MTI1OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/winglian",
"html_url": "https://github.com/winglian",
"followers_url": "https://api.github.com/users/winglian/followers",
"following_url": "https://api.github.com/users/winglian/following{/other_user}",
"gists_url": "https://api.github.com/users/winglian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/winglian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/winglian/subscriptions",
"organizations_url": "https://api.github.com/users/winglian/orgs",
"repos_url": "https://api.github.com/users/winglian/repos",
"events_url": "https://api.github.com/users/winglian/events{/privacy}",
"received_events_url": "https://api.github.com/users/winglian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37315/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37315/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37314
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37314/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37314/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37314/events
|
https://github.com/huggingface/transformers/issues/37314
| 2,974,584,361
|
I_kwDOCUB6oc6xTI4p
| 37,314
|
OSError: meta-llama/Llama-4-Scout-17B-16E-Instruct does not appear to have a file named X
|
{
"login": "sam-h-bean",
"id": 43734688,
"node_id": "MDQ6VXNlcjQzNzM0Njg4",
"avatar_url": "https://avatars.githubusercontent.com/u/43734688?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sam-h-bean",
"html_url": "https://github.com/sam-h-bean",
"followers_url": "https://api.github.com/users/sam-h-bean/followers",
"following_url": "https://api.github.com/users/sam-h-bean/following{/other_user}",
"gists_url": "https://api.github.com/users/sam-h-bean/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sam-h-bean/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sam-h-bean/subscriptions",
"organizations_url": "https://api.github.com/users/sam-h-bean/orgs",
"repos_url": "https://api.github.com/users/sam-h-bean/repos",
"events_url": "https://api.github.com/users/sam-h-bean/events{/privacy}",
"received_events_url": "https://api.github.com/users/sam-h-bean/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-06T00:21:47
| 2025-05-14T08:02:41
| 2025-05-14T08:02:41
|
CONTRIBUTOR
| null | null | null | null |
### System Info
I have transformers 4.51.0 and am trying to load the Llama 4 Scout model for training. I downloaded the safetensors files to disk and am pointing to that location with the `cache_dir` arg. Each time I load the model with `AutoModelForCausalLM` I get a different error. On one run it is
```bash
OSError: meta-llama/Llama-4-Scout-17B-16E-Instruct does not appear to have a file named model-00008-of-00050.safetensors. Checkout 'https://huggingface.co/meta-llama/Llama-4-Scout-17B-16E-Instruct/tree/main'for available files.
```
and the next run it is
```bash
OSError: meta-llama/Llama-4-Scout-17B-16E-Instruct does not appear to have files named ('model-00002-of-00050.safetensors', 'model-00003-of-00050.safetensors', 'model-00004-of-00050.safetensors', 'model-00005-of-00050.safetensors', 'model-00006-of-00050.safetensors', 'model-00007-of-00050.safetensors', 'model-00008-of-00050.safetensors', 'model-00011-of-00050.safetensors', 'model-00012-of-00050.safetensors', 'model-00014-of-00050.safetensors', 'model-00016-of-00050.safetensors', 'model-00017-of-00050.safetensors'). Checkout 'https://huggingface.co/meta-llama/Llama-4-Scout-17B-16E-Instruct/tree/main'for available files.
```
The set of reportedly missing files changes each time I run the training script, so I'm wondering if there is some error in loading from the cache with this many files.
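Since the set of "missing" shards varies from run to run, one way to narrow this down is to compare the shards listed in the checkpoint index against the files actually present on disk. A minimal sketch, assuming the standard sharded-checkpoint layout with a `model.safetensors.index.json` next to the shards (the helper itself is hypothetical, not a transformers utility):

```python
import json
import pathlib

def missing_shards(snapshot_dir: str) -> list[str]:
    """Return shard filenames listed in the index but absent on disk."""
    snap = pathlib.Path(snapshot_dir)
    index = json.loads((snap / "model.safetensors.index.json").read_text())
    # weight_map maps tensor name -> shard filename; the values are the
    # complete set of shard files the loader expects to find.
    expected = set(index["weight_map"].values())
    present = {p.name for p in snap.glob("*.safetensors")}
    return sorted(expected - present)
```

Running this between failing attempts would show whether shard files are actually disappearing from the cache directory or the loader is misreporting files that exist.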
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Load the model with
```python
model = AutoModelForCausalLM.from_pretrained(
"meta-llama/Llama-4-Scout-17B-16E-Instruct",
torch_dtype=torch.bfloat16,
token=HF_TOKEN,
attn_implementation="flash_attention_2",
)
```
### Expected behavior
Model is loaded into memory
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37314/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37314/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37313
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37313/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37313/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37313/events
|
https://github.com/huggingface/transformers/pull/37313
| 2,974,583,759
|
PR_kwDOCUB6oc6RhLph
| 37,313
|
Feat add hive model
|
{
"login": "dustinwloring1988",
"id": 21135165,
"node_id": "MDQ6VXNlcjIxMTM1MTY1",
"avatar_url": "https://avatars.githubusercontent.com/u/21135165?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dustinwloring1988",
"html_url": "https://github.com/dustinwloring1988",
"followers_url": "https://api.github.com/users/dustinwloring1988/followers",
"following_url": "https://api.github.com/users/dustinwloring1988/following{/other_user}",
"gists_url": "https://api.github.com/users/dustinwloring1988/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dustinwloring1988/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dustinwloring1988/subscriptions",
"organizations_url": "https://api.github.com/users/dustinwloring1988/orgs",
"repos_url": "https://api.github.com/users/dustinwloring1988/repos",
"events_url": "https://api.github.com/users/dustinwloring1988/events{/privacy}",
"received_events_url": "https://api.github.com/users/dustinwloring1988/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-06T00:20:25
| 2025-04-06T19:31:17
| 2025-04-06T19:31:17
|
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37313",
"html_url": "https://github.com/huggingface/transformers/pull/37313",
"diff_url": "https://github.com/huggingface/transformers/pull/37313.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37313.patch",
"merged_at": null
}
|
Added the Hive model, a model trained with crowd-sourced GPU compute.
|
{
"login": "dustinwloring1988",
"id": 21135165,
"node_id": "MDQ6VXNlcjIxMTM1MTY1",
"avatar_url": "https://avatars.githubusercontent.com/u/21135165?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dustinwloring1988",
"html_url": "https://github.com/dustinwloring1988",
"followers_url": "https://api.github.com/users/dustinwloring1988/followers",
"following_url": "https://api.github.com/users/dustinwloring1988/following{/other_user}",
"gists_url": "https://api.github.com/users/dustinwloring1988/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dustinwloring1988/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dustinwloring1988/subscriptions",
"organizations_url": "https://api.github.com/users/dustinwloring1988/orgs",
"repos_url": "https://api.github.com/users/dustinwloring1988/repos",
"events_url": "https://api.github.com/users/dustinwloring1988/events{/privacy}",
"received_events_url": "https://api.github.com/users/dustinwloring1988/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37313/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37313/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37312
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37312/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37312/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37312/events
|
https://github.com/huggingface/transformers/pull/37312
| 2,974,563,520
|
PR_kwDOCUB6oc6RhIjJ
| 37,312
|
Add `segmentation_maps` support to MobileNetV2ImageProcessor
|
{
"login": "simonreise",
"id": 43753582,
"node_id": "MDQ6VXNlcjQzNzUzNTgy",
"avatar_url": "https://avatars.githubusercontent.com/u/43753582?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/simonreise",
"html_url": "https://github.com/simonreise",
"followers_url": "https://api.github.com/users/simonreise/followers",
"following_url": "https://api.github.com/users/simonreise/following{/other_user}",
"gists_url": "https://api.github.com/users/simonreise/gists{/gist_id}",
"starred_url": "https://api.github.com/users/simonreise/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/simonreise/subscriptions",
"organizations_url": "https://api.github.com/users/simonreise/orgs",
"repos_url": "https://api.github.com/users/simonreise/repos",
"events_url": "https://api.github.com/users/simonreise/events{/privacy}",
"received_events_url": "https://api.github.com/users/simonreise/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-05T23:29:27
| 2025-07-07T18:09:54
| 2025-07-07T17:34:59
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37312",
"html_url": "https://github.com/huggingface/transformers/pull/37312",
"diff_url": "https://github.com/huggingface/transformers/pull/37312.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37312.patch",
"merged_at": "2025-07-07T17:34:59"
}
|
MobileNetV2 can perform semantic segmentation, but its image processor cannot handle `segmentation_maps`.
This PR adds `segmentation_maps` support to MobileNetV2ImageProcessor. It also adds `reduce_labels` support and restructures the old `preprocess` function into separate `_preprocess`, `_preprocess_image`, and `_preprocess_mask` functions, as is done e.g. in the SegFormer image processor.
This PR also adds `reduce_labels` support to MobileViTImageProcessor to keep it consistent with the other image processors.
I also added the required tests.
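For context, the `reduce_labels` semantics follow the convention used by other segmentation image processors: background (class 0) becomes the ignore index 255, and all other class ids shift down by one. A minimal pure-Python sketch of that mapping (illustrative only, not the actual numpy-based implementation in the PR):

```python
def reduce_labels(mask):
    """Apply `do_reduce_labels` semantics to a 2D segmentation mask:
    background (0) becomes the ignore index 255, and every other
    class id is shifted down by one."""
    return [[255 if v == 0 else v - 1 for v in row] for row in mask]
```

For example, `reduce_labels([[0, 1, 2]])` returns `[[255, 0, 1]]`.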
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@amyeroberts, @qubvel
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37312/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37312/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37311
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37311/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37311/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37311/events
|
https://github.com/huggingface/transformers/issues/37311
| 2,974,547,862
|
I_kwDOCUB6oc6xS_-W
| 37,311
|
AutoModel.from_pretrained without accelerate raises a NameError because `init_empty_weights` is not available
|
{
"login": "LoicGrobol",
"id": 14248012,
"node_id": "MDQ6VXNlcjE0MjQ4MDEy",
"avatar_url": "https://avatars.githubusercontent.com/u/14248012?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LoicGrobol",
"html_url": "https://github.com/LoicGrobol",
"followers_url": "https://api.github.com/users/LoicGrobol/followers",
"following_url": "https://api.github.com/users/LoicGrobol/following{/other_user}",
"gists_url": "https://api.github.com/users/LoicGrobol/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LoicGrobol/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LoicGrobol/subscriptions",
"organizations_url": "https://api.github.com/users/LoicGrobol/orgs",
"repos_url": "https://api.github.com/users/LoicGrobol/repos",
"events_url": "https://api.github.com/users/LoicGrobol/events{/privacy}",
"received_events_url": "https://api.github.com/users/LoicGrobol/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-05T22:48:37
| 2025-04-07T21:12:22
| 2025-04-07T21:12:02
|
NONE
| null | null | null | null |
[d1b92369ca193](https://github.com/huggingface/transformers/blame/d1b92369ca193da49f9f7ecd01b08ece45c2c9aa/src/transformers/modeling_utils.py#L3736) introduced a call to `init_empty_weights` that leads to an error if accelerate is not installed. This should be guarded by a condition that accelerate is available (one already guards the import, but not the actual call).
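The guard pattern being asked for can be sketched as follows: gate both the import and the call on availability, and fall back to a no-op context when accelerate is missing. This is an illustrative sketch only (the function names here are hypothetical, not the actual transformers code):

```python
import importlib.util
from contextlib import nullcontext


def is_accelerate_available():
    # Availability check in the style transformers uses for optional deps
    return importlib.util.find_spec("accelerate") is not None


def empty_weights_context():
    """Return accelerate's init_empty_weights context manager when
    accelerate is installed, otherwise a no-op context so that model
    loading still proceeds without it."""
    if is_accelerate_available():
        from accelerate import init_empty_weights
        return init_empty_weights()
    return nullcontext()
```

With this shape, `with empty_weights_context(): ...` never raises a `NameError`, regardless of whether accelerate is installed.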
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37311/reactions",
"total_count": 15,
"+1": 15,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37311/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37310
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37310/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37310/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37310/events
|
https://github.com/huggingface/transformers/pull/37310
| 2,974,524,067
|
PR_kwDOCUB6oc6RhCgO
| 37,310
|
feat: added hive model
|
{
"login": "dustinwloring1988",
"id": 21135165,
"node_id": "MDQ6VXNlcjIxMTM1MTY1",
"avatar_url": "https://avatars.githubusercontent.com/u/21135165?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dustinwloring1988",
"html_url": "https://github.com/dustinwloring1988",
"followers_url": "https://api.github.com/users/dustinwloring1988/followers",
"following_url": "https://api.github.com/users/dustinwloring1988/following{/other_user}",
"gists_url": "https://api.github.com/users/dustinwloring1988/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dustinwloring1988/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dustinwloring1988/subscriptions",
"organizations_url": "https://api.github.com/users/dustinwloring1988/orgs",
"repos_url": "https://api.github.com/users/dustinwloring1988/repos",
"events_url": "https://api.github.com/users/dustinwloring1988/events{/privacy}",
"received_events_url": "https://api.github.com/users/dustinwloring1988/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-05T21:53:06
| 2025-04-06T00:04:00
| 2025-04-06T00:04:00
|
NONE
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37310",
"html_url": "https://github.com/huggingface/transformers/pull/37310",
"diff_url": "https://github.com/huggingface/transformers/pull/37310.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37310.patch",
"merged_at": null
}
|
Added a new model that I am working on. It will be trained on crowd-sourced GPU compute.
|
{
"login": "dustinwloring1988",
"id": 21135165,
"node_id": "MDQ6VXNlcjIxMTM1MTY1",
"avatar_url": "https://avatars.githubusercontent.com/u/21135165?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dustinwloring1988",
"html_url": "https://github.com/dustinwloring1988",
"followers_url": "https://api.github.com/users/dustinwloring1988/followers",
"following_url": "https://api.github.com/users/dustinwloring1988/following{/other_user}",
"gists_url": "https://api.github.com/users/dustinwloring1988/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dustinwloring1988/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dustinwloring1988/subscriptions",
"organizations_url": "https://api.github.com/users/dustinwloring1988/orgs",
"repos_url": "https://api.github.com/users/dustinwloring1988/repos",
"events_url": "https://api.github.com/users/dustinwloring1988/events{/privacy}",
"received_events_url": "https://api.github.com/users/dustinwloring1988/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37310/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37310/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37309
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37309/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37309/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37309/events
|
https://github.com/huggingface/transformers/pull/37309
| 2,974,460,318
|
PR_kwDOCUB6oc6Rg40P
| 37,309
|
Refactor ColPali model documentation
|
{
"login": "Soum-Soum",
"id": 24224767,
"node_id": "MDQ6VXNlcjI0MjI0NzY3",
"avatar_url": "https://avatars.githubusercontent.com/u/24224767?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Soum-Soum",
"html_url": "https://github.com/Soum-Soum",
"followers_url": "https://api.github.com/users/Soum-Soum/followers",
"following_url": "https://api.github.com/users/Soum-Soum/following{/other_user}",
"gists_url": "https://api.github.com/users/Soum-Soum/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Soum-Soum/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Soum-Soum/subscriptions",
"organizations_url": "https://api.github.com/users/Soum-Soum/orgs",
"repos_url": "https://api.github.com/users/Soum-Soum/repos",
"events_url": "https://api.github.com/users/Soum-Soum/events{/privacy}",
"received_events_url": "https://api.github.com/users/Soum-Soum/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-05T19:45:20
| 2025-04-15T20:52:11
| 2025-04-15T20:52:11
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37309",
"html_url": "https://github.com/huggingface/transformers/pull/37309",
"diff_url": "https://github.com/huggingface/transformers/pull/37309.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37309.patch",
"merged_at": "2025-04-15T20:52:11"
}
|
# What does this PR do?
As suggested in issue #36979, this PR updates the ColPali model documentation to align it with the standardized format used across the docs.
## Who can review?
@stevhliu, please let me know if any changes are needed.
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37309/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37309/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37308
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37308/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37308/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37308/events
|
https://github.com/huggingface/transformers/pull/37308
| 2,974,448,285
|
PR_kwDOCUB6oc6Rg2wk
| 37,308
|
Update ColPali model doc
|
{
"login": "Soum-Soum",
"id": 24224767,
"node_id": "MDQ6VXNlcjI0MjI0NzY3",
"avatar_url": "https://avatars.githubusercontent.com/u/24224767?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Soum-Soum",
"html_url": "https://github.com/Soum-Soum",
"followers_url": "https://api.github.com/users/Soum-Soum/followers",
"following_url": "https://api.github.com/users/Soum-Soum/following{/other_user}",
"gists_url": "https://api.github.com/users/Soum-Soum/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Soum-Soum/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Soum-Soum/subscriptions",
"organizations_url": "https://api.github.com/users/Soum-Soum/orgs",
"repos_url": "https://api.github.com/users/Soum-Soum/repos",
"events_url": "https://api.github.com/users/Soum-Soum/events{/privacy}",
"received_events_url": "https://api.github.com/users/Soum-Soum/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-05T19:20:28
| 2025-04-05T19:34:24
| 2025-04-05T19:34:23
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37308",
"html_url": "https://github.com/huggingface/transformers/pull/37308",
"diff_url": "https://github.com/huggingface/transformers/pull/37308.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37308.patch",
"merged_at": null
}
|
# What does this PR do?
As suggested in https://github.com/huggingface/transformers/issues/36979#issue-2947704577, this PR updates the ColPali model documentation to align it with the standardized format used across the docs.
## Who can review?
@stevhliu, please let me know if any changes are needed.
|
{
"login": "Soum-Soum",
"id": 24224767,
"node_id": "MDQ6VXNlcjI0MjI0NzY3",
"avatar_url": "https://avatars.githubusercontent.com/u/24224767?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Soum-Soum",
"html_url": "https://github.com/Soum-Soum",
"followers_url": "https://api.github.com/users/Soum-Soum/followers",
"following_url": "https://api.github.com/users/Soum-Soum/following{/other_user}",
"gists_url": "https://api.github.com/users/Soum-Soum/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Soum-Soum/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Soum-Soum/subscriptions",
"organizations_url": "https://api.github.com/users/Soum-Soum/orgs",
"repos_url": "https://api.github.com/users/Soum-Soum/repos",
"events_url": "https://api.github.com/users/Soum-Soum/events{/privacy}",
"received_events_url": "https://api.github.com/users/Soum-Soum/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37308/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37308/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37307
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37307/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37307/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37307/events
|
https://github.com/huggingface/transformers/pull/37307
| 2,974,441,091
|
PR_kwDOCUB6oc6Rg1lg
| 37,307
|
Add llama4
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
},
{
"id": 2760822153,
"node_id": "MDU6TGFiZWwyNzYwODIyMTUz",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Tensor%20Parallel",
"name": "Tensor Parallel",
"color": "1AD0A8",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-05T19:06:40
| 2025-06-19T03:55:32
| 2025-04-05T20:02:23
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37307",
"html_url": "https://github.com/huggingface/transformers/pull/37307",
"diff_url": "https://github.com/huggingface/transformers/pull/37307.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37307.patch",
"merged_at": "2025-04-05T20:02:23"
}
|
# What does this PR do?
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37307/reactions",
"total_count": 61,
"+1": 21,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 18,
"rocket": 22,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37307/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37306
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37306/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37306/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37306/events
|
https://github.com/huggingface/transformers/pull/37306
| 2,974,407,262
|
PR_kwDOCUB6oc6Rgv8e
| 37,306
|
Fix deepspeed loading (part 2)
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-05T18:07:32
| 2025-04-05T21:17:33
| 2025-04-05T18:41:42
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37306",
"html_url": "https://github.com/huggingface/transformers/pull/37306",
"diff_url": "https://github.com/huggingface/transformers/pull/37306.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37306.patch",
"merged_at": "2025-04-05T18:41:42"
}
|
# What does this PR do?
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37306/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37306/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37305
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37305/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37305/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37305/events
|
https://github.com/huggingface/transformers/pull/37305
| 2,974,390,541
|
PR_kwDOCUB6oc6Rgs57
| 37,305
|
Hf Xet extra
|
{
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-05T17:37:47
| 2025-04-05T19:06:07
| 2025-04-05T19:06:05
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37305",
"html_url": "https://github.com/huggingface/transformers/pull/37305",
"diff_url": "https://github.com/huggingface/transformers/pull/37305.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37305.patch",
"merged_at": "2025-04-05T19:06:05"
}
|
Adds the extra for hf_xet
|
{
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37305/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37305/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37304
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37304/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37304/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37304/events
|
https://github.com/huggingface/transformers/pull/37304
| 2,974,293,033
|
PR_kwDOCUB6oc6Rgenx
| 37,304
|
Add Fast Image Processor for vilt
|
{
"login": "devxaitist",
"id": 65713225,
"node_id": "MDQ6VXNlcjY1NzEzMjI1",
"avatar_url": "https://avatars.githubusercontent.com/u/65713225?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/devxaitist",
"html_url": "https://github.com/devxaitist",
"followers_url": "https://api.github.com/users/devxaitist/followers",
"following_url": "https://api.github.com/users/devxaitist/following{/other_user}",
"gists_url": "https://api.github.com/users/devxaitist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/devxaitist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/devxaitist/subscriptions",
"organizations_url": "https://api.github.com/users/devxaitist/orgs",
"repos_url": "https://api.github.com/users/devxaitist/repos",
"events_url": "https://api.github.com/users/devxaitist/events{/privacy}",
"received_events_url": "https://api.github.com/users/devxaitist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-05T15:34:57
| 2025-05-13T15:41:33
| 2025-05-13T15:40:53
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37304",
"html_url": "https://github.com/huggingface/transformers/pull/37304",
"diff_url": "https://github.com/huggingface/transformers/pull/37304.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37304.patch",
"merged_at": "2025-05-13T15:40:53"
}
|
# What does this PR do?
Add ViltImageProcessorFast implementation for faster image processing using PyTorch. (#36978)
This PR adds a fast image processor for the Vilt model that leverages PyTorch and torchvision functions instead of PIL/numpy. The implementation improves performance by using tensor operations and enabling GPU processing.
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
@yonigozlan
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37304/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37304/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37303
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37303/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37303/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37303/events
|
https://github.com/huggingface/transformers/issues/37303
| 2,973,985,080
|
I_kwDOCUB6oc6xQ2k4
| 37,303
|
`push_to_hub()` for Llama 3.1 8B doesn't save `lm_head.weight` tensor
|
{
"login": "wizeng23",
"id": 10782997,
"node_id": "MDQ6VXNlcjEwNzgyOTk3",
"avatar_url": "https://avatars.githubusercontent.com/u/10782997?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wizeng23",
"html_url": "https://github.com/wizeng23",
"followers_url": "https://api.github.com/users/wizeng23/followers",
"following_url": "https://api.github.com/users/wizeng23/following{/other_user}",
"gists_url": "https://api.github.com/users/wizeng23/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wizeng23/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wizeng23/subscriptions",
"organizations_url": "https://api.github.com/users/wizeng23/orgs",
"repos_url": "https://api.github.com/users/wizeng23/repos",
"events_url": "https://api.github.com/users/wizeng23/events{/privacy}",
"received_events_url": "https://api.github.com/users/wizeng23/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-05T07:24:53
| 2025-04-07T21:35:57
| 2025-04-07T21:11:31
|
NONE
| null | null | null | null |
### System Info
- `transformers` version: 4.49.0
- Platform: Linux-6.8.0-1015-gcp-x86_64-with-glibc2.35
- Python version: 3.10.13
- Huggingface_hub version: 0.30.1
- Safetensors version: 0.5.3
- Accelerate version: 1.2.1
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.5.1+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA A100-SXM4-40GB
### Who can help?
@ArthurZucker
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
```python
import torch
import transformers
model = transformers.AutoModel.from_pretrained("meta-llama/Llama-3.1-8B", torch_dtype=torch.bfloat16)
model.push_to_hub('wizeng23/Llama-test')
tokenizer = transformers.AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B")
tokenizer.push_to_hub('wizeng23/Llama-test')
```
### Expected behavior
I'd expect the model weights to be completely unchanged when saving the model. However, it seems `lm_head.weight` is not saved at all. `model-00004-of-00004.safetensors` for Llama 3.1 8B is 1.17GB, while in the saved model it's 117MB: https://huggingface.co/wizeng23/Llama-test/tree/main. I checked the saved tensor file, and the only difference is the missing lm head tensor (shape [128256, 4096]); this is 500M params, which seems to fully account for the missing size.
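A likely explanation (a sketch, not a confirmed diagnosis): `transformers.AutoModel` resolves Llama checkpoints to the bare backbone model, which carries no LM head, so there is no `lm_head.weight` left to serialize; loading through `AutoModelForCausalLM` keeps the head. A minimal illustration of the class choice:

```python
# Illustrative helper (hypothetical, not part of transformers): pick the Auto
# class whose state dict retains lm_head.weight when pushed to the Hub.
def auto_class_for(task: str) -> str:
    # "causal-lm" keeps the LM head; the bare backbone drops it.
    return "AutoModelForCausalLM" if task == "causal-lm" else "AutoModel"

# With transformers installed, the full-weight push would then look like:
#   model = transformers.AutoModelForCausalLM.from_pretrained(
#       "meta-llama/Llama-3.1-8B", torch_dtype=torch.bfloat16)
#   model.push_to_hub("wizeng23/Llama-test")
```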
|
{
"login": "wizeng23",
"id": 10782997,
"node_id": "MDQ6VXNlcjEwNzgyOTk3",
"avatar_url": "https://avatars.githubusercontent.com/u/10782997?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wizeng23",
"html_url": "https://github.com/wizeng23",
"followers_url": "https://api.github.com/users/wizeng23/followers",
"following_url": "https://api.github.com/users/wizeng23/following{/other_user}",
"gists_url": "https://api.github.com/users/wizeng23/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wizeng23/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wizeng23/subscriptions",
"organizations_url": "https://api.github.com/users/wizeng23/orgs",
"repos_url": "https://api.github.com/users/wizeng23/repos",
"events_url": "https://api.github.com/users/wizeng23/events{/privacy}",
"received_events_url": "https://api.github.com/users/wizeng23/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37303/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37303/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37302
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37302/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37302/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37302/events
|
https://github.com/huggingface/transformers/issues/37302
| 2,973,965,933
|
I_kwDOCUB6oc6xQx5t
| 37,302
|
**ValueError: Unrecognized model in lmsys/vicuna-7b-v1.5. Should have a `model_type` key**
|
{
"login": "ZhanliangAaronWang",
"id": 145517455,
"node_id": "U_kgDOCKxrjw",
"avatar_url": "https://avatars.githubusercontent.com/u/145517455?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ZhanliangAaronWang",
"html_url": "https://github.com/ZhanliangAaronWang",
"followers_url": "https://api.github.com/users/ZhanliangAaronWang/followers",
"following_url": "https://api.github.com/users/ZhanliangAaronWang/following{/other_user}",
"gists_url": "https://api.github.com/users/ZhanliangAaronWang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ZhanliangAaronWang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ZhanliangAaronWang/subscriptions",
"organizations_url": "https://api.github.com/users/ZhanliangAaronWang/orgs",
"repos_url": "https://api.github.com/users/ZhanliangAaronWang/repos",
"events_url": "https://api.github.com/users/ZhanliangAaronWang/events{/privacy}",
"received_events_url": "https://api.github.com/users/ZhanliangAaronWang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-05T06:51:58
| 2025-05-06T13:24:07
| 2025-05-06T13:23:59
|
NONE
| null | null | null | null |
### System Info
ValueError: Unrecognized model in lmsys/vicuna-7b-v1.5. Should have a `model_type` key in its config.json, or contain one of the following strings in its name: albert, align, altclip, aria, aria_text, audio-spectrogram-transformer, autoformer, bamba, bark, bart, beit, bert, bert-generation, big_bird, bigbird_pegasus, biogpt, bit, blenderbot, blenderbot-small, blip, blip-2, bloom, bridgetower, bros, camembert, canine, chameleon, chinese_clip, chinese_clip_vision_model, clap, clip, clip_text_model, clip_vision_model, clipseg, clvp, code_llama, codegen, cohere, cohere2, colpali, conditional_detr, convbert, convnext, convnextv2, cpmant, ctrl, cvt, dab-detr, dac, data2vec-audio, data2vec-text, data2vec-vision, dbrx, deberta, deberta-v2, decision_transformer, deformable_detr, deit, depth_anything, depth_pro, deta, detr, diffllama, dinat, dinov2, dinov2_with_registers, distilbert, donut-swin, dpr, dpt, efficientformer, efficientnet, electra, emu3, encodec, encoder-decoder, ernie, ernie_m, esm, falcon, falcon_mamba, fastspeech2_conformer, flaubert, flava, fnet, focalnet, fsmt, funnel, fuyu, gemma, gemma2, git, glm, glpn, got_ocr2, gpt-sw3, gpt2, gpt_bigcode, gpt_neo, gpt_neox, gpt_neox_japanese, gptj, gptsan-japanese, granite, granitemoe, granitemoeshared, granitevision, graphormer, grounding-dino, groupvit, helium, hiera, hubert, ibert, idefics, idefics2, idefics3, idefics3_vision, ijepa, imagegpt, informer, instructblip, instructblipvideo, jamba, jetmoe, jukebox, kosmos-2, layoutlm, layoutlmv2, layoutlmv3, led, levit, lilt, llama, llava, llava_next, llava_next_video, llava_onevision, longformer, longt5, luke, lxmert, m2m_100, mamba, mamba2, marian, markuplm, mask2former, maskformer, maskformer-swin, mbart, mctct, mega, megatron-bert, mgp-str, mimi, mistral, mixtral, mllama, mobilebert, mobilenet_v1, mobilenet_v2, mobilevit, mobilevitv2, modernbert, moonshine, moshi, mpnet, mpt, mra, mt5, musicgen, musicgen_melody, mvp, nat, nemotron, nezha, nllb-moe, nougat, 
nystromformer, olmo, olmo2, olmoe, omdet-turbo, oneformer, open-llama, openai-gpt, opt, owlv2, owlvit, paligemma, patchtsmixer, patchtst, pegasus, pegasus_x, perceiver, persimmon, phi, phi3, phimoe, pix2struct, pixtral, plbart, poolformer, pop2piano, prophetnet, pvt, pvt_v2, qdqbert, qwen2, qwen2_5_vl, qwen2_audio, qwen2_audio_encoder, qwen2_moe, qwen2_vl, rag, realm, recurrent_gemma, reformer, regnet, rembert, resnet, retribert, roberta, roberta-prelayernorm, roc_bert, roformer, rt_detr, rt_detr_resnet, rt_detr_v2, rwkv, sam, seamless_m4t, seamless_m4t_v2, segformer, seggpt, sew, sew-d, siglip, siglip_vision_model, speech-encoder-decoder, speech_to_text, speech_to_text_2, speecht5, splinter, squeezebert, stablelm, starcoder2, superglue, superpoint, swiftformer, swin, swin2sr, swinv2, switch_transformers, t5, table-transformer, tapas, textnet, time_series_transformer, timesformer, timm_backbone, timm_wrapper, trajectory_transformer, transfo-xl, trocr, tvlt, tvp, udop, umt5, unispeech, unispeech-sat, univnet, upernet, van, video_llava, videomae, vilt, vipllava, vision-encoder-decoder, vision-text-dual-encoder, visual_bert, vit, vit_hybrid, vit_mae, vit_msn, vitdet, vitmatte, vitpose, vitpose_backbone, vits, vivit, wav2vec2, wav2vec2-bert, wav2vec2-conformer, wavlm, whisper, xclip, xglm, xlm, xlm-prophetnet, xlm-roberta, xlm-roberta-xl, xlnet, xmod, yolos, yoso, zamba, zamba2, zoedepth
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Has anyone encountered this issue recently? I hit it with several models, such as olmo2, llama3-tulu, and vicuna-7B.
Do you have any ideas on how to solve it?
Thanks a lot!
### Expected behavior
run
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37302/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37302/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37301
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37301/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37301/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37301/events
|
https://github.com/huggingface/transformers/issues/37301
| 2,973,858,770
|
I_kwDOCUB6oc6xQXvS
| 37,301
|
[torch-xla 2.7] Change xm.xrt_world_size() to xr.world_size(). xm.get_ordinal() to xr.global_ordinal()
|
{
"login": "jeffhataws",
"id": 56947987,
"node_id": "MDQ6VXNlcjU2OTQ3OTg3",
"avatar_url": "https://avatars.githubusercontent.com/u/56947987?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jeffhataws",
"html_url": "https://github.com/jeffhataws",
"followers_url": "https://api.github.com/users/jeffhataws/followers",
"following_url": "https://api.github.com/users/jeffhataws/following{/other_user}",
"gists_url": "https://api.github.com/users/jeffhataws/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jeffhataws/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jeffhataws/subscriptions",
"organizations_url": "https://api.github.com/users/jeffhataws/orgs",
"repos_url": "https://api.github.com/users/jeffhataws/repos",
"events_url": "https://api.github.com/users/jeffhataws/events{/privacy}",
"received_events_url": "https://api.github.com/users/jeffhataws/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-05T04:44:28
| 2025-05-15T13:27:32
| 2025-04-28T14:27:43
|
CONTRIBUTOR
| null | null | null | null |
### System Info
In torch-xla version 2.7, xm.xrt_world_size() is replaced by xr.world_size() and xm.get_ordinal() is replaced by xr.global_ordinal(), where `xr` is `import torch_xla.runtime as xr`.
https://github.com/search?q=repo%3Ahuggingface%2Ftransformers+xrt_world_size&type=code
https://github.com/search?q=repo%3Ahuggingface%2Ftransformers+get_ordinal&type=code
Currently these show up as warnings in torch-xla 2.5/2.6:
```
WARNING:root:torch_xla.core.xla_model.xrt_world_size() will be removed in release 2.7. is deprecated. Use torch_xla.runtime.world_size instead.
WARNING:root:torch_xla.core.xla_model.xla_model.get_ordinal() will be removed in release 2.7. is deprecated. Use torch_xla.runtime.global_ordinal instead.
```
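Until all call sites are migrated, a compatibility shim (a hedged sketch, assuming only these two API generations need bridging) can tolerate both the pre-2.7 and 2.7+ names:

```python
def xla_world_size() -> int:
    """Return the XLA world size, preferring the torch-xla >= 2.7 runtime API."""
    try:
        import torch_xla.runtime as xr  # new API (the 2.7+ replacement)
        return xr.world_size()
    except ImportError:
        pass
    try:
        import torch_xla.core.xla_model as xm  # deprecated pre-2.7 API
        return xm.xrt_world_size()
    except ImportError:
        return 1  # no torch-xla installed: single-process fallback
```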
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
1. Run any finetuning/training test on Neuron or TPU
2. Look for warnings like below: WARNING:root:torch_xla.core.xla_model.xrt_world_size() will be removed in release 2.7. is deprecated. Use torch_xla.runtime.world_size instead.
WARNING:root:torch_xla.core.xla_model.xla_model.get_ordinal() will be removed in release 2.7. is deprecated. Use torch_xla.runtime.global_ordinal instead.
### Expected behavior
No warnings.
With torch-xla 2.7, you will see errors.
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37301/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37301/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37300
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37300/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37300/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37300/events
|
https://github.com/huggingface/transformers/issues/37300
| 2,973,848,049
|
I_kwDOCUB6oc6xQVHx
| 37,300
|
Deepspeed zero-3 failures in main
|
{
"login": "winglian",
"id": 381258,
"node_id": "MDQ6VXNlcjM4MTI1OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/winglian",
"html_url": "https://github.com/winglian",
"followers_url": "https://api.github.com/users/winglian/followers",
"following_url": "https://api.github.com/users/winglian/following{/other_user}",
"gists_url": "https://api.github.com/users/winglian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/winglian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/winglian/subscriptions",
"organizations_url": "https://api.github.com/users/winglian/orgs",
"repos_url": "https://api.github.com/users/winglian/repos",
"events_url": "https://api.github.com/users/winglian/events{/privacy}",
"received_events_url": "https://api.github.com/users/winglian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-05T04:23:28
| 2025-04-07T21:09:41
| 2025-04-07T21:09:40
|
CONTRIBUTOR
| null | null | null | null |
### System Info
- `transformers` version: 4.51.0.dev0
- Platform: Linux-6.8.0-52-generic-x86_64-with-glibc2.35
- Python version: 3.11.11
- Huggingface_hub version: 0.30.1
- Safetensors version: 0.5.3
- Accelerate version: 1.5.2
- Accelerate config: not found
- DeepSpeed version: 0.15.4
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA H100 80GB HBM3
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
When running the axolotl multigpu test suite, I get multiple regressions on main, mostly involving deepspeed zero3.
After some git bisecting and re-running the tests, everything passes up to commit `880560040609b03e62cb2ee7ad505825efb158bb` and starts failing at the merge of this PR: https://github.com/huggingface/transformers/pull/36963
```
[gw0] [ 43%] FAILED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero3_packed[True-deepspeed_configs/zero3_bf16.json-1]
tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero3_packed[True-deepspeed_configs/zero3_bf16.json-2]
[gw1] [ 45%] FAILED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero3_packed[True-deepspeed_configs/zero3_bf16_cpuoffload_all.json-2]
tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero3_packed[False-deepspeed_configs/zero3_bf16.json-1]
[gw0] [ 48%] FAILED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero3_packed[True-deepspeed_configs/zero3_bf16.json-2]
tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero3_packed[True-deepspeed_configs/zero3_bf16_cpuoffload_all.json-1]
[gw1] [ 51%] FAILED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero3_packed[False-deepspeed_configs/zero3_bf16.json-1]
tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero3_packed[False-deepspeed_configs/zero3_bf16.json-2]
[gw0] [ 54%] FAILED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero3_packed[True-deepspeed_configs/zero3_bf16_cpuoffload_all.json-1]
tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero3_packed[False-deepspeed_configs/zero3_bf16_cpuoffload_all.json-1]
[gw1] [ 56%] FAILED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero3_packed[False-deepspeed_configs/zero3_bf16.json-2]
tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero2_packed[True-2]
[gw0] [ 59%] FAILED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero3_packed[False-deepspeed_configs/zero3_bf16_cpuoffload_all.json-1]
tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero3_packed[False-deepspeed_configs/zero3_bf16_cpuoffload_all.json-2]
[gw0] [ 62%] FAILED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero3_packed[False-deepspeed_configs/zero3_bf16_cpuoffload_all.json-2]
tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero2_packed[True-1]
[gw1] [ 64%] PASSED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero2_packed[True-2]
tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero2_packed[False-1]
[gw0] [ 67%] PASSED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero2_packed[True-1]
tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero2_packed[False-2]
[gw1] [ 70%] PASSED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero2_packed[False-1]
tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero1_packed[True-1]
[gw0] [ 72%] PASSED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero2_packed[False-2]
tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero1_packed[True-2]
[gw1] [ 75%] PASSED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero1_packed[True-1]
tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero1_packed[False-1]
[gw0] [ 78%] PASSED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero1_packed[True-2]
tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero1_packed[False-2]
[gw1] [ 81%] PASSED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero1_packed[False-1]
tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_fix_untrained_tokens
[gw0] [ 83%] PASSED tests/e2e/multigpu/test_llama.py::TestMultiGPULlama::test_ds_zero1_packed[False-2]
tests/e2e/multigpu/test_qwen2.py::TestMultiGPUQwen2::test_qlora_fsdp_dpo[Qwen/Qwen2-0.5B]
```
```
stderr: [rank0]: While copying the parameter named "model.layers.28.self_attn.q_proj.weight", whose dimensions in the model are torch.Size([576, 576]) and whose dimensions in the checkpoint are torch.Size([576, 576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.28.self_attn.k_proj.weight", whose dimensions in the model are torch.Size([192, 576]) and whose dimensions in the checkpoint are torch.Size([192, 576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.28.self_attn.v_proj.weight", whose dimensions in the model are torch.Size([192, 576]) and whose dimensions in the checkpoint are torch.Size([192, 576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.28.self_attn.o_proj.weight", whose dimensions in the model are torch.Size([576, 576]) and whose dimensions in the checkpoint are torch.Size([576, 576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.28.mlp.gate_proj.weight", whose dimensions in the model are torch.Size([1536, 576]) and whose dimensions in the checkpoint are torch.Size([1536, 576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.28.mlp.up_proj.weight", whose dimensions in the model are torch.Size([1536, 576]) and whose dimensions in the checkpoint are torch.Size([1536, 576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.28.mlp.down_proj.weight", whose dimensions in the model are torch.Size([576, 1536]) and whose dimensions in the checkpoint are torch.Size([576, 1536]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.28.input_layernorm.weight", whose dimensions in the model are torch.Size([576]) and whose dimensions in the checkpoint are torch.Size([576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.28.post_attention_layernorm.weight", whose dimensions in the model are torch.Size([576]) and whose dimensions in the checkpoint are torch.Size([576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.29.self_attn.q_proj.weight", whose dimensions in the model are torch.Size([576, 576]) and whose dimensions in the checkpoint are torch.Size([576, 576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.29.self_attn.k_proj.weight", whose dimensions in the model are torch.Size([192, 576]) and whose dimensions in the checkpoint are torch.Size([192, 576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.29.self_attn.v_proj.weight", whose dimensions in the model are torch.Size([192, 576]) and whose dimensions in the checkpoint are torch.Size([192, 576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.29.self_attn.o_proj.weight", whose dimensions in the model are torch.Size([576, 576]) and whose dimensions in the checkpoint are torch.Size([576, 576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.29.mlp.gate_proj.weight", whose dimensions in the model are torch.Size([1536, 576]) and whose dimensions in the checkpoint are torch.Size([1536, 576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.29.mlp.up_proj.weight", whose dimensions in the model are torch.Size([1536, 576]) and whose dimensions in the checkpoint are torch.Size([1536, 576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.29.mlp.down_proj.weight", whose dimensions in the model are torch.Size([576, 1536]) and whose dimensions in the checkpoint are torch.Size([576, 1536]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.29.input_layernorm.weight", whose dimensions in the model are torch.Size([576]) and whose dimensions in the checkpoint are torch.Size([576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.layers.29.post_attention_layernorm.weight", whose dimensions in the model are torch.Size([576]) and whose dimensions in the checkpoint are torch.Size([576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
stderr: [rank0]: While copying the parameter named "model.norm.weight", whose dimensions in the model are torch.Size([576]) and whose dimensions in the checkpoint are torch.Size([576]), an exception occurred : ('Cannot copy out of meta tensor; no data!',).
```
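Each of these errors is the generic meta-device restriction surfacing: a tensor on the `meta` device carries shape and dtype but no storage, so any `copy_` touching it raises. A minimal sketch of that behavior in plain PyTorch, independent of DeepSpeed:

```python
import torch

# A meta tensor has metadata (shape, dtype) but no backing storage.
meta = torch.empty(2, 2, device="meta")
real = torch.empty(2, 2)

try:
    real.copy_(meta)  # raises: cannot copy out of a storage-less tensor
    error_message = None
except Exception as exc:
    error_message = str(exc)

print(error_message)
```

Under ZeRO-3, parameters are deliberately left on `meta` until DeepSpeed materializes them, which is why a load path that copies checkpoint weights too early trips over this.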
### Expected behavior
DeepSpeed ZeRO-3 training should work, as it did before the PR referenced above.
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37300/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37300/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37299
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37299/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37299/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37299/events
|
https://github.com/huggingface/transformers/issues/37299
| 2,973,650,979
|
I_kwDOCUB6oc6xPlAj
| 37,299
|
flex_attention support for Qwen2.5/Gemma is broken
|
{
"login": "flukeskywalker",
"id": 3215768,
"node_id": "MDQ6VXNlcjMyMTU3Njg=",
"avatar_url": "https://avatars.githubusercontent.com/u/3215768?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/flukeskywalker",
"html_url": "https://github.com/flukeskywalker",
"followers_url": "https://api.github.com/users/flukeskywalker/followers",
"following_url": "https://api.github.com/users/flukeskywalker/following{/other_user}",
"gists_url": "https://api.github.com/users/flukeskywalker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/flukeskywalker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/flukeskywalker/subscriptions",
"organizations_url": "https://api.github.com/users/flukeskywalker/orgs",
"repos_url": "https://api.github.com/users/flukeskywalker/repos",
"events_url": "https://api.github.com/users/flukeskywalker/events{/privacy}",
"received_events_url": "https://api.github.com/users/flukeskywalker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-05T00:51:34
| 2025-04-14T14:53:28
| 2025-04-14T14:53:28
|
CONTRIBUTOR
| null | null | null | null |
### System Info
- `transformers` version: 4.50.3
- Platform: Linux-6.8.0-52-generic-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.30.1
- Safetensors version: 0.5.3
- Accelerate version: not installed
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: No
- Using GPU in script?: Yes
- GPU type: NVIDIA GeForce RTX 4090
### Who can help?
@ArthurZucker
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
```python
import torch
from torch.nn.attention.flex_attention import create_block_mask
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "Qwen/Qwen2.5-0.5B" # AttributeError: 'BlockMask' object has no attribute 'dim'
# model_name = "meta-llama/Llama-3.2-1B" # works
model = AutoModelForCausalLM.from_pretrained(model_name, attn_implementation="flex_attention").to("cuda")
tokenizer = AutoTokenizer.from_pretrained(model_name)
print(model._supports_flex_attn)
tokens = tokenizer("Hello model, this is human. ")["input_ids"]
def causal_mask(b, h, q_idx, kv_idx):
return q_idx >= kv_idx
block_mask = create_block_mask(causal_mask, None, None, len(tokens), len(tokens), device="cuda")
model(input_ids=torch.tensor(tokens, dtype=torch.long, device="cuda").unsqueeze(0), attention_mask=block_mask)
```
### Expected behavior
This snippet should run without error, as it does for `meta-llama/Llama-3.2-1B`, since the Qwen2.5 model is based on the Llama architecture and both declare flex_attention support (`_supports_flex_attn = True`).
The error occurs because `Qwen2Model._update_causal_mask()` doesn't handle the case where flex_attention is enabled and a block mask is passed in as the `attention_mask`. This case is handled in `LlamaModel._update_causal_mask()`:
```python
if self.config._attn_implementation == "flex_attention":
if isinstance(attention_mask, torch.Tensor):
attention_mask = make_flex_block_causal_mask(attention_mask)
if isinstance(attention_mask, BlockMask):
return attention_mask
```
IIUC, adding the same handling to `Qwen2Model` should fix the issue, and indeed it works on my local fork. But the Qwen2 modeling code is auto-generated, so the fix has to go into the modular source it is generated from.
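The missing branch is small enough to sketch in isolation. In the sketch below, `Tensor`, `BlockMask`, and `make_flex_block_causal_mask` are stand-ins (not the real torch/transformers objects), purely to show the dispatch that Qwen2Model lacks:

```python
# Stand-in types so the dispatch logic runs standalone, without torch.

class Tensor:
    """Stand-in for torch.Tensor."""


class BlockMask:
    """Stand-in for torch.nn.attention.flex_attention.BlockMask."""


def make_flex_block_causal_mask(mask):
    # Stand-in for transformers' helper that builds a BlockMask
    # from a dense 2D attention mask.
    return BlockMask()


def update_causal_mask(attention_mask, attn_implementation):
    """The flex_attention branch LlamaModel has and Qwen2Model lacks:
    dense masks get converted, ready-made BlockMasks pass through."""
    if attn_implementation == "flex_attention":
        if isinstance(attention_mask, Tensor):
            attention_mask = make_flex_block_causal_mask(attention_mask)
        if isinstance(attention_mask, BlockMask):
            return attention_mask
    raise NotImplementedError("non-flex paths omitted from this sketch")
```

Without the early return, a `BlockMask` falls through to code that expects a dense tensor and calls `.dim()` on it, producing the `AttributeError` above.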
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37299/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37299/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37298
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37298/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37298/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37298/events
|
https://github.com/huggingface/transformers/pull/37298
| 2,973,508,031
|
PR_kwDOCUB6oc6Rd8qB
| 37,298
|
Deprecate modeling_utils.py classes
|
{
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T22:47:29
| 2025-04-18T17:47:34
| 2025-04-18T17:47:34
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37298",
"html_url": "https://github.com/huggingface/transformers/pull/37298",
"diff_url": "https://github.com/huggingface/transformers/pull/37298.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37298.patch",
"merged_at": "2025-04-18T17:47:34"
}
|
# What does this PR do?
Classes such as `PoolerStartLogits`, `PoolerEndLogits`, `PoolerAnswerClass`, `SQuADHead`, and `SequenceSummary` are:
1. Defined in `modeling_utils.py`, which is somewhat against the main repository philosophy of having classes defined within dedicated modeling files
2. Used by a few older models and not used by newer ones
Following the above, I have transferred them to the corresponding modeling files and deprecated the versions in `modeling_utils.py`. This should unclutter `modeling_utils.py` and make it easier to drop those classes when the models themselves are deprecated.
|
{
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37298/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37298/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37297
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37297/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37297/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37297/events
|
https://github.com/huggingface/transformers/pull/37297
| 2,973,507,012
|
PR_kwDOCUB6oc6Rd8b0
| 37,297
|
Added fast processor for llava-next-video model
|
{
"login": "ChathuminaVimukthi",
"id": 31965817,
"node_id": "MDQ6VXNlcjMxOTY1ODE3",
"avatar_url": "https://avatars.githubusercontent.com/u/31965817?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ChathuminaVimukthi",
"html_url": "https://github.com/ChathuminaVimukthi",
"followers_url": "https://api.github.com/users/ChathuminaVimukthi/followers",
"following_url": "https://api.github.com/users/ChathuminaVimukthi/following{/other_user}",
"gists_url": "https://api.github.com/users/ChathuminaVimukthi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ChathuminaVimukthi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ChathuminaVimukthi/subscriptions",
"organizations_url": "https://api.github.com/users/ChathuminaVimukthi/orgs",
"repos_url": "https://api.github.com/users/ChathuminaVimukthi/repos",
"events_url": "https://api.github.com/users/ChathuminaVimukthi/events{/privacy}",
"received_events_url": "https://api.github.com/users/ChathuminaVimukthi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-04T22:46:14
| 2025-06-26T19:38:48
| null |
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37297",
"html_url": "https://github.com/huggingface/transformers/pull/37297",
"diff_url": "https://github.com/huggingface/transformers/pull/37297.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37297.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37297/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37297/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37296
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37296/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37296/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37296/events
|
https://github.com/huggingface/transformers/issues/37296
| 2,973,501,370
|
I_kwDOCUB6oc6xPAe6
| 37,296
|
https://huggingface.co/hf-internal-testing tiny random models need to be converted to safetensors
|
{
"login": "sfc-gh-sbekman",
"id": 196988264,
"node_id": "U_kgDOC73NaA",
"avatar_url": "https://avatars.githubusercontent.com/u/196988264?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sfc-gh-sbekman",
"html_url": "https://github.com/sfc-gh-sbekman",
"followers_url": "https://api.github.com/users/sfc-gh-sbekman/followers",
"following_url": "https://api.github.com/users/sfc-gh-sbekman/following{/other_user}",
"gists_url": "https://api.github.com/users/sfc-gh-sbekman/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sfc-gh-sbekman/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sfc-gh-sbekman/subscriptions",
"organizations_url": "https://api.github.com/users/sfc-gh-sbekman/orgs",
"repos_url": "https://api.github.com/users/sfc-gh-sbekman/repos",
"events_url": "https://api.github.com/users/sfc-gh-sbekman/events{/privacy}",
"received_events_url": "https://api.github.com/users/sfc-gh-sbekman/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T22:43:23
| 2025-05-06T18:48:03
| 2025-05-06T18:48:03
|
CONTRIBUTOR
| null | null | null | null |
The problem is that many transformers CI tests rely on these tiny models, and most of them are still in `pytorch_model.bin` format, e.g. see: https://huggingface.co/hf-internal-testing/tiny-random-T5Model/tree/main
But `modeling_utils` has changed massively recently and uses a different code path for `safetensors` model files - which is what most modern models use.
This means transformers CI isn't testing the code properly.
For example, a recent DeepSpeed integration was broken because the tests use https://huggingface.co/patrickvonplaten/t5-tiny-random/tree/main but the code took a different path for `pytorch_model.bin` files, and thus the massive breakage introduced in https://github.com/huggingface/transformers/pull/36963 was missed. I looked at replacing it with https://huggingface.co/patrickvonplaten/t5-tiny-random/tree/main but it has the same issue - no `.safetensors` file.
You can see from the model zoo below why a mass update of the testing models is needed if we want the tests to actually exercise the current code path:
https://github.com/huggingface/transformers/blob/0ef339ff1b63bb03a388c79bfbebec9085e10564/tests/deepspeed/test_model_zoo.py#L56-L90
cc: @ydshieh, @LysandreJik
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37296/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37296/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37295
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37295/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37295/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37295/events
|
https://github.com/huggingface/transformers/pull/37295
| 2,973,430,888
|
PR_kwDOCUB6oc6Rdrky
| 37,295
|
<spam>
|
{
"login": "OSUer600",
"id": 206186736,
"node_id": "U_kgDODEoo8A",
"avatar_url": "https://avatars.githubusercontent.com/u/206186736?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/OSUer600",
"html_url": "https://github.com/OSUer600",
"followers_url": "https://api.github.com/users/OSUer600/followers",
"following_url": "https://api.github.com/users/OSUer600/following{/other_user}",
"gists_url": "https://api.github.com/users/OSUer600/gists{/gist_id}",
"starred_url": "https://api.github.com/users/OSUer600/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/OSUer600/subscriptions",
"organizations_url": "https://api.github.com/users/OSUer600/orgs",
"repos_url": "https://api.github.com/users/OSUer600/repos",
"events_url": "https://api.github.com/users/OSUer600/events{/privacy}",
"received_events_url": "https://api.github.com/users/OSUer600/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T21:52:17
| 2025-04-07T11:03:27
| 2025-04-07T11:03:20
|
NONE
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37295",
"html_url": "https://github.com/huggingface/transformers/pull/37295",
"diff_url": "https://github.com/huggingface/transformers/pull/37295.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37295.patch",
"merged_at": null
}
|
# What does this PR do?
test
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37295/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37295/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37294
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37294/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37294/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37294/events
|
https://github.com/huggingface/transformers/pull/37294
| 2,973,429,903
|
PR_kwDOCUB6oc6RdrYq
| 37,294
|
Update translation template
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T21:51:29
| 2025-04-07T16:29:39
| 2025-04-07T16:29:37
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37294",
"html_url": "https://github.com/huggingface/transformers/pull/37294",
"diff_url": "https://github.com/huggingface/transformers/pull/37294.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37294.patch",
"merged_at": "2025-04-07T16:29:37"
}
|
Updates translation template so we stop pinging Maria
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37294/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37294/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37293
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37293/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37293/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37293/events
|
https://github.com/huggingface/transformers/issues/37293
| 2,973,361,224
|
I_kwDOCUB6oc6xOeRI
| 37,293
|
[i18n-<Transformers>] Translating docs to <Tibetain>
|
{
"login": "OSUer600",
"id": 206186736,
"node_id": "U_kgDODEoo8A",
"avatar_url": "https://avatars.githubusercontent.com/u/206186736?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/OSUer600",
"html_url": "https://github.com/OSUer600",
"followers_url": "https://api.github.com/users/OSUer600/followers",
"following_url": "https://api.github.com/users/OSUer600/following{/other_user}",
"gists_url": "https://api.github.com/users/OSUer600/gists{/gist_id}",
"starred_url": "https://api.github.com/users/OSUer600/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/OSUer600/subscriptions",
"organizations_url": "https://api.github.com/users/OSUer600/orgs",
"repos_url": "https://api.github.com/users/OSUer600/repos",
"events_url": "https://api.github.com/users/OSUer600/events{/privacy}",
"received_events_url": "https://api.github.com/users/OSUer600/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2796628563,
"node_id": "MDU6TGFiZWwyNzk2NjI4NTYz",
"url": "https://api.github.com/repos/huggingface/transformers/labels/WIP",
"name": "WIP",
"color": "234C99",
"default": false,
"description": "Label your PR/Issue with WIP for some long outstanding Issues/PRs that are work in progress"
}
] |
open
| false
| null |
[] | null |
[] | 2025-04-04T21:00:57
| 2025-04-04T21:00:57
| null |
NONE
| null | null | null | null |
<!--
Note: Please search to see if an issue already exists for the language you are trying to translate.
-->
Hi!
Let's bring the documentation to all the <languageName>-speaking community 🌐 (currently 0 out of 267 complete)
Who would want to translate? Please follow the 🤗 [TRANSLATING guide](https://github.com/huggingface/transformers/blob/main/docs/TRANSLATING.md). Here is a list of the files ready for translation. Let us know in this issue if you'd like to translate any, and we'll add your name to the list.
Some notes:
* Please translate using an informal tone (imagine you are talking with a friend about transformers 🤗).
* Please translate in a gender-neutral way.
* Add your translations to the folder called `<languageCode>` inside the [source folder](https://github.com/huggingface/transformers/tree/main/docs/source).
* Register your translation in `<languageCode>/_toctree.yml`; please follow the order of the [English version](https://github.com/huggingface/transformers/blob/main/docs/source/en/_toctree.yml).
* Once you're finished, open a pull request and tag this issue by including #issue-number in the description, where issue-number is the number of this issue. Please ping @stevhliu and @MKhalusova for review.
* 🙋 If you'd like others to help you with the translation, you can also post in the 🤗 [forums](https://discuss.huggingface.co/).
## Get Started section
- [ ] [index.md](https://github.com/huggingface/transformers/blob/main/docs/source/en/index.md) https://github.com/huggingface/transformers/pull/20180
- [ ] [quicktour.md](https://github.com/huggingface/transformers/blob/main/docs/source/en/quicktour.md) (waiting for initial PR to go through)
- [ ] [installation.md](https://github.com/huggingface/transformers/blob/main/docs/source/en/installation.md).
## Tutorial section
- [ ] [pipeline_tutorial.md](https://github.com/huggingface/transformers/blob/main/docs/source/en/pipeline_tutorial.md)
- [ ] [autoclass_tutorial.md](https://github.com/huggingface/transformers/blob/main/docs/source/en/autoclass_tutorial.md)
- [ ] [preprocessing.md](https://github.com/huggingface/transformers/blob/main/docs/source/en/preprocessing.md)
- [ ] [training.md](https://github.com/huggingface/transformers/blob/main/docs/source/en/training.md)
- [ ] [accelerate.md](https://github.com/huggingface/transformers/blob/main/docs/source/en/accelerate.md)
- [ ] [model_sharing.md](https://github.com/huggingface/transformers/blob/main/docs/source/en/model_sharing.md)
- [ ] [multilingual.md](https://github.com/huggingface/transformers/blob/main/docs/source/en/multilingual.md)
<!--
Keep on adding more as you go 🔥
-->
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37293/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37293/timeline
| null | null |
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37292
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37292/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37292/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37292/events
|
https://github.com/huggingface/transformers/pull/37292
| 2,973,115,009
|
PR_kwDOCUB6oc6RcnqH
| 37,292
|
Add Fast Yolos Processor
|
{
"login": "keetrap",
"id": 103131112,
"node_id": "U_kgDOBiWn6A",
"avatar_url": "https://avatars.githubusercontent.com/u/103131112?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/keetrap",
"html_url": "https://github.com/keetrap",
"followers_url": "https://api.github.com/users/keetrap/followers",
"following_url": "https://api.github.com/users/keetrap/following{/other_user}",
"gists_url": "https://api.github.com/users/keetrap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/keetrap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/keetrap/subscriptions",
"organizations_url": "https://api.github.com/users/keetrap/orgs",
"repos_url": "https://api.github.com/users/keetrap/repos",
"events_url": "https://api.github.com/users/keetrap/events{/privacy}",
"received_events_url": "https://api.github.com/users/keetrap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T18:36:11
| 2025-04-15T12:23:09
| 2025-04-15T12:23:09
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37292",
"html_url": "https://github.com/huggingface/transformers/pull/37292",
"diff_url": "https://github.com/huggingface/transformers/pull/37292.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37292.patch",
"merged_at": "2025-04-15T12:23:09"
}
|
Related #36978
cc @yonigozlan
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37292/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37292/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37291
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37291/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37291/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37291/events
|
https://github.com/huggingface/transformers/pull/37291
| 2,972,976,808
|
PR_kwDOCUB6oc6RcKbl
| 37,291
|
Simplify Idefics2, Idefics3, SmolVLM images handling
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T17:18:51
| 2025-07-25T18:52:14
| 2025-07-25T18:52:14
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37291",
"html_url": "https://github.com/huggingface/transformers/pull/37291",
"diff_url": "https://github.com/huggingface/transformers/pull/37291.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37291.patch",
"merged_at": null
}
|
Simplify the handling of images in both processing and modeling.
Now the images/patches are flattened before being processed and passed to the models. This means the image processing is simplified (no need for padding along the number-of-images/patches dimension), along with the modeling code (no more padding images/patches containing only 0/False that need to be removed).
I tested thoroughly for each model with multiple images, batched images, etc. and found no differences.
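The flattening described above can be sketched roughly like this (a minimal illustration, not the actual transformers implementation; the helper name and return shape are assumptions):

```python
# Hypothetical sketch: rather than padding every sample to the max number of
# images and carrying a mask of all-zero padding patches, concatenate the
# images across the batch into one flat list, plus per-sample counts so the
# model can route image embeddings back to the right sample.
def flatten_batch_images(batch_images):
    """batch_images: list (one entry per sample) of lists of images."""
    flat = [img for sample in batch_images for img in sample]
    num_images_per_sample = [len(sample) for sample in batch_images]
    return flat, num_images_per_sample

# Example: sample 0 has one image, sample 1 has two.
flat, counts = flatten_batch_images([["img_a"], ["img_b", "img_c"]])
```

With this layout the image processor operates on `flat` directly and no padding entries ever need to be created or stripped.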
Cc @andimarafioti @orrzohar
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37291/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37291/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37290
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37290/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37290/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37290/events
|
https://github.com/huggingface/transformers/pull/37290
| 2,972,915,475
|
PR_kwDOCUB6oc6Rb9KH
| 37,290
|
Updated Model-card for donut
|
{
"login": "Logeswaran7",
"id": 74873758,
"node_id": "MDQ6VXNlcjc0ODczNzU4",
"avatar_url": "https://avatars.githubusercontent.com/u/74873758?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Logeswaran7",
"html_url": "https://github.com/Logeswaran7",
"followers_url": "https://api.github.com/users/Logeswaran7/followers",
"following_url": "https://api.github.com/users/Logeswaran7/following{/other_user}",
"gists_url": "https://api.github.com/users/Logeswaran7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Logeswaran7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Logeswaran7/subscriptions",
"organizations_url": "https://api.github.com/users/Logeswaran7/orgs",
"repos_url": "https://api.github.com/users/Logeswaran7/repos",
"events_url": "https://api.github.com/users/Logeswaran7/events{/privacy}",
"received_events_url": "https://api.github.com/users/Logeswaran7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T16:47:37
| 2025-04-07T18:54:47
| 2025-04-07T18:54:47
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37290",
"html_url": "https://github.com/huggingface/transformers/pull/37290",
"diff_url": "https://github.com/huggingface/transformers/pull/37290.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37290.patch",
"merged_at": "2025-04-07T18:54:47"
}
|
# What does this PR do?
This PR updates the model-card for the DONUT model, as described in [#36979](https://github.com/huggingface/transformers/issues/36979), in an attempt to standardize all model-cards.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@stevhliu
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37290/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37290/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37289
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37289/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37289/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37289/events
|
https://github.com/huggingface/transformers/pull/37289
| 2,972,823,633
|
PR_kwDOCUB6oc6RbpWq
| 37,289
|
[Fast Processor] OWLv2
|
{
"login": "Reshan123",
"id": 39221699,
"node_id": "MDQ6VXNlcjM5MjIxNjk5",
"avatar_url": "https://avatars.githubusercontent.com/u/39221699?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Reshan123",
"html_url": "https://github.com/Reshan123",
"followers_url": "https://api.github.com/users/Reshan123/followers",
"following_url": "https://api.github.com/users/Reshan123/following{/other_user}",
"gists_url": "https://api.github.com/users/Reshan123/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Reshan123/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Reshan123/subscriptions",
"organizations_url": "https://api.github.com/users/Reshan123/orgs",
"repos_url": "https://api.github.com/users/Reshan123/repos",
"events_url": "https://api.github.com/users/Reshan123/events{/privacy}",
"received_events_url": "https://api.github.com/users/Reshan123/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T16:04:28
| 2025-07-25T02:58:24
| 2025-07-25T02:58:24
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37289",
"html_url": "https://github.com/huggingface/transformers/pull/37289",
"diff_url": "https://github.com/huggingface/transformers/pull/37289.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37289.patch",
"merged_at": null
}
|
# What does this PR do?
Adding fast processor for OWLv2
Linking: https://github.com/huggingface/transformers/issues/36978
# Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
CC: @yonigozlan
|
{
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37289/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37289/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37285
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37285/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37285/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37285/events
|
https://github.com/huggingface/transformers/pull/37285
| 2,972,708,663
|
PR_kwDOCUB6oc6RbPwF
| 37,285
|
Fixing flex attention for torch=2.6.0
|
{
"login": "SalmanMohammadi",
"id": 25081738,
"node_id": "MDQ6VXNlcjI1MDgxNzM4",
"avatar_url": "https://avatars.githubusercontent.com/u/25081738?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SalmanMohammadi",
"html_url": "https://github.com/SalmanMohammadi",
"followers_url": "https://api.github.com/users/SalmanMohammadi/followers",
"following_url": "https://api.github.com/users/SalmanMohammadi/following{/other_user}",
"gists_url": "https://api.github.com/users/SalmanMohammadi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SalmanMohammadi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SalmanMohammadi/subscriptions",
"organizations_url": "https://api.github.com/users/SalmanMohammadi/orgs",
"repos_url": "https://api.github.com/users/SalmanMohammadi/repos",
"events_url": "https://api.github.com/users/SalmanMohammadi/events{/privacy}",
"received_events_url": "https://api.github.com/users/SalmanMohammadi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 8103865784,
"node_id": "LA_kwDOCUB6oc8AAAAB4wctuA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/for%20patch",
"name": "for patch",
"color": "D93F0B",
"default": false,
"description": "Tag issues / labels that should be included in the next patch"
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T15:21:56
| 2025-04-07T21:04:46
| 2025-04-07T21:04:46
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37285",
"html_url": "https://github.com/huggingface/transformers/pull/37285",
"diff_url": "https://github.com/huggingface/transformers/pull/37285.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37285.patch",
"merged_at": "2025-04-07T21:04:46"
}
|
# What does this PR do?
This PR fixes the issue originally raised in PyTorch core (https://github.com/pytorch/pytorch/issues/146260) affecting flex attention on torch 2.6.0, by setting `mode="max-autotune-no-cudagraphs"`.
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@ArthurZucker
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37285/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37285/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37284
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37284/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37284/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37284/events
|
https://github.com/huggingface/transformers/pull/37284
| 2,972,683,982
|
PR_kwDOCUB6oc6RbKJu
| 37,284
|
fix deepspeed job
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T15:14:30
| 2025-04-08T13:37:20
| 2025-04-08T13:19:33
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37284",
"html_url": "https://github.com/huggingface/transformers/pull/37284",
"diff_url": "https://github.com/huggingface/transformers/pull/37284.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37284.patch",
"merged_at": "2025-04-08T13:19:33"
}
|
# What does this PR do?
The `working_directory` was wrong, and we got
> OCI runtime exec failed: exec failed: unable to start container process: chdir to cwd ("/transformers") set in config.json failed: no such file or directory: unknown
This PR also pins some packages to specific versions.
With these changes, the pytest command runs:
https://github.com/huggingface/transformers/actions/runs/14269422275/job/39998982497
I'll leave updating the deepspeed docker image to you, however: it's quite old.
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37284/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37284/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37283
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37283/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37283/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37283/events
|
https://github.com/huggingface/transformers/pull/37283
| 2,972,626,779
|
PR_kwDOCUB6oc6Ra9mi
| 37,283
|
Apply several ruff SIM rules
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T14:52:30
| 2025-09-03T14:20:04
| 2025-07-29T11:40:34
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37283",
"html_url": "https://github.com/huggingface/transformers/pull/37283",
"diff_url": "https://github.com/huggingface/transformers/pull/37283.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37283.patch",
"merged_at": "2025-07-29T11:40:34"
}
|
# What does this PR do?
The following rules are applied:
`SIM118`: flags key-existence checks written as `key in dict.keys()` instead of the simpler `key in dict`.
`SIM101`: merges duplicate `isinstance()` calls on the same object into a single call with a tuple of types.
`SIM910`: flags `dict.get()` calls that pass `None` as an explicit default value, which is already the default.
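A minimal, self-contained illustration (not taken from the PR diff) of the equivalences these three rules rely on:

```python
d = {"a": 1}

# SIM118: `key in d.keys()` is equivalent to the simpler `key in d`.
assert ("a" in d.keys()) == ("a" in d)

# SIM101: duplicate isinstance() calls collapse into one call with a tuple.
x = 3.0
assert (isinstance(x, int) or isinstance(x, float)) == isinstance(x, (int, float))

# SIM910: passing None as dict.get()'s default is redundant.
assert d.get("b", None) == d.get("b")
```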
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37283/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37283/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37282
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37282/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37282/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37282/events
|
https://github.com/huggingface/transformers/pull/37282
| 2,972,602,687
|
PR_kwDOCUB6oc6Ra4UY
| 37,282
|
:rotating_light: :rotating_light: Setup -> setupclass conversion
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T14:43:23
| 2025-04-08T16:15:44
| 2025-04-08T16:15:38
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37282",
"html_url": "https://github.com/huggingface/transformers/pull/37282",
"diff_url": "https://github.com/huggingface/transformers/pull/37282.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37282.patch",
"merged_at": "2025-04-08T16:15:38"
}
|
This PR tries replacing expensive `setUp()` methods, which run once per test, with `setUpClass()` methods that run only once per test class. It should improve test times and reduce instances of connection failures / DOSing the Hub.
This seems to reduce test runtime a lot for me, but there's a risk that it will cause issues in the slow nightly CI, so we shouldn't merge it right before a major release and we should watch for issues.
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37282/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37282/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37281
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37281/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37281/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37281/events
|
https://github.com/huggingface/transformers/pull/37281
| 2,972,493,517
|
PR_kwDOCUB6oc6RagJ1
| 37,281
|
Fix deepspeed loading
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T14:07:03
| 2025-04-05T21:24:20
| 2025-04-05T15:05:45
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37281",
"html_url": "https://github.com/huggingface/transformers/pull/37281",
"diff_url": "https://github.com/huggingface/transformers/pull/37281.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37281.patch",
"merged_at": "2025-04-05T15:05:45"
}
|
# What does this PR do?
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37281/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37281/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37280
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37280/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37280/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37280/events
|
https://github.com/huggingface/transformers/pull/37280
| 2,972,428,801
|
PR_kwDOCUB6oc6RaR5a
| 37,280
|
Fix llava_onevision tests
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T13:44:08
| 2025-04-04T14:10:40
| 2025-04-04T14:03:38
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37280",
"html_url": "https://github.com/huggingface/transformers/pull/37280",
"diff_url": "https://github.com/huggingface/transformers/pull/37280.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37280.patch",
"merged_at": "2025-04-04T14:03:38"
}
|
The `llava_onevision` tests are loading and saving the processor/tokenizer for every test, which creates a lot of slowdown and hammers the Hub very quickly. This creates the potential for intermittent test failures. This PR refactors the tests to save the processor only once, instead of once per test.
This reduces the llava_onevision processor test runtime on my local machine from 95s to 45s.
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37280/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37280/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37279
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37279/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37279/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37279/events
|
https://github.com/huggingface/transformers/pull/37279
| 2,972,394,204
|
PR_kwDOCUB6oc6RaKa5
| 37,279
|
Use Python 3.9 syntax in examples
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T13:30:10
| 2025-04-07T11:58:16
| 2025-04-07T11:52:21
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37279",
"html_url": "https://github.com/huggingface/transformers/pull/37279",
"diff_url": "https://github.com/huggingface/transformers/pull/37279.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37279.patch",
"merged_at": "2025-04-07T11:52:21"
}
|
# What does this PR do?
A follow-up PR that applies the same Python 3.9 syntax updates to the examples.
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37279/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37279/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37278
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37278/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37278/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37278/events
|
https://github.com/huggingface/transformers/pull/37278
| 2,972,347,344
|
PR_kwDOCUB6oc6RaAN5
| 37,278
|
Allow rocm systems to run these tests
|
{
"login": "ivarflakstad",
"id": 69173633,
"node_id": "MDQ6VXNlcjY5MTczNjMz",
"avatar_url": "https://avatars.githubusercontent.com/u/69173633?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivarflakstad",
"html_url": "https://github.com/ivarflakstad",
"followers_url": "https://api.github.com/users/ivarflakstad/followers",
"following_url": "https://api.github.com/users/ivarflakstad/following{/other_user}",
"gists_url": "https://api.github.com/users/ivarflakstad/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivarflakstad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivarflakstad/subscriptions",
"organizations_url": "https://api.github.com/users/ivarflakstad/orgs",
"repos_url": "https://api.github.com/users/ivarflakstad/repos",
"events_url": "https://api.github.com/users/ivarflakstad/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivarflakstad/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T13:11:44
| 2025-04-10T11:33:03
| 2025-04-10T11:33:02
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37278",
"html_url": "https://github.com/huggingface/transformers/pull/37278",
"diff_url": "https://github.com/huggingface/transformers/pull/37278.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37278.patch",
"merged_at": "2025-04-10T11:33:01"
}
|
# What does this PR do?
Some tests were skipped on AMD GPUs because they only checked `torch.version.cuda`.
Now we use `IS_CUDA_SYSTEM or IS_ROCM_SYSTEM` instead.
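A hedged sketch of the skip-condition change — `should_run`, `cuda_version`, and `hip_version` are illustrative stand-ins for the real `IS_CUDA_SYSTEM`/`IS_ROCM_SYSTEM` constants and the `torch.version` fields they are derived from:

```python
def should_run(cuda_version, hip_version):
    # Before: tests only checked `cuda_version is not None`, so ROCm
    # machines (which report a hip version but no CUDA version) were skipped.
    is_cuda_system = cuda_version is not None
    is_rocm_system = hip_version is not None
    return is_cuda_system or is_rocm_system


# A ROCm box reports a hip version but no CUDA version:
assert should_run(cuda_version=None, hip_version="6.0")      # now runs
assert should_run(cuda_version="12.1", hip_version=None)     # still runs
assert not should_run(cuda_version=None, hip_version=None)   # CPU-only skips
```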
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37278/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37278/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37277
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37277/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37277/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37277/events
|
https://github.com/huggingface/transformers/pull/37277
| 2,972,264,862
|
PR_kwDOCUB6oc6RZuLz
| 37,277
|
byebye torch 2.0
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T12:37:48
| 2025-04-07T13:19:49
| 2025-04-07T13:19:47
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37277",
"html_url": "https://github.com/huggingface/transformers/pull/37277",
"diff_url": "https://github.com/huggingface/transformers/pull/37277.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37277.patch",
"merged_at": "2025-04-07T13:19:47"
}
|
# What does this PR do?
For #37137
Fixes #37238
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37277/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37277/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37276
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37276/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37276/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37276/events
|
https://github.com/huggingface/transformers/pull/37276
| 2,972,096,611
|
PR_kwDOCUB6oc6RZKQ6
| 37,276
|
[Tests] flaky `test_constrained_beam_search_generate_dict_output`
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T11:29:36
| 2025-04-04T12:38:48
| 2025-04-04T12:38:42
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37276",
"html_url": "https://github.com/huggingface/transformers/pull/37276",
"diff_url": "https://github.com/huggingface/transformers/pull/37276.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37276.patch",
"merged_at": "2025-04-04T12:38:42"
}
|
# What does this PR do?
Adds `@is_flaky()` to `test_constrained_beam_search_generate_dict_output`. Flake reason in the diff :)
<img width="1049" alt="Screenshot 2025-04-04 at 12 28 29" src="https://github.com/user-attachments/assets/5d729363-bc68-4712-bafe-2b39cc585752" />
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37276/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37276/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37275
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37275/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37275/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37275/events
|
https://github.com/huggingface/transformers/pull/37275
| 2,971,994,160
|
PR_kwDOCUB6oc6RYzYW
| 37,275
|
[chat-template] Unify tests and clean up 🧼
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T10:47:05
| 2025-04-10T12:42:32
| 2025-04-10T12:42:32
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37275",
"html_url": "https://github.com/huggingface/transformers/pull/37275",
"diff_url": "https://github.com/huggingface/transformers/pull/37275.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37275.patch",
"merged_at": "2025-04-10T12:42:32"
}
|
# What does this PR do?
As per title, we added the video loading logic in a rush for SmolVLM, and the API ended up a bit over-complicated, so the decision was to clean it up later. As the first step of that clean-up, we:
- Remove `sampling_fn` from public kwargs; I don't think anyone uses it, so we will not deprecate it
- Squash the typed-dict manipulations into a smaller helper
- Put video loading for each model in the model's processor. Until the PR adding a separate API for video processors is merged, we'll keep it private under the processor. In the future, each video processor can have its own logic to load videos, following model-specific treatment
- Clean up tests to load only slow image/video processors. As we are moving to fast-only image processors, we need a way to run tests without relying on torch. For torch inputs we already have `test_chat_template_return_dict_torch`
- Unify similar tests under one and parameterize
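The last bullet (collapsing near-duplicate tests into one parameterized body) could look roughly like the following sketch; the test and variable names here are illustrative, not the actual transformers tests:

```python
# Hypothetical sketch: two near-identical per-modality tests collapsed into one
# parameterized body using unittest's subTest (stdlib only; illustrative names).
import unittest

class ChatTemplateTest(unittest.TestCase):
    def test_chat_template_placeholder(self):
        for modality in ("image", "video"):
            with self.subTest(modality=modality):
                # Stand-in check for per-modality template behavior.
                token = f"<{modality}>"
                self.assertEqual(token.strip("<>"), modality)

# Run the suite programmatically so the result is inspectable.
suite = unittest.TestLoader().loadTestsFromTestCase(ChatTemplateTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())
```

With `subTest`, a failure for one modality is reported individually while the remaining parameterizations still run.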
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37275/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37275/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37274
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37274/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37274/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37274/events
|
https://github.com/huggingface/transformers/pull/37274
| 2,971,939,004
|
PR_kwDOCUB6oc6RYnQd
| 37,274
|
pin specific `natten` version in docker file
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T10:20:54
| 2025-04-04T11:47:18
| 2025-04-04T11:47:16
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37274",
"html_url": "https://github.com/huggingface/transformers/pull/37274",
"diff_url": "https://github.com/huggingface/transformers/pull/37274.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37274.patch",
"merged_at": "2025-04-04T11:47:16"
}
|
# What does this PR do?
We suddenly get ``cannot import name 'natten2dav' from 'natten.functional'``, so the `dinat` tests get an `Error` at the very start of pytest collection without producing any output, and the Slack report job then fails due to the missing files, so we don't receive any report.
I will try to avoid such failures in a separate PR, but for now let's just quickly pin `natten`.
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37274/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37274/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37272
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37272/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37272/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37272/events
|
https://github.com/huggingface/transformers/pull/37272
| 2,971,904,261
|
PR_kwDOCUB6oc6RYfgu
| 37,272
|
Fix `utils/check_bad_commit.py`
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T10:07:50
| 2025-04-04T10:33:38
| 2025-04-04T10:18:20
|
COLLABORATOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37272",
"html_url": "https://github.com/huggingface/transformers/pull/37272",
"diff_url": "https://github.com/huggingface/transformers/pull/37272.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37272.patch",
"merged_at": "2025-04-04T10:18:20"
}
|
# What does this PR do?
See the comment in the change.
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37272/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37272/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37271
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37271/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37271/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37271/events
|
https://github.com/huggingface/transformers/pull/37271
| 2,971,900,804
|
PR_kwDOCUB6oc6RYeuV
| 37,271
|
Fix `utils/check_bad_commit.py`
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T10:06:44
| 2025-04-04T10:33:00
| 2025-04-04T10:07:11
|
COLLABORATOR
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37271",
"html_url": "https://github.com/huggingface/transformers/pull/37271",
"diff_url": "https://github.com/huggingface/transformers/pull/37271.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37271.patch",
"merged_at": null
}
|
# What does this PR do?
See the comment in the change.
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37271/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37271/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37270
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37270/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37270/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37270/events
|
https://github.com/huggingface/transformers/pull/37270
| 2,971,504,826
|
PR_kwDOCUB6oc6RXJ7Z
| 37,270
|
36978 | Fast processor, Vivit
|
{
"login": "samrae7",
"id": 4126146,
"node_id": "MDQ6VXNlcjQxMjYxNDY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4126146?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/samrae7",
"html_url": "https://github.com/samrae7",
"followers_url": "https://api.github.com/users/samrae7/followers",
"following_url": "https://api.github.com/users/samrae7/following{/other_user}",
"gists_url": "https://api.github.com/users/samrae7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/samrae7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/samrae7/subscriptions",
"organizations_url": "https://api.github.com/users/samrae7/orgs",
"repos_url": "https://api.github.com/users/samrae7/repos",
"events_url": "https://api.github.com/users/samrae7/events{/privacy}",
"received_events_url": "https://api.github.com/users/samrae7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T07:21:20
| 2025-04-07T09:05:43
| 2025-04-07T09:05:43
|
CONTRIBUTOR
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37270",
"html_url": "https://github.com/huggingface/transformers/pull/37270",
"diff_url": "https://github.com/huggingface/transformers/pull/37270.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37270.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "samrae7",
"id": 4126146,
"node_id": "MDQ6VXNlcjQxMjYxNDY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4126146?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/samrae7",
"html_url": "https://github.com/samrae7",
"followers_url": "https://api.github.com/users/samrae7/followers",
"following_url": "https://api.github.com/users/samrae7/following{/other_user}",
"gists_url": "https://api.github.com/users/samrae7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/samrae7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/samrae7/subscriptions",
"organizations_url": "https://api.github.com/users/samrae7/orgs",
"repos_url": "https://api.github.com/users/samrae7/repos",
"events_url": "https://api.github.com/users/samrae7/events{/privacy}",
"received_events_url": "https://api.github.com/users/samrae7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37270/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37270/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37269
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37269/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37269/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37269/events
|
https://github.com/huggingface/transformers/issues/37269
| 2,971,497,966
|
I_kwDOCUB6oc6xHXXu
| 37,269
|
KV cache size is not updating?
|
{
"login": "tianhaoz95",
"id": 16887772,
"node_id": "MDQ6VXNlcjE2ODg3Nzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/16887772?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tianhaoz95",
"html_url": "https://github.com/tianhaoz95",
"followers_url": "https://api.github.com/users/tianhaoz95/followers",
"following_url": "https://api.github.com/users/tianhaoz95/following{/other_user}",
"gists_url": "https://api.github.com/users/tianhaoz95/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tianhaoz95/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tianhaoz95/subscriptions",
"organizations_url": "https://api.github.com/users/tianhaoz95/orgs",
"repos_url": "https://api.github.com/users/tianhaoz95/repos",
"events_url": "https://api.github.com/users/tianhaoz95/events{/privacy}",
"received_events_url": "https://api.github.com/users/tianhaoz95/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T07:18:16
| 2025-04-05T05:29:40
| 2025-04-05T05:29:39
|
NONE
| null | null | null | null |
### System Info
- `transformers` version: 4.50.0.dev0
- Platform: macOS-15.3.2-arm64-arm-64bit
- Python version: 3.12.9
- Huggingface_hub version: 0.29.3
- Safetensors version: 0.5.3
- Accelerate version: 1.5.2
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
### Who can help?
I'm experimenting with using kv cache explicitly (this is for prefill decode disaggregation) with the following code:
```python
def setup_model(device_map="cpu"):
tokenizer = AutoTokenizer.from_pretrained("google/gemma-3-1b-it")
model = AutoModelForCausalLM.from_pretrained(
"google/gemma-3-1b-it",
torch_dtype=torch.float16,
device_map=device_map,
)
return model, tokenizer
model, tokenizer = setup_model(device_map="cpu")
prompt = "Write a short poem about coding:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
outputs = model(
input_ids=inputs.input_ids,
use_cache=True,
return_dict=True
)
print(f"Prefill kv cache, kv cache size: {len(outputs.past_key_values.key_cache)}, and shape: {outputs.past_key_values.key_cache[0].shape}")
past_key_values = outputs.past_key_values
generated_tokens = []
next_token_logits = outputs.logits[:, -1, :]
for _ in range(50):
with torch.no_grad():
next_token_id = torch.argmax(next_token_logits, dim=-1)
if next_token_id.item() == tokenizer.eos_token_id:
break
generated_tokens.append(next_token_id.item())
outputs = model(
input_ids=next_token_id.unsqueeze(0),
use_cache=True,
past_key_values=past_key_values,
return_dict=True
)
print(f"Update kv cache, kv cache size: {len(outputs.past_key_values.key_cache)}, and shape: {outputs.past_key_values.key_cache[0].shape}")
past_key_values = outputs.past_key_values
next_token_logits = outputs.logits[:, -1, :]
all_tokens = torch.cat([inputs.input_ids[0], torch.tensor(generated_tokens)])
generated_text = tokenizer.decode(all_tokens)
print(f"\nGenerated text:\n{generated_text}")
```
I found the output is:
```
Prefill kv cache, kv cache size: 26, and shape: torch.Size([1, 1, 8, 256])
Traceback (most recent call last):
File "/Users/tianhaoz/Downloads/medium/pd_disaggregation/inspect_kv_cache.py", line 79, in <module>
main()
File "/Users/tianhaoz/Downloads/medium/pd_disaggregation/inspect_kv_cache.py", line 63, in main
outputs = model(
^^^^^^
File "/opt/anaconda3/envs/llm/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/llm/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/llm/lib/python3.12/site-packages/transformers/utils/deprecation.py", line 172, in wrapped_func
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/llm/lib/python3.12/site-packages/transformers/models/gemma3/modeling_gemma3.py", line 976, in forward
outputs = self.model(
^^^^^^^^^^^
File "/opt/anaconda3/envs/llm/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/llm/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/llm/lib/python3.12/site-packages/transformers/models/gemma3/modeling_gemma3.py", line 754, in forward
layer_outputs = decoder_layer(
^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/llm/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/llm/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/llm/lib/python3.12/site-packages/transformers/models/gemma3/modeling_gemma3.py", line 443, in forward
hidden_states, self_attn_weights = self.self_attn(
^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/llm/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/llm/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/llm/lib/python3.12/site-packages/transformers/models/gemma3/modeling_gemma3.py", line 347, in forward
key_states, value_states = past_key_value.update(key_states, value_states, self.layer_idx, cache_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/llm/lib/python3.12/site-packages/transformers/cache_utils.py", line 1721, in update
return update_fn(
^^^^^^^^^^
File "/opt/anaconda3/envs/llm/lib/python3.12/site-packages/transformers/cache_utils.py", line 1687, in _static_update
k_out[:, :, cache_position] = key_states
~~~~~^^^^^^^^^^^^^^^^^^^^^^
IndexError: index 8 is out of bounds for dimension 0 with size 8
```
Does this mean that during the decode steps the sequence length of the KV cache is not updating? It looks like it stays at `torch.Size([1, 1, 8, 256])` forever after prefill. Am I doing something wrong?
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Run the example code in the last section.
### Expected behavior
It should generate result and print updated kv cache size each time.
|
{
"login": "tianhaoz95",
"id": 16887772,
"node_id": "MDQ6VXNlcjE2ODg3Nzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/16887772?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tianhaoz95",
"html_url": "https://github.com/tianhaoz95",
"followers_url": "https://api.github.com/users/tianhaoz95/followers",
"following_url": "https://api.github.com/users/tianhaoz95/following{/other_user}",
"gists_url": "https://api.github.com/users/tianhaoz95/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tianhaoz95/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tianhaoz95/subscriptions",
"organizations_url": "https://api.github.com/users/tianhaoz95/orgs",
"repos_url": "https://api.github.com/users/tianhaoz95/repos",
"events_url": "https://api.github.com/users/tianhaoz95/events{/privacy}",
"received_events_url": "https://api.github.com/users/tianhaoz95/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37269/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37269/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37268
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37268/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37268/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37268/events
|
https://github.com/huggingface/transformers/pull/37268
| 2,971,449,636
|
PR_kwDOCUB6oc6RW9w8
| 37,268
|
[qwen-vl] Standardize config
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T06:55:43
| 2025-04-17T07:38:13
| 2025-04-17T07:38:13
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37268",
"html_url": "https://github.com/huggingface/transformers/pull/37268",
"diff_url": "https://github.com/huggingface/transformers/pull/37268.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37268.patch",
"merged_at": "2025-04-17T07:38:12"
}
|
# What does this PR do?
BC is kept and the models can be loaded as before. All attributes are still available through the general config (`config.vocab_size`), but in model code we now use `config.text_config`.
Separating the text config out from the general config allows us to support multimodality and text as separate entities, with their own base classes and configs. Related to #37033
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37268/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37268/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37267
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37267/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37267/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37267/events
|
https://github.com/huggingface/transformers/pull/37267
| 2,971,378,603
|
PR_kwDOCUB6oc6RWuU5
| 37,267
|
Add Magma Agentic Model from Microsoft
|
{
"login": "jwyang",
"id": 3894247,
"node_id": "MDQ6VXNlcjM4OTQyNDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3894247?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jwyang",
"html_url": "https://github.com/jwyang",
"followers_url": "https://api.github.com/users/jwyang/followers",
"following_url": "https://api.github.com/users/jwyang/following{/other_user}",
"gists_url": "https://api.github.com/users/jwyang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jwyang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jwyang/subscriptions",
"organizations_url": "https://api.github.com/users/jwyang/orgs",
"repos_url": "https://api.github.com/users/jwyang/repos",
"events_url": "https://api.github.com/users/jwyang/events{/privacy}",
"received_events_url": "https://api.github.com/users/jwyang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-04T06:15:41
| 2025-05-13T19:06:50
| null |
NONE
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37267",
"html_url": "https://github.com/huggingface/transformers/pull/37267",
"diff_url": "https://github.com/huggingface/transformers/pull/37267.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37267.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Add [Microsoft Magma model](https://github.com/microsoft/Magma) to the repository
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37267/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37267/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37266
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37266/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37266/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37266/events
|
https://github.com/huggingface/transformers/pull/37266
| 2,971,121,078
|
PR_kwDOCUB6oc6RV3Yx
| 37,266
|
Enable RUF013 to enforce optional typing
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-04T02:32:57
| 2025-07-13T09:28:48
| 2025-05-08T10:39:57
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37266",
"html_url": "https://github.com/huggingface/transformers/pull/37266",
"diff_url": "https://github.com/huggingface/transformers/pull/37266.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37266.patch",
"merged_at": "2025-05-08T10:39:56"
}
|
# What does this PR do?
This PR adds `Optional` to the remaining type annotations and adds the `RUF013` rule to enforce it.
The Ruff rule `RUF013` flags implicit `Optional` in type annotations, i.e. a parameter whose default value is `None` but whose annotation does not allow `None`.
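For illustration, a minimal sketch of what the rule flags (the function names here are placeholders, not code from this PR):

```python
# RUF013 flags "implicit Optional": a parameter annotated with a non-Optional
# type but defaulting to None. Both definitions run fine at runtime, since
# Python does not enforce annotations; the rule is purely a static check.
from typing import Optional


def implicit(x: int = None):  # RUF013 would flag this annotation
    return x


def explicit(x: Optional[int] = None):  # explicit Optional satisfies RUF013
    return x


print(explicit(None), explicit(3))
```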
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37266/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37266/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37265
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37265/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37265/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37265/events
|
https://github.com/huggingface/transformers/issues/37265
| 2,970,736,285
|
I_kwDOCUB6oc6xEdad
| 37,265
|
Large accuracy difference on the same data: evaluating during training vs evaluating standalone from the saved fine-tuned model
|
{
"login": "irmathebest",
"id": 96667825,
"node_id": "U_kgDOBcMIsQ",
"avatar_url": "https://avatars.githubusercontent.com/u/96667825?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/irmathebest",
"html_url": "https://github.com/irmathebest",
"followers_url": "https://api.github.com/users/irmathebest/followers",
"following_url": "https://api.github.com/users/irmathebest/following{/other_user}",
"gists_url": "https://api.github.com/users/irmathebest/gists{/gist_id}",
"starred_url": "https://api.github.com/users/irmathebest/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/irmathebest/subscriptions",
"organizations_url": "https://api.github.com/users/irmathebest/orgs",
"repos_url": "https://api.github.com/users/irmathebest/repos",
"events_url": "https://api.github.com/users/irmathebest/events{/privacy}",
"received_events_url": "https://api.github.com/users/irmathebest/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T21:36:41
| 2025-05-12T08:02:29
| 2025-05-12T08:02:29
|
NONE
| null | null | null | null |
Hi team,
I have been stuck on this problem for a whole week and still cannot figure out why.
Env: Python 3.8, transformers 4.28
I am fine-tuning XLM-RoBERTa Base for a multi-class classification task.
However, when I run `trainer.evaluate()` during training, it reports 68% accuracy, while in the standalone evaluation, which reads the base model and then makes predictions, the accuracy drops to 30%. Is there a reason why this happens, or is it a bug?
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37265/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37265/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37264
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37264/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37264/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37264/events
|
https://github.com/huggingface/transformers/pull/37264
| 2,970,669,476
|
PR_kwDOCUB6oc6RUVPe
| 37,264
|
clarify error message to ensure min 28x28 image supplied
|
{
"login": "rymc",
"id": 3084015,
"node_id": "MDQ6VXNlcjMwODQwMTU=",
"avatar_url": "https://avatars.githubusercontent.com/u/3084015?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rymc",
"html_url": "https://github.com/rymc",
"followers_url": "https://api.github.com/users/rymc/followers",
"following_url": "https://api.github.com/users/rymc/following{/other_user}",
"gists_url": "https://api.github.com/users/rymc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rymc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rymc/subscriptions",
"organizations_url": "https://api.github.com/users/rymc/orgs",
"repos_url": "https://api.github.com/users/rymc/repos",
"events_url": "https://api.github.com/users/rymc/events{/privacy}",
"received_events_url": "https://api.github.com/users/rymc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 7570656740,
"node_id": "LA_kwDOCUB6oc8AAAABwz8N5A",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Processing",
"name": "Processing",
"color": "1E17DF",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T20:54:35
| 2025-04-04T11:54:01
| 2025-04-04T11:53:39
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37264",
"html_url": "https://github.com/huggingface/transformers/pull/37264",
"diff_url": "https://github.com/huggingface/transformers/pull/37264.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37264.patch",
"merged_at": "2025-04-04T11:53:39"
}
|
This PR clarifies that Qwen 2.5 VL requires images of at least 28x28. The current error message says OR rather than AND, which can confuse users.
|
{
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37264/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37264/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37263
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37263/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37263/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37263/events
|
https://github.com/huggingface/transformers/issues/37263
| 2,970,537,847
|
I_kwDOCUB6oc6xDs93
| 37,263
|
Loading HQQ quantized models is broken since #35926
|
{
"login": "mobicham",
"id": 37179323,
"node_id": "MDQ6VXNlcjM3MTc5MzIz",
"avatar_url": "https://avatars.githubusercontent.com/u/37179323?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mobicham",
"html_url": "https://github.com/mobicham",
"followers_url": "https://api.github.com/users/mobicham/followers",
"following_url": "https://api.github.com/users/mobicham/following{/other_user}",
"gists_url": "https://api.github.com/users/mobicham/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mobicham/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mobicham/subscriptions",
"organizations_url": "https://api.github.com/users/mobicham/orgs",
"repos_url": "https://api.github.com/users/mobicham/repos",
"events_url": "https://api.github.com/users/mobicham/events{/privacy}",
"received_events_url": "https://api.github.com/users/mobicham/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T19:54:55
| 2025-06-15T08:03:19
| 2025-06-15T08:03:19
|
CONTRIBUTOR
| null | null | null | null |
### System Info
- `transformers` version: 4.51.0.dev0
- Platform: Linux-5.4.0-208-generic-x86_64-with-glibc2.35
- Python version: 3.11.10
- Huggingface_hub version: 0.30.1
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.5.1+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
Loading HQQ models has been broken since https://github.com/huggingface/transformers/pull/35926
I'm not sure what changed; probably something in `modeling_utils`.
@SunMarc @ArthurZucker
### Reproduction
```Python
import torch
compute_dtype = torch.bfloat16
model_id = 'mobiuslabsgmbh/gemma-3-12b-it_4bitgs64_bfp16_hqq_hf'
#Load model
from transformers import Gemma3ForConditionalGeneration, AutoProcessor
processor = AutoProcessor.from_pretrained(model_id)
model = Gemma3ForConditionalGeneration.from_pretrained(
model_id,
torch_dtype=compute_dtype,
attn_implementation="sdpa",
device_map="cuda",
)
```
```
AttributeError: `language_model.model.layers.46.self_attn.o_proj.quant_scale` is neither a parameter nor a buffer.
```
### Expected behavior
HQQ quantized models were loading fine before #35926
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37263/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37263/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37262
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37262/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37262/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37262/events
|
https://github.com/huggingface/transformers/issues/37262
| 2,970,417,352
|
I_kwDOCUB6oc6xDPjI
| 37,262
|
Add support for higher jax and flax version
|
{
"login": "rxng8",
"id": 60036798,
"node_id": "MDQ6VXNlcjYwMDM2Nzk4",
"avatar_url": "https://avatars.githubusercontent.com/u/60036798?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rxng8",
"html_url": "https://github.com/rxng8",
"followers_url": "https://api.github.com/users/rxng8/followers",
"following_url": "https://api.github.com/users/rxng8/following{/other_user}",
"gists_url": "https://api.github.com/users/rxng8/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rxng8/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rxng8/subscriptions",
"organizations_url": "https://api.github.com/users/rxng8/orgs",
"repos_url": "https://api.github.com/users/rxng8/repos",
"events_url": "https://api.github.com/users/rxng8/events{/privacy}",
"received_events_url": "https://api.github.com/users/rxng8/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
},
{
"id": 2934977194,
"node_id": "MDU6TGFiZWwyOTM0OTc3MTk0",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Flax",
"name": "Flax",
"color": "4862AD",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T18:51:23
| 2025-06-16T13:10:09
| 2025-06-16T13:10:08
|
NONE
| null | null | null | null |
### Feature request
Currently, transformers does not support JAX versions greater than 0.4.13; the library APIs have changed and are no longer compatible.
### Motivation
Currently, transformers does not support JAX versions greater than 0.4.13; the library APIs have changed and are no longer compatible.
### Your contribution
N/A
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37262/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37262/timeline
| null |
not_planned
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37261
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37261/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37261/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37261/events
|
https://github.com/huggingface/transformers/pull/37261
| 2,970,407,467
|
PR_kwDOCUB6oc6RTa4k
| 37,261
|
Updated T5 model card with standardized format
|
{
"login": "ShararehY",
"id": 87082275,
"node_id": "MDQ6VXNlcjg3MDgyMjc1",
"avatar_url": "https://avatars.githubusercontent.com/u/87082275?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ShararehY",
"html_url": "https://github.com/ShararehY",
"followers_url": "https://api.github.com/users/ShararehY/followers",
"following_url": "https://api.github.com/users/ShararehY/following{/other_user}",
"gists_url": "https://api.github.com/users/ShararehY/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ShararehY/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ShararehY/subscriptions",
"organizations_url": "https://api.github.com/users/ShararehY/orgs",
"repos_url": "https://api.github.com/users/ShararehY/repos",
"events_url": "https://api.github.com/users/ShararehY/events{/privacy}",
"received_events_url": "https://api.github.com/users/ShararehY/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T18:46:18
| 2025-04-04T22:23:09
| 2025-04-04T22:23:09
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37261",
"html_url": "https://github.com/huggingface/transformers/pull/37261",
"diff_url": "https://github.com/huggingface/transformers/pull/37261.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37261.patch",
"merged_at": "2025-04-04T22:23:09"
}
|
# What does this PR do?
Updated T5 model card with standardized format
Fixes #36979
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@stevhliu
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37261/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37261/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37260
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37260/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37260/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37260/events
|
https://github.com/huggingface/transformers/pull/37260
| 2,970,355,648
|
PR_kwDOCUB6oc6RTPbP
| 37,260
|
fix: use mtime by default in Trainer._rotate_checkpoints with automatic fallback
|
{
"login": "Jerry-Terrasse",
"id": 37892712,
"node_id": "MDQ6VXNlcjM3ODkyNzEy",
"avatar_url": "https://avatars.githubusercontent.com/u/37892712?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Jerry-Terrasse",
"html_url": "https://github.com/Jerry-Terrasse",
"followers_url": "https://api.github.com/users/Jerry-Terrasse/followers",
"following_url": "https://api.github.com/users/Jerry-Terrasse/following{/other_user}",
"gists_url": "https://api.github.com/users/Jerry-Terrasse/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Jerry-Terrasse/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Jerry-Terrasse/subscriptions",
"organizations_url": "https://api.github.com/users/Jerry-Terrasse/orgs",
"repos_url": "https://api.github.com/users/Jerry-Terrasse/repos",
"events_url": "https://api.github.com/users/Jerry-Terrasse/events{/privacy}",
"received_events_url": "https://api.github.com/users/Jerry-Terrasse/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T18:22:59
| 2025-04-10T15:42:07
| 2025-04-10T15:42:06
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37260",
"html_url": "https://github.com/huggingface/transformers/pull/37260",
"diff_url": "https://github.com/huggingface/transformers/pull/37260.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37260.patch",
"merged_at": "2025-04-10T15:42:06"
}
|
## What does this PR do?
This PR fixes an issue with checkpoint rotation in `transformers.Trainer`. When training with checkpoints saved every 100 steps and a maximum limit of 3 checkpoints, starting a new training session in the same output directory can cause a newly created checkpoint (e.g., `checkpoint-100`) to be mistakenly identified as the oldest and immediately deleted.
### Detailed Problem Description
- **Scenario**: Training with `Trainer` using `save_steps=100` and `save_total_limit=3`.
- **Issue**: After a training run with 700 steps, checkpoints `checkpoint-500`, `checkpoint-600`, and `checkpoint-700` are produced. When starting a new training run in the same output directory, the new checkpoint (`checkpoint-100`) is mistakenly identified as the **oldest** checkpoint due to its lower numerical value and is **immediately deleted**.
- **Cause**: The current checkpoint rotation mechanism relies solely on the numerical ordering extracted from the checkpoint directory names, which ignores the actual creation times of the checkpoints.
### Proposed Solution
- **Default to mtime**: Modify `_rotate_checkpoints` to use file modification time (mtime) by default for ordering checkpoints. This approach better reflects the actual creation order.
- **Automatic Fallback**: If the mtime values appear unreliable (for example, if they are identical or show insufficient differences on non-POSIX filesystems such as FUSE FS to HTTP blob storage), the system will automatically fall back to the original numerical ordering method.
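The default-with-fallback ordering described above can be sketched as follows (a minimal sketch; `sort_checkpoints` and its signature are illustrative stand-ins, not the actual `_rotate_checkpoints` implementation):

```python
import os
import re

def sort_checkpoints(output_dir, prefix="checkpoint"):
    """Order checkpoint dirs by mtime, falling back to the step number
    in the directory name when the mtimes carry no ordering information
    (e.g. all identical, as on some non-POSIX filesystems)."""
    pattern = re.compile(rf"^{prefix}-(\d+)$")
    paths = [
        os.path.join(output_dir, d)
        for d in os.listdir(output_dir)
        if pattern.match(d) and os.path.isdir(os.path.join(output_dir, d))
    ]
    mtimes = {os.path.getmtime(p) for p in paths}
    if len(paths) > 1 and len(mtimes) <= 1:
        # Unreliable mtimes: fall back to numeric ordering of the step suffix.
        return sorted(
            paths,
            key=lambda p: int(pattern.match(os.path.basename(p)).group(1)),
        )
    # Oldest first, so the head of the list is the next deletion candidate.
    return sorted(paths, key=os.path.getmtime)
```

With mtime ordering, a fresh `checkpoint-100` written after `checkpoint-700` sorts as the newest entry and survives rotation instead of being deleted.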
### Related Issue
This PR is related to https://github.com/huggingface/transformers/issues/26961 and https://github.com/huggingface/transformers/pull/28862. In that issue, users observed that using `use_mtime=True` sometimes resulted in the unintended deletion of newer checkpoints. Although setting `use_mtime=False` could avoid these issues on certain filesystems, our solution defaults to `use_mtime=True` to accurately reflect checkpoint creation order, with an automatic fallback mechanism to ensure robustness when mtime is unreliable.
---
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request), Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the [documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and [here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
|
{
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37260/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37260/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37259
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37259/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37259/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37259/events
|
https://github.com/huggingface/transformers/pull/37259
| 2,970,330,192
|
PR_kwDOCUB6oc6RTJ4H
| 37,259
|
Bye bye env vars, keep everything as configs
|
{
"login": "zach-huggingface",
"id": 205341649,
"node_id": "U_kgDODD1D0Q",
"avatar_url": "https://avatars.githubusercontent.com/u/205341649?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zach-huggingface",
"html_url": "https://github.com/zach-huggingface",
"followers_url": "https://api.github.com/users/zach-huggingface/followers",
"following_url": "https://api.github.com/users/zach-huggingface/following{/other_user}",
"gists_url": "https://api.github.com/users/zach-huggingface/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zach-huggingface/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zach-huggingface/subscriptions",
"organizations_url": "https://api.github.com/users/zach-huggingface/orgs",
"repos_url": "https://api.github.com/users/zach-huggingface/repos",
"events_url": "https://api.github.com/users/zach-huggingface/events{/privacy}",
"received_events_url": "https://api.github.com/users/zach-huggingface/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-03T18:09:27
| 2025-04-24T12:55:22
| null |
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37259",
"html_url": "https://github.com/huggingface/transformers/pull/37259",
"diff_url": "https://github.com/huggingface/transformers/pull/37259.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37259.patch",
"merged_at": null
}
|
# What does this PR do?
As @BenjaminBossan found (https://github.com/huggingface/accelerate/pull/3252), `TrainingArguments` sets environment variables automatically when using Accelerate, because previously things wouldn't work otherwise. Nowadays the only env variables required for things to run smoothly are the ones for model init (fsdp cpu eff ram).
This PR does a few things:
1. We completely remove the need for environmental variables, creating the proper configs (dynamo relies on https://github.com/huggingface/accelerate/pull/3251)
2. I've refactored how `mixed_precision` gets set, to simplify the training arguments and combine 7 args into 2.
3. Removes old references/updates the logic in `Trainer` to reflect the choices
Full version of https://github.com/huggingface/transformers/pull/34886
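As a rough illustration of point 2 (illustrative only; `PrecisionConfig` and `resolve_mixed_precision` are made-up names, not the PR's actual API), collapsing several boolean precision flags into one explicit config value passed to Accelerate, instead of exporting an env var, might look like:

```python
from dataclasses import dataclass

@dataclass
class PrecisionConfig:
    # One explicit knob instead of several boolean flags plus an
    # ACCELERATE-style environment variable.
    mixed_precision: str = "no"  # "no", "fp16", or "bf16"

def resolve_mixed_precision(fp16: bool = False, bf16: bool = False) -> PrecisionConfig:
    """Map the legacy boolean flags onto a single config value that is
    handed to Accelerate directly rather than via os.environ."""
    if fp16 and bf16:
        raise ValueError("fp16 and bf16 are mutually exclusive")
    if fp16:
        return PrecisionConfig("fp16")
    if bf16:
        return PrecisionConfig("bf16")
    return PrecisionConfig()
```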
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@SunMarc @Rocketknight1
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37259/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37259/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37258
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37258/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37258/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37258/events
|
https://github.com/huggingface/transformers/pull/37258
| 2,970,203,816
|
PR_kwDOCUB6oc6RSufm
| 37,258
|
[qwen-vl] fix image processor
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T17:03:58
| 2025-04-03T17:48:56
| 2025-04-03T17:48:56
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37258",
"html_url": "https://github.com/huggingface/transformers/pull/37258",
"diff_url": "https://github.com/huggingface/transformers/pull/37258.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37258.patch",
"merged_at": "2025-04-03T17:48:56"
}
|
# What does this PR do?
The example in the docs for configuring image size no longer works after the last refactor (https://huggingface.co/docs/transformers/en/model_doc/qwen2_vl#image-resolution-trade-off). This PR ensures backwards compatibility for qwen-based image processors.
Found when fixing Qwen-Omni, which uses this processor class
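For context, the resolution trade-off the docs example configures boils down to rounding each side to a patch-size multiple while keeping the total pixel count inside a `[min_pixels, max_pixels]` budget. A simplified sketch of that logic (not the actual processor code):

```python
import math

def smart_resize(height, width, factor=28, min_pixels=56 * 56, max_pixels=14 * 14 * 4 * 1280):
    """Round each side to a multiple of `factor`, then rescale so the
    pixel count stays within [min_pixels, max_pixels].
    (Assumption: simplified from the real qwen image processor.)"""
    h_bar = round(height / factor) * factor
    w_bar = round(width / factor) * factor
    if h_bar * w_bar > max_pixels:
        # Too large: shrink both sides by the same ratio.
        beta = math.sqrt((height * width) / max_pixels)
        h_bar = math.floor(height / beta / factor) * factor
        w_bar = math.floor(width / beta / factor) * factor
    elif h_bar * w_bar < min_pixels:
        # Too small: grow both sides by the same ratio.
        beta = math.sqrt(min_pixels / (height * width))
        h_bar = math.ceil(height * beta / factor) * factor
        w_bar = math.ceil(width * beta / factor) * factor
    return h_bar, w_bar
```

Lowering `max_pixels` trades visual detail for fewer image tokens, which is exactly the knob the broken docs example exposes.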
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37258/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37258/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37256
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37256/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37256/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37256/events
|
https://github.com/huggingface/transformers/pull/37256
| 2,970,138,874
|
PR_kwDOCUB6oc6RSgfF
| 37,256
|
mobilebert model card update
|
{
"login": "Reshan123",
"id": 39221699,
"node_id": "MDQ6VXNlcjM5MjIxNjk5",
"avatar_url": "https://avatars.githubusercontent.com/u/39221699?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Reshan123",
"html_url": "https://github.com/Reshan123",
"followers_url": "https://api.github.com/users/Reshan123/followers",
"following_url": "https://api.github.com/users/Reshan123/following{/other_user}",
"gists_url": "https://api.github.com/users/Reshan123/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Reshan123/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Reshan123/subscriptions",
"organizations_url": "https://api.github.com/users/Reshan123/orgs",
"repos_url": "https://api.github.com/users/Reshan123/repos",
"events_url": "https://api.github.com/users/Reshan123/events{/privacy}",
"received_events_url": "https://api.github.com/users/Reshan123/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T16:31:15
| 2025-04-04T21:28:35
| 2025-04-04T21:28:35
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37256",
"html_url": "https://github.com/huggingface/transformers/pull/37256",
"diff_url": "https://github.com/huggingface/transformers/pull/37256.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37256.patch",
"merged_at": "2025-04-04T21:28:35"
}
|
# What does this PR do?
As suggested in this issue - [Model cards #36979](https://github.com/huggingface/transformers/issues/36979) - this PR updates the documentation of the MobileBERT model, which will now be aligned with the standardized format for all the docs.
## Who can review?
@stevhliu, please let me know if any changes are needed
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37256/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37256/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37255
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37255/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37255/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37255/events
|
https://github.com/huggingface/transformers/pull/37255
| 2,970,132,963
|
PR_kwDOCUB6oc6RSfMl
| 37,255
|
Update OpenAI GPT model card
|
{
"login": "linnettuscano",
"id": 71842687,
"node_id": "MDQ6VXNlcjcxODQyNjg3",
"avatar_url": "https://avatars.githubusercontent.com/u/71842687?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/linnettuscano",
"html_url": "https://github.com/linnettuscano",
"followers_url": "https://api.github.com/users/linnettuscano/followers",
"following_url": "https://api.github.com/users/linnettuscano/following{/other_user}",
"gists_url": "https://api.github.com/users/linnettuscano/gists{/gist_id}",
"starred_url": "https://api.github.com/users/linnettuscano/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/linnettuscano/subscriptions",
"organizations_url": "https://api.github.com/users/linnettuscano/orgs",
"repos_url": "https://api.github.com/users/linnettuscano/repos",
"events_url": "https://api.github.com/users/linnettuscano/events{/privacy}",
"received_events_url": "https://api.github.com/users/linnettuscano/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T16:28:13
| 2025-04-04T22:25:16
| 2025-04-04T22:25:16
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37255",
"html_url": "https://github.com/huggingface/transformers/pull/37255",
"diff_url": "https://github.com/huggingface/transformers/pull/37255.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37255.patch",
"merged_at": "2025-04-04T22:25:16"
}
|
# What does this PR do?
This PR updates the OpenAI GPT model card documentation to better reflect recent improvements and clarifications.
## Before submitting
- [✔️ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ✔️] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [✔️] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
Documentation: @stevhliu
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37255/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37255/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37254
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37254/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37254/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37254/events
|
https://github.com/huggingface/transformers/pull/37254
| 2,970,109,184
|
PR_kwDOCUB6oc6RSZzr
| 37,254
|
Expose blip2qformer
|
{
"login": "alex-jw-brooks",
"id": 10740300,
"node_id": "MDQ6VXNlcjEwNzQwMzAw",
"avatar_url": "https://avatars.githubusercontent.com/u/10740300?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alex-jw-brooks",
"html_url": "https://github.com/alex-jw-brooks",
"followers_url": "https://api.github.com/users/alex-jw-brooks/followers",
"following_url": "https://api.github.com/users/alex-jw-brooks/following{/other_user}",
"gists_url": "https://api.github.com/users/alex-jw-brooks/gists{/gist_id}",
"starred_url": "https://api.github.com/users/alex-jw-brooks/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alex-jw-brooks/subscriptions",
"organizations_url": "https://api.github.com/users/alex-jw-brooks/orgs",
"repos_url": "https://api.github.com/users/alex-jw-brooks/repos",
"events_url": "https://api.github.com/users/alex-jw-brooks/events{/privacy}",
"received_events_url": "https://api.github.com/users/alex-jw-brooks/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T16:19:32
| 2025-04-08T10:04:33
| 2025-04-08T10:04:33
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37254",
"html_url": "https://github.com/huggingface/transformers/pull/37254",
"diff_url": "https://github.com/huggingface/transformers/pull/37254.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37254.patch",
"merged_at": "2025-04-08T10:04:33"
}
|
This PR exposes Blip2QFormer so that it can be created via `AutoModel.from_config`, which is needed for making the projector component in granite speech generic (discussed in [this PR](https://github.com/huggingface/transformers/pull/36801)).
Blip2QFormer is most of the way there already - this PR just:
- Handles AutoModel registration
- Adds docstrings / more information on forward inputs
Note that Blip2QFormer is already tested pretty extensively through the blip2 tests, and will also be tested through granite speech tests.
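The registration mechanics boil down to a config-class → model-class mapping; a toy stand-in (illustrative only, not the transformers implementation) showing why `AutoModel.from_config` needs an entry for the model:

```python
class AutoModelRegistry:
    """Toy registry mimicking the AutoModel pattern: from_config looks up
    the model class keyed by the config's type."""
    _mapping: dict = {}

    @classmethod
    def register(cls, config_class, model_class):
        cls._mapping[config_class] = model_class

    @classmethod
    def from_config(cls, config):
        # Without a registered entry this lookup fails, which is the gap
        # the PR closes for Blip2QFormer.
        model_class = cls._mapping[type(config)]
        return model_class(config)

class Blip2QFormerConfig:
    def __init__(self, hidden_size=768):
        self.hidden_size = hidden_size

class Blip2QFormerModel:
    def __init__(self, config):
        self.config = config

AutoModelRegistry.register(Blip2QFormerConfig, Blip2QFormerModel)
```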
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
@ArthurZucker can you please take a look?
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37254/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37254/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37253
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37253/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37253/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37253/events
|
https://github.com/huggingface/transformers/pull/37253
| 2,970,030,475
|
PR_kwDOCUB6oc6RSIjC
| 37,253
|
Update falcon mamba card
|
{
"login": "ricalanis",
"id": 3820751,
"node_id": "MDQ6VXNlcjM4MjA3NTE=",
"avatar_url": "https://avatars.githubusercontent.com/u/3820751?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ricalanis",
"html_url": "https://github.com/ricalanis",
"followers_url": "https://api.github.com/users/ricalanis/followers",
"following_url": "https://api.github.com/users/ricalanis/following{/other_user}",
"gists_url": "https://api.github.com/users/ricalanis/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ricalanis/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ricalanis/subscriptions",
"organizations_url": "https://api.github.com/users/ricalanis/orgs",
"repos_url": "https://api.github.com/users/ricalanis/repos",
"events_url": "https://api.github.com/users/ricalanis/events{/privacy}",
"received_events_url": "https://api.github.com/users/ricalanis/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T15:45:55
| 2025-04-07T17:12:44
| 2025-04-07T17:12:44
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37253",
"html_url": "https://github.com/huggingface/transformers/pull/37253",
"diff_url": "https://github.com/huggingface/transformers/pull/37253.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37253.patch",
"merged_at": "2025-04-07T17:12:44"
}
|
#36979
* Updated the FalconMamba model card
* Second contrib, thank you again for your patience <3
## Before submitting
- [X] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [X] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [X] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests? N/A
|
{
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37253/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37253/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37252
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37252/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37252/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37252/events
|
https://github.com/huggingface/transformers/pull/37252
| 2,969,951,810
|
PR_kwDOCUB6oc6RR3SD
| 37,252
|
[don't merge yet] Revert "Revert #37031"
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-03T15:15:40
| 2025-04-03T15:52:38
| null |
COLLABORATOR
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37252",
"html_url": "https://github.com/huggingface/transformers/pull/37252",
"diff_url": "https://github.com/huggingface/transformers/pull/37252.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37252.patch",
"merged_at": null
}
|
Reverts huggingface/transformers#37178
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37252/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37252/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37251
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37251/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37251/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37251/events
|
https://github.com/huggingface/transformers/pull/37251
| 2,969,630,708
|
PR_kwDOCUB6oc6RQwVs
| 37,251
|
Prefill chunking
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T13:39:34
| 2025-04-14T10:01:47
| 2025-04-14T10:01:47
|
MEMBER
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37251",
"html_url": "https://github.com/huggingface/transformers/pull/37251",
"diff_url": "https://github.com/huggingface/transformers/pull/37251.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37251.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37251/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37251/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37250
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37250/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37250/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37250/events
|
https://github.com/huggingface/transformers/issues/37250
| 2,969,518,508
|
I_kwDOCUB6oc6w_0Gs
| 37,250
|
FP8 tensors not saved correctly
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T13:02:55
| 2025-05-18T08:02:28
| 2025-05-18T08:02:28
|
MEMBER
| null | null | null | null |
I tried making a "mini-Deepseek" for testing but encountered some issues. This works fine:
```python
from transformers import AutoConfig, AutoModelForCausalLM
config = AutoConfig.from_pretrained("deepseek-ai/DeepSeek-V3-0324")
config.num_hidden_layers = 1
config.intermediate_size = 1024
model = AutoModelForCausalLM.from_config(config)
model.save_pretrained("test_save")
```
However, when I try to reload the model, I get the following:
```
>>> AutoModelForCausalLM.from_pretrained("test_save")
File "/home/matt/PycharmProjects/transformers/src/transformers/modeling_utils.py", line 806, in _load_state_dict_into_meta_model
not hf_quantizer.check_quantized_param(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/matt/PycharmProjects/transformers/src/transformers/quantizers/quantizer_finegrained_fp8.py", line 155, in check_quantized_param
raise ValueError("Expect quantized weights but got an unquantized weight")
ValueError: Expect quantized weights but got an unquantized weight
```
It seems like even though we support FP8 loading after #36828, we may not be saving it correctly? cc @kylesayrs
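A quick way to sanity-check what actually landed on disk is to look at the saved state dict keys: the fine-grained FP8 quantizer expects each quantized weight to ship with a companion inverse-scale tensor. The helper below is a hypothetical sketch (the key naming is illustrative; the real checkpoint layout may differ):

```python
# Hypothetical sketch: flag weights in a saved state dict that lack a
# companion '<name>_scale_inv' entry, which the FP8 quantizer would expect.
def find_unscaled_weights(state_dict_keys):
    """Return weight keys missing a matching '<name>_scale_inv' key."""
    keys = set(state_dict_keys)
    missing = []
    for key in keys:
        if key.endswith(".weight") and f"{key}_scale_inv" not in keys:
            missing.append(key)
    return sorted(missing)

keys = [
    "model.layers.0.mlp.gate_proj.weight",
    "model.layers.0.mlp.gate_proj.weight_scale_inv",
    "model.layers.0.mlp.up_proj.weight",  # saved without its scale
]
find_unscaled_weights(keys)  # -> ['model.layers.0.mlp.up_proj.weight']
```

If every `.weight` key comes back in the list, the save path likely wrote unquantized tensors, matching the "Expect quantized weights but got an unquantized weight" error above.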
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37250/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37250/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37249
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37249/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37249/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37249/events
|
https://github.com/huggingface/transformers/pull/37249
| 2,969,493,964
|
PR_kwDOCUB6oc6RQSIr
| 37,249
|
[RoPE] abstract dynamic RoPE update under a decorator ✨
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T12:53:17
| 2025-04-04T13:27:31
| 2025-04-04T13:27:28
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37249",
"html_url": "https://github.com/huggingface/transformers/pull/37249",
"diff_url": "https://github.com/huggingface/transformers/pull/37249.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37249.patch",
"merged_at": "2025-04-04T13:27:28"
}
|
# What does this PR do?
✨ Part of the goals to streamline modeling files ✨
This PR abstracts the dynamic RoPE updates, "dynamic" and "longrope", under a decorator. Same capabilities for power users, simpler modeling file.
Review suggestion:
1. `modeling_rope_utils.py`
2. `llama`
3. `phi3` (longrope)
4. other models (which are either a direct copy or apply the same pattern)
Relevant tests: `py.test tests/models/ -k rope` (correct behavior of "dynamic" and "longrope" is tested with this command)
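The decorator idea described above can be sketched roughly as follows. This is a toy illustration of the pattern, not the actual transformers implementation; all names (`dynamic_update`, `ToyRope`, `cached_len`) are hypothetical:

```python
import functools

# Sketch of the pattern: hoist the "recompute cached state when the
# sequence grows" logic out of forward() and into a reusable decorator,
# so the modeling code stays simple. Names are illustrative only.
def dynamic_update(fn):
    @functools.wraps(fn)
    def wrapper(self, x, seq_len):
        # Refresh the cache only when the incoming sequence is longer
        # than what the cache was built for.
        if seq_len > self.cached_len:
            self.cache = [i * self.scaling for i in range(seq_len)]
            self.cached_len = seq_len
        return fn(self, x, seq_len)
    return wrapper

class ToyRope:
    def __init__(self, scaling=1.0):
        self.scaling = scaling
        self.cache = []
        self.cached_len = 0

    @dynamic_update
    def forward(self, x, seq_len):
        # The body can assume the cache is valid; the decorator handled it.
        return [xi + ci for xi, ci in zip(x, self.cache)]
```

The same decorator can then wrap every model's RoPE `forward`, which is what lets the per-model files drop their copy-pasted update logic.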
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37249/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37249/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37248
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37248/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37248/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37248/events
|
https://github.com/huggingface/transformers/issues/37248
| 2,969,484,929
|
I_kwDOCUB6oc6w_r6B
| 37,248
|
Incorrect word timestamps and word repetitions with Whisper-Large-v3-turbo model
|
{
"login": "Asma-droid",
"id": 55100050,
"node_id": "MDQ6VXNlcjU1MTAwMDUw",
"avatar_url": "https://avatars.githubusercontent.com/u/55100050?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Asma-droid",
"html_url": "https://github.com/Asma-droid",
"followers_url": "https://api.github.com/users/Asma-droid/followers",
"following_url": "https://api.github.com/users/Asma-droid/following{/other_user}",
"gists_url": "https://api.github.com/users/Asma-droid/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Asma-droid/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Asma-droid/subscriptions",
"organizations_url": "https://api.github.com/users/Asma-droid/orgs",
"repos_url": "https://api.github.com/users/Asma-droid/repos",
"events_url": "https://api.github.com/users/Asma-droid/events{/privacy}",
"received_events_url": "https://api.github.com/users/Asma-droid/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
},
{
"id": 6470596964,
"node_id": "LA_kwDOCUB6oc8AAAABga15ZA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Audio",
"name": "Audio",
"color": "760453",
"default": false,
"description": ""
},
{
"id": 7377881103,
"node_id": "LA_kwDOCUB6oc8AAAABt8GIDw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Whisper",
"name": "Whisper",
"color": "83303E",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T12:49:45
| 2025-09-04T16:13:57
| 2025-09-04T16:13:57
|
NONE
| null | null | null | null |
### System Info
Hello,
Description:
I'm experiencing issues with the Whisper-Large-v3-turbo model when using it for transcription tasks with the Transformers library (version 4.38.3).
Problems:
Incorrect word timestamps: the word-level timestamps generated by the model are often inaccurate.
<img width="140" alt="Image" src="https://github.com/user-attachments/assets/45042a47-d314-4046-887a-e5ee0d67e8fd" />
Word repetitions: the model repeats words in the transcription output. Setting `repetition_penalty` to 1.2 reduces the repetitions, but does not completely resolve the issue.
Best regards
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Load the Whisper-Large-v3-turbo model using the Transformers library (version 4.38.3).
Use the model to transcribe an audio file.
Observe the word timestamps and transcription output.
### Expected behavior
Accurate word timestamps.
No word repetitions in the transcription output.
|
{
"login": "eustlb",
"id": 94853470,
"node_id": "U_kgDOBadZXg",
"avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eustlb",
"html_url": "https://github.com/eustlb",
"followers_url": "https://api.github.com/users/eustlb/followers",
"following_url": "https://api.github.com/users/eustlb/following{/other_user}",
"gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eustlb/subscriptions",
"organizations_url": "https://api.github.com/users/eustlb/orgs",
"repos_url": "https://api.github.com/users/eustlb/repos",
"events_url": "https://api.github.com/users/eustlb/events{/privacy}",
"received_events_url": "https://api.github.com/users/eustlb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37248/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37248/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37247
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37247/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37247/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37247/events
|
https://github.com/huggingface/transformers/pull/37247
| 2,969,370,768
|
PR_kwDOCUB6oc6RP2va
| 37,247
|
Adding links to ShieldGemma 2 technical report
|
{
"login": "RyanMullins",
"id": 868555,
"node_id": "MDQ6VXNlcjg2ODU1NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/868555?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RyanMullins",
"html_url": "https://github.com/RyanMullins",
"followers_url": "https://api.github.com/users/RyanMullins/followers",
"following_url": "https://api.github.com/users/RyanMullins/following{/other_user}",
"gists_url": "https://api.github.com/users/RyanMullins/gists{/gist_id}",
"starred_url": "https://api.github.com/users/RyanMullins/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RyanMullins/subscriptions",
"organizations_url": "https://api.github.com/users/RyanMullins/orgs",
"repos_url": "https://api.github.com/users/RyanMullins/repos",
"events_url": "https://api.github.com/users/RyanMullins/events{/privacy}",
"received_events_url": "https://api.github.com/users/RyanMullins/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T12:07:27
| 2025-04-03T15:52:14
| 2025-04-03T15:26:30
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37247",
"html_url": "https://github.com/huggingface/transformers/pull/37247",
"diff_url": "https://github.com/huggingface/transformers/pull/37247.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37247.patch",
"merged_at": "2025-04-03T15:26:30"
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37247/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37247/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37246
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37246/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37246/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37246/events
|
https://github.com/huggingface/transformers/issues/37246
| 2,969,336,468
|
I_kwDOCUB6oc6w_HqU
| 37,246
|
Inconsistent results between torch and jax versions of DINOv2
|
{
"login": "MasterXiong",
"id": 20316847,
"node_id": "MDQ6VXNlcjIwMzE2ODQ3",
"avatar_url": "https://avatars.githubusercontent.com/u/20316847?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MasterXiong",
"html_url": "https://github.com/MasterXiong",
"followers_url": "https://api.github.com/users/MasterXiong/followers",
"following_url": "https://api.github.com/users/MasterXiong/following{/other_user}",
"gists_url": "https://api.github.com/users/MasterXiong/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MasterXiong/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MasterXiong/subscriptions",
"organizations_url": "https://api.github.com/users/MasterXiong/orgs",
"repos_url": "https://api.github.com/users/MasterXiong/repos",
"events_url": "https://api.github.com/users/MasterXiong/events{/privacy}",
"received_events_url": "https://api.github.com/users/MasterXiong/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 2934977194,
"node_id": "MDU6TGFiZWwyOTM0OTc3MTk0",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Flax",
"name": "Flax",
"color": "4862AD",
"default": false,
"description": ""
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T11:56:34
| 2025-05-12T08:02:34
| 2025-05-12T08:02:34
|
NONE
| null | null | null | null |
### System Info
- `transformers` version: 4.50.0
- Platform: Linux-5.15.0-131-generic-x86_64-with-glibc2.31
- Python version: 3.10.16
- Huggingface_hub version: 0.29.0
- Safetensors version: 0.5.2
- Accelerate version: not installed
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): 2.15.1 (True)
- Flax version (CPU?/GPU?/TPU?): 0.8.1 (gpu)
- Jax version: 0.4.20
- JaxLib version: 0.4.20
- Using distributed or parallel set-up in script?: No
- Using GPU in script?: Yes
- GPU type: NVIDIA RTX A5000
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```
from transformers import AutoImageProcessor, FlaxDinov2Model, Dinov2Model
from PIL import Image
import requests
import numpy as np
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
image_processor = AutoImageProcessor.from_pretrained("facebook/dinov2-base")
jax_inputs = image_processor(images=image, return_tensors="np")
# flax model
model = FlaxDinov2Model.from_pretrained("facebook/dinov2-base")
outputs = model(**jax_inputs)
jax_results = outputs.last_hidden_state
# torch model
import torch
model = Dinov2Model.from_pretrained("facebook/dinov2-base")
torch_inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**torch_inputs)
    torch_results = outputs.last_hidden_state
print(np.abs(jax_results - torch_results.numpy()).max())
```
### Expected behavior
Hi,
I'm using the Flax version of DINOv2 and want to make sure it returns results consistent with the torch version, so I ran the simple test script attached above. However, I noticed that the token embeddings can differ by more than 6. Is this expected due to numerical differences, or is there something wrong in my code that makes the difference so large? Thanks for your help!
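When debugging this kind of mismatch, it can help to look at the relative difference as well as the absolute one, since DINOv2 activations can be large in magnitude. A minimal sketch of such a comparison (the helper name `compare_outputs` is hypothetical, not part of transformers):

```python
import numpy as np

def compare_outputs(a, b, rtol=1e-3, atol=1e-5):
    """Report absolute and relative differences between two arrays."""
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    abs_diff = np.abs(a - b)
    denom = np.maximum(np.abs(a), np.abs(b))
    # Avoid division by zero where both entries are exactly 0.
    rel_diff = abs_diff / np.where(denom == 0, 1.0, denom)
    return {
        "max_abs": float(abs_diff.max()),
        "max_rel": float(rel_diff.max()),
        "allclose": bool(np.allclose(a, b, rtol=rtol, atol=atol)),
    }
```

Running both models layer by layer and comparing intermediate hidden states with a helper like this can localize where the torch and jax paths first diverge.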
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37246/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37246/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37245
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37245/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37245/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37245/events
|
https://github.com/huggingface/transformers/pull/37245
| 2,969,273,333
|
PR_kwDOCUB6oc6RPilX
| 37,245
|
Fix AST parsing when looking for remote code imports
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T11:29:45
| 2025-04-03T12:18:49
| 2025-04-03T12:00:52
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37245",
"html_url": "https://github.com/huggingface/transformers/pull/37245",
"diff_url": "https://github.com/huggingface/transformers/pull/37245.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37245.patch",
"merged_at": "2025-04-03T12:00:52"
}
|
The AST parsing code in `dynamic_module_utils` had a bug because it assumed that certain `Call` nodes had to be calling a named function and not something else like a method.
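For context, the distinction is between `ast.Name` and `ast.Attribute` in a `Call` node's `func` field. A sketch of parsing that tolerates both plain calls and method calls (the helper name is illustrative, not the actual `dynamic_module_utils` code):

```python
import ast

def call_names(source):
    """Collect called-function names, tolerating method calls like obj.foo()."""
    names = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call):
            func = node.func
            if isinstance(func, ast.Name):        # plain call: foo(...)
                names.append(func.id)
            elif isinstance(func, ast.Attribute):  # method call: obj.foo(...)
                names.append(func.attr)
    return names
```

Code that assumes `node.func` is always an `ast.Name` raises `AttributeError` on `func.id` as soon as it hits a method call.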
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37245/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37245/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37244
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37244/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37244/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37244/events
|
https://github.com/huggingface/transformers/pull/37244
| 2,969,245,369
|
PR_kwDOCUB6oc6RPcVh
| 37,244
|
[CI] green llama tests
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T11:18:59
| 2025-04-03T13:15:53
| 2025-04-03T13:15:53
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37244",
"html_url": "https://github.com/huggingface/transformers/pull/37244",
"diff_url": "https://github.com/huggingface/transformers/pull/37244.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37244.patch",
"merged_at": "2025-04-03T13:15:53"
}
|
# What does this PR do?
Before starting the work to refactor models, let's make the tests on our base model green 🤗 All llama tests are green after this PR, except for the flex attention tests (which are a WIP feature)
Fixes:
- `batch_size` -> `max_batch_size` in `StaticCache` (#37007)
- `generate` + FA2 test -> don't pass attention masks with right-padding (FA2 doesn't support all attention mask patterns, and with generate the mask would have holes like `1 1 0 0 0 0 1` at generation time)
- models with compilation integration tests -> clear CUDA cache (see comment in `LlamaIntegrationTest.tearDown`)
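To illustrate the right-padding problem: once `generate` appends new tokens, a right-padded mask develops non-contiguous valid positions ("holes"), which FA2-style kernels cannot express, whereas left padding keeps the valid region contiguous. A small numpy sketch (illustrative only, not the actual test code):

```python
import numpy as np

# Right-padded batch: real tokens first, pads after. Appending a newly
# generated token puts a 1 *after* the 0s, creating a hole like [1, 1, 0, 0, 1].
right_padded = np.array([[1, 1, 0, 0]])
after_generation_right = np.concatenate(
    [right_padded, np.ones((1, 1), dtype=int)], axis=1
)

# Left padding keeps the valid region contiguous, so the mask stays well-formed.
left_padded = np.array([[0, 0, 1, 1]])
after_generation_left = np.concatenate(
    [left_padded, np.ones((1, 1), dtype=int)], axis=1
)
```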
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37244/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37244/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37243
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37243/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37243/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37243/events
|
https://github.com/huggingface/transformers/pull/37243
| 2,969,157,106
|
PR_kwDOCUB6oc6RPJuj
| 37,243
|
Fix Flash Attention guard
|
{
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-03T10:48:51
| 2025-04-09T12:46:05
| null |
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37243",
"html_url": "https://github.com/huggingface/transformers/pull/37243",
"diff_url": "https://github.com/huggingface/transformers/pull/37243.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37243.patch",
"merged_at": null
}
|
# What does this PR do?
The Transformers import crashes if flash attention is installed but incompatible with the current torch+CUDA version, whereas we should catch this with `is_flash_attn_2_available`
Fixes #37227
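The general guard pattern looks something like this (a sketch of the idea, not the actual transformers implementation; the helper name is illustrative):

```python
import importlib.util

def flash_attention_available():
    """Best-effort check: treat any import-time failure as 'not available'."""
    if importlib.util.find_spec("flash_attn") is None:
        return False
    try:
        import flash_attn  # may raise if built against a mismatched torch/CUDA
    except Exception:
        # A broken build (e.g. undefined CUDA symbols) should disable the
        # feature rather than crash the whole `import transformers`.
        return False
    return True
```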
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37243/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37243/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37242
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37242/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37242/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37242/events
|
https://github.com/huggingface/transformers/issues/37242
| 2,969,093,786
|
I_kwDOCUB6oc6w-Maa
| 37,242
|
TypeError: 'NoneType' object cannot be interpreted as an integer
|
{
"login": "Key-NE",
"id": 121442231,
"node_id": "U_kgDOBz0Ptw",
"avatar_url": "https://avatars.githubusercontent.com/u/121442231?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Key-NE",
"html_url": "https://github.com/Key-NE",
"followers_url": "https://api.github.com/users/Key-NE/followers",
"following_url": "https://api.github.com/users/Key-NE/following{/other_user}",
"gists_url": "https://api.github.com/users/Key-NE/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Key-NE/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Key-NE/subscriptions",
"organizations_url": "https://api.github.com/users/Key-NE/orgs",
"repos_url": "https://api.github.com/users/Key-NE/repos",
"events_url": "https://api.github.com/users/Key-NE/events{/privacy}",
"received_events_url": "https://api.github.com/users/Key-NE/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T10:27:40
| 2025-05-12T08:02:36
| 2025-05-12T08:02:36
|
NONE
| null | null | null | null |
### System Info
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```python
from langchain_community.document_loaders import WebBaseLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_chroma import Chroma
from langchain_ollama import OllamaEmbeddings
from langchain_ollama import ChatOllama
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
import os
# from langchain.document_loaders import TextLoader,Docx2txtLoader
from langchain_community.document_loaders import TextLoader, Docx2txtLoader
from PyPDF2 import PdfReader
from langchain_core.runnables import RunnablePassthrough
from langchain_community.vectorstores import FAISS
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
import torch

os.environ["http_proxy"] = "http://127.0.0.1:11434"
os.environ["https_proxy"] = "http://127.0.0.1:11434"

# Load a PDF file and return its text
def load_single_pdf(file_path):
    pdf_reader = PdfReader(file_path)
    if not pdf_reader:
        return None
    ret = ''
    for i, page in enumerate(pdf_reader.pages):
        txt = page.extract_text()
        if txt:
            ret += txt
    return ret

# Document loader
def data_loader(directory_path=r'E:\llm\deepseek\document_hc'):
    all_docs = []
    for root, dirs, files in os.walk(directory_path):
        for file in files:
            file_path = os.path.join(root, file)
            if file.endswith('.txt'):
                loader = TextLoader(file_path, encoding='utf-8')
            elif file.endswith('.docx'):
                loader = Docx2txtLoader(file_path)
            elif file.endswith('.pdf'):
                loader = load_single_pdf(file_path)
            else:
                continue
            docs = loader.load()
            all_docs.extend(docs)
    return all_docs

# Split the text into docs
def split_text(txt, chunk_size=1000, overlap=100, directory_path=r'E:\llm\deepseek\document_hc'):
    if not txt:
        return None
    splitter = RecursiveCharacterTextSplitter(chunk_size=chunk_size, chunk_overlap=overlap)
    docs = splitter.split_documents(data_loader(directory_path))
    # docs = splitter.split_text(txt)
    return docs

def create_embeddings():
    embeddings = OllamaEmbeddings(model="bge-m3:567m")
    return embeddings

def load_llm_ollama(model_path_name="deepseek-r1:7b"):
    chat_model = OllamaEmbeddings(model=model_path_name)
    return chat_model

# Save the documents into the vector store using the embedding model
def create_vector_store(docs, embeddings, store_path):
    # vectorstore = Chroma.from_documents(documents=split, embedding=local_embeddings)
    # vector_store = FAISS.from_texts(docs, embeddings)
    vector_store = FAISS.from_documents(docs, embeddings)
    vector_store.save_local(store_path)
    return vector_store

# Load the vector store from disk
def load_vector_store(store_path, embeddings):
    if os.path.exists(store_path):
        vector_store = FAISS.load_local(
            store_path,
            embeddings=embeddings,
            allow_dangerous_deserialization=True
        )
        return vector_store
    else:
        return None

# Load or create the vector store
def load_or_create_vector_store(store_path, doc_file_path):
    embeddings = create_embeddings()
    vector_store = load_vector_store(store_path, embeddings)
    if not vector_store:
        docs = split_text(doc_file_path)
        vector_store = create_vector_store(docs, embeddings, store_path)
    return vector_store

# Query context from the vector store
def query_vector_store(vector_store, query, k=4, relevance_threshold=0.8):
    similar_docs = vector_store.similarity_search_with_relevance_scores(query, k=k)
    related_docs = list(filter(lambda x: x[1] > relevance_threshold, similar_docs))
    context = [doc[0].page_content for doc in related_docs]
    return context

# Load llm (deepseek) and tokenizer
def load_llm(model_path, CUDA_Device):
    quant_config = BitsAndBytesConfig(load_in_8bit=True)
    model = AutoModelForCausalLM.from_pretrained(model_path,
                                                 device_map=CUDA_Device,
                                                 torch_dtype=torch.float16,
                                                 quantization_config=quant_config)
    model = model.eval()
    tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False)
    tokenizer.pad_token = tokenizer.eos_token
    return model, tokenizer

def ask(model, tokenizer, prompt, CUDA_Device, max_tokens=512):
    terminators = [
        tokenizer.eos_token_id,
        tokenizer.convert_tokens_to_ids('<|eot_id|>')
    ]
    input_ids = tokenizer([prompt],
                          return_tensors='pt',
                          add_special_tokens=False).input_ids.to(CUDA_Device)
    generated_input = {
        'input_ids': input_ids,
        'max_new_tokens': max_tokens,
        'do_sample': True,
        'top_p': 0.95,
        'temperature': 0.9,
        'repetition_penalty': 1.1,
        'eos_token_id': terminators,
        'bos_token_id': tokenizer.bos_token_id,
        'pad_token_id': tokenizer.pad_token_id
    }
    generated_ids = model.generate(**generated_input)
    ans = tokenizer.decode(generated_ids[0], skip_special_tokens=True)
    return ans

def main():
    # Initialization
    doc_file_path = r'E:\llm\deepseek\document_hc'
    # store_path = './Data/Aquila.faiss'
    store_path = r'E:\llm\deepseek\Data_vecstore\Aquila.faiss'
    Embedding_Model = 'bge-m3:567m'
    LLM_Model = r'E:\llm\deepseek\DeepSeek-R1-Distill-Qwen-1.5B'
    # LLM_Model_path = r'E:\llm\deepseek\DeepSeek-R1-Distill-Qwen-7B'
    CUDA_Device = 'cuda:0'
    vector_store = load_or_create_vector_store(store_path, doc_file_path)
    model, tokenizer = load_llm(LLM_Model, CUDA_Device)
    while True:
        qiz = input('Please enter your question: ')
        if qiz == 'bye' or qiz == 'exit':
            print('Bye~')
            break
        # Query context from vector store based on question, and compose prompt
        context = query_vector_store(vector_store, qiz, 6, 0.75)
        if len(context) == 0:
            # No satisfying context is found inside vector store
            print('No matching context found in the saved vector store; talking to the LLM without context')
            prompt = f'Please answer the following question: \n{qiz}\n'
        else:
            context = '\n'.join(context)
            prompt = f'Based on the following context: \n{context}\nplease answer the following question: \n{qiz}\n'
        ans = ask(model, tokenizer, prompt, CUDA_Device)[len(prompt):]
        print(ans)

if __name__ == '__main__':
    main()
```
### Expected behavior
```
Traceback (most recent call last):
  File "E:\llm\deepseek\demo_test\langchain_0402.py", line 199, in <module>
    main()
  File "E:\llm\deepseek\demo_test\langchain_0402.py", line 195, in main
    ans = ask(model, tokenizer, prompt,CUDA_Device)[len(prompt):]
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\llm\deepseek\demo_test\langchain_0402.py", line 159, in ask
    generated_ids = model.generate(**generated_input)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Program Files\anaconda3\envs\LLM\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "D:\Program Files\anaconda3\envs\LLM\Lib\site-packages\transformers\generation\utils.py", line 2004, in generate
    self._prepare_special_tokens(generation_config, kwargs_has_attention_mask, device=device)
  File "D:\Program Files\anaconda3\envs\LLM\Lib\site-packages\transformers\generation\utils.py", line 1820, in _prepare_special_tokens
    eos_token_tensor = _tensor_or_none(generation_config.eos_token_id, device=device)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Program Files\anaconda3\envs\LLM\Lib\site-packages\transformers\generation\utils.py", line 1817, in _tensor_or_none
    return torch.tensor(token, device=device, dtype=torch.long)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: 'NoneType' object cannot be interpreted as an integer
```
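The traceback suggests the `eos_token_id` passed to `generate` ends up as `None` — likely because `tokenizer.convert_tokens_to_ids('<|eot_id|>')` returns `None` for a tokenizer that does not define that token (`<|eot_id|>` is a Llama-3 token, not a DeepSeek-R1-Distill-Qwen one). A defensive sketch that filters out missing ids before calling `generate` (the helper name is hypothetical):

```python
def resolve_eos_token_ids(tokenizer_eos_id, extra_ids):
    """Drop None entries so generate() never receives a missing token id."""
    candidates = [tokenizer_eos_id, *extra_ids]
    resolved = [tok for tok in candidates if tok is not None]
    if not resolved:
        raise ValueError("No eos token id could be resolved; set one explicitly.")
    return resolved
```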
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37242/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37242/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37241
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37241/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37241/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37241/events
|
https://github.com/huggingface/transformers/pull/37241
| 2,969,046,934
|
PR_kwDOCUB6oc6ROxhe
| 37,241
|
Use `raise from e` in `hub.py` utility
|
{
"login": "Wauplin",
"id": 11801849,
"node_id": "MDQ6VXNlcjExODAxODQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/11801849?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Wauplin",
"html_url": "https://github.com/Wauplin",
"followers_url": "https://api.github.com/users/Wauplin/followers",
"following_url": "https://api.github.com/users/Wauplin/following{/other_user}",
"gists_url": "https://api.github.com/users/Wauplin/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Wauplin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Wauplin/subscriptions",
"organizations_url": "https://api.github.com/users/Wauplin/orgs",
"repos_url": "https://api.github.com/users/Wauplin/repos",
"events_url": "https://api.github.com/users/Wauplin/events{/privacy}",
"received_events_url": "https://api.github.com/users/Wauplin/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T10:15:12
| 2025-06-19T03:06:26
| 2025-06-19T03:06:25
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37241",
"html_url": "https://github.com/huggingface/transformers/pull/37241",
"diff_url": "https://github.com/huggingface/transformers/pull/37241.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37241.patch",
"merged_at": "2025-06-19T03:06:25"
}
|
Always better to `"raise from"` when raising from another exception (dunno why we missed this one, other `raise` are ok)
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37241/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37241/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37240
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37240/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37240/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37240/events
|
https://github.com/huggingface/transformers/issues/37240
| 2,968,806,117
|
I_kwDOCUB6oc6w9GLl
| 37,240
|
`ed_video = input_tokens.index(video_token_id, st)` raises `ValueError: 151656 is not in list`
|
{
"login": "yaomingzhang",
"id": 84617341,
"node_id": "MDQ6VXNlcjg0NjE3MzQx",
"avatar_url": "https://avatars.githubusercontent.com/u/84617341?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yaomingzhang",
"html_url": "https://github.com/yaomingzhang",
"followers_url": "https://api.github.com/users/yaomingzhang/followers",
"following_url": "https://api.github.com/users/yaomingzhang/following{/other_user}",
"gists_url": "https://api.github.com/users/yaomingzhang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yaomingzhang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yaomingzhang/subscriptions",
"organizations_url": "https://api.github.com/users/yaomingzhang/orgs",
"repos_url": "https://api.github.com/users/yaomingzhang/repos",
"events_url": "https://api.github.com/users/yaomingzhang/events{/privacy}",
"received_events_url": "https://api.github.com/users/yaomingzhang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T08:55:07
| 2025-05-12T08:02:37
| 2025-05-12T08:02:37
|
NONE
| null | null | null | null |
### System Info
I hit errors while fine-tuning Qwen-OMNI.
### Who can help?
While fine-tuning Qwen-OMNI with transformers commit f742a644ca32e65758c3adb36225aef1731bd2a8, I encountered video-embedding issues:

After switching to commit 3a1ead0aabed473eafe527915eea8c197d424356, inference no longer throws errors, but fine-tuning still fails with:

Please help.
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
help me
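For reference, the error in the title is plain Python `list.index` behavior: it raises `ValueError` when the target is absent from the searched slice. A minimal sketch (hypothetical token ids, not the real Qwen-OMNI inputs) of the failure and a guarded alternative:

```python
video_token_id = 151656           # the id from the traceback
input_tokens = [151643, 151644]   # hypothetical sequence with no video token
st = 0

try:
    ed_video = input_tokens.index(video_token_id, st)
except ValueError:
    # list.index raises ValueError when the id never appears at or after `st`,
    # so a sequence without video tokens must be handled explicitly.
    ed_video = -1
print(ed_video)  # -1
```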
### Expected behavior
no error
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37240/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37240/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37239
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37239/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37239/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37239/events
|
https://github.com/huggingface/transformers/issues/37239
| 2,968,736,681
|
I_kwDOCUB6oc6w81Op
| 37,239
|
opencv imshow stuck forever when importing transformer
|
{
"login": "leemengwei",
"id": 17986725,
"node_id": "MDQ6VXNlcjE3OTg2NzI1",
"avatar_url": "https://avatars.githubusercontent.com/u/17986725?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/leemengwei",
"html_url": "https://github.com/leemengwei",
"followers_url": "https://api.github.com/users/leemengwei/followers",
"following_url": "https://api.github.com/users/leemengwei/following{/other_user}",
"gists_url": "https://api.github.com/users/leemengwei/gists{/gist_id}",
"starred_url": "https://api.github.com/users/leemengwei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/leemengwei/subscriptions",
"organizations_url": "https://api.github.com/users/leemengwei/orgs",
"repos_url": "https://api.github.com/users/leemengwei/repos",
"events_url": "https://api.github.com/users/leemengwei/events{/privacy}",
"received_events_url": "https://api.github.com/users/leemengwei/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T08:26:53
| 2025-06-20T01:10:19
| 2025-05-12T08:02:39
|
NONE
| null | null | null | null |
### System Info
opencv `imshow`/`namedWindow` hangs forever after importing transformers

### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
On my local laptop (Ubuntu 22.04, Python 3.9):
```
from transformers import AutoImageProcessor, AutoModel
import cv2
cv2.namedWindow("Window", cv2.WINDOW_GUI_NORMAL)
```
The `cv2.namedWindow` call is stuck forever!
### Expected behavior
A normal window pops up.
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37239/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37239/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37238
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37238/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37238/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37238/events
|
https://github.com/huggingface/transformers/issues/37238
| 2,968,729,779
|
I_kwDOCUB6oc6w8ziz
| 37,238
|
transformers 4.50 does not work with pytorch 2.0
|
{
"login": "koen-dejonghe",
"id": 2901242,
"node_id": "MDQ6VXNlcjI5MDEyNDI=",
"avatar_url": "https://avatars.githubusercontent.com/u/2901242?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/koen-dejonghe",
"html_url": "https://github.com/koen-dejonghe",
"followers_url": "https://api.github.com/users/koen-dejonghe/followers",
"following_url": "https://api.github.com/users/koen-dejonghe/following{/other_user}",
"gists_url": "https://api.github.com/users/koen-dejonghe/gists{/gist_id}",
"starred_url": "https://api.github.com/users/koen-dejonghe/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/koen-dejonghe/subscriptions",
"organizations_url": "https://api.github.com/users/koen-dejonghe/orgs",
"repos_url": "https://api.github.com/users/koen-dejonghe/repos",
"events_url": "https://api.github.com/users/koen-dejonghe/events{/privacy}",
"received_events_url": "https://api.github.com/users/koen-dejonghe/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T08:23:59
| 2025-04-07T13:19:49
| 2025-04-03T12:12:02
|
NONE
| null | null | null | null |
### System Info
```
pip install torch==2.0.1 transformers==4.50.3
python <<-EOF
from transformers.integrations import CodeCarbonCallback
EOF
```
results in
```
RuntimeError: Failed to import transformers.integrations.integration_utils because of the following error (look up to see its traceback):
Failed to import transformers.modeling_utils because of the following error (look up to see its traceback):
module 'torch' has no attribute 'compiler'
```
I cannot run `transformers-cli env` for the same reason.
transformers 4.49 does not have this issue.
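A minimal sketch (using stand-in module objects, not real PyTorch installs) of the feature-detection pattern that would avoid this crash: `torch.compiler` only exists from PyTorch 2.1, so on 2.0 the attribute lookup itself fails and must be guarded.

```python
import types

def has_torch_compiler(torch_module) -> bool:
    # torch.compiler was introduced in PyTorch 2.1; on 2.0 accessing it
    # raises AttributeError, so probe with hasattr instead.
    return hasattr(torch_module, "compiler")

torch_20 = types.SimpleNamespace()                                   # stand-in for torch 2.0
torch_21 = types.SimpleNamespace(compiler=types.SimpleNamespace())   # stand-in for torch 2.1+
print(has_torch_compiler(torch_20), has_torch_compiler(torch_21))    # False True
```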
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```
pip install torch==2.0.1 transformers==4.50.3
python <<-EOF
from transformers.integrations import CodeCarbonCallback
EOF
```
### Expected behavior
it should not give an error.
transformers 4.49 does not have this issue.
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37238/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/huggingface/transformers/issues/37238/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37237
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37237/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37237/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37237/events
|
https://github.com/huggingface/transformers/pull/37237
| 2,968,323,963
|
PR_kwDOCUB6oc6RMWgA
| 37,237
|
Fix deprecated PT functions
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T04:51:24
| 2025-04-04T11:33:45
| 2025-04-04T11:31:11
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37237",
"html_url": "https://github.com/huggingface/transformers/pull/37237",
"diff_url": "https://github.com/huggingface/transformers/pull/37237.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37237.patch",
"merged_at": "2025-04-04T11:31:11"
}
|
These functions are deprecated in PT 2.0
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37237/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37237/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37236
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37236/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37236/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37236/events
|
https://github.com/huggingface/transformers/pull/37236
| 2,968,316,349
|
PR_kwDOCUB6oc6RMUxj
| 37,236
|
Torchfix2
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T04:47:33
| 2025-04-03T04:49:38
| 2025-04-03T04:49:35
|
CONTRIBUTOR
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37236",
"html_url": "https://github.com/huggingface/transformers/pull/37236",
"diff_url": "https://github.com/huggingface/transformers/pull/37236.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37236.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37236/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37236/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37235
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37235/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37235/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37235/events
|
https://github.com/huggingface/transformers/issues/37235
| 2,968,294,814
|
I_kwDOCUB6oc6w7JWe
| 37,235
|
transformers has no attribute TFFlorence2ForConditionalGeneration
|
{
"login": "wuchaotao",
"id": 160714097,
"node_id": "U_kgDOCZRNcQ",
"avatar_url": "https://avatars.githubusercontent.com/u/160714097?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wuchaotao",
"html_url": "https://github.com/wuchaotao",
"followers_url": "https://api.github.com/users/wuchaotao/followers",
"following_url": "https://api.github.com/users/wuchaotao/following{/other_user}",
"gists_url": "https://api.github.com/users/wuchaotao/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wuchaotao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wuchaotao/subscriptions",
"organizations_url": "https://api.github.com/users/wuchaotao/orgs",
"repos_url": "https://api.github.com/users/wuchaotao/repos",
"events_url": "https://api.github.com/users/wuchaotao/events{/privacy}",
"received_events_url": "https://api.github.com/users/wuchaotao/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 1834054694,
"node_id": "MDU6TGFiZWwxODM0MDU0Njk0",
"url": "https://api.github.com/repos/huggingface/transformers/labels/TensorFlow",
"name": "TensorFlow",
"color": "FF6F00",
"default": false,
"description": "Anything TensorFlow"
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T04:30:43
| 2025-05-15T08:03:08
| 2025-05-15T08:03:08
|
NONE
| null | null | null | null |
How can I solve the issue: transformers has no attribute TFFlorence2ForConditionalGeneration
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37235/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37235/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37234
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37234/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37234/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37234/events
|
https://github.com/huggingface/transformers/pull/37234
| 2,968,259,448
|
PR_kwDOCUB6oc6RMIkG
| 37,234
|
Remove old code for PyTorch, Accelerator and tokenizers
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T03:57:47
| 2025-04-11T00:54:08
| 2025-04-10T18:54:21
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37234",
"html_url": "https://github.com/huggingface/transformers/pull/37234",
"diff_url": "https://github.com/huggingface/transformers/pull/37234.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37234.patch",
"merged_at": "2025-04-10T18:54:21"
}
|
# What does this PR do?
Remove outdated conditions and comments for PyTorch and Accelerator.
Specifically, some code paths for PyTorch < 2.1 were found and removed. As a result, the functions `is_torch_bf16_cpu_available`, `is_torch_fx_available`, `is_torchdynamo_available` and `is_torch_compile_available` are now equivalent to `is_torch_available`. Some tests are further simplified using this fact.
There is also one change regarding old Accelerator code, and an old tokenizers version check is removed.
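As an illustration (a hypothetical helper, not the transformers API), this is the kind of version gate that becomes dead code once the minimum supported PyTorch moves past 2.1:

```python
def is_at_least(ver: str, minimum: str) -> bool:
    # Compare dotted release versions numerically, e.g. "2.0.1" < "2.1.0".
    parse = lambda v: tuple(int(part) for part in v.split("."))
    return parse(ver) >= parse(minimum)

# Once every supported torch build satisfies >= 2.1, a check like this is
# always True and the fallback branch it guarded can simply be deleted.
print(is_at_least("2.1.0", "2.1.0"), is_at_least("2.0.1", "2.1.0"))  # True False
```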
|
{
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37234/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37234/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37233
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37233/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37233/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37233/events
|
https://github.com/huggingface/transformers/pull/37233
| 2,968,123,315
|
PR_kwDOCUB6oc6RLqEe
| 37,233
|
Enable pylint on RUFF
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T02:58:19
| 2025-04-03T13:59:16
| 2025-04-03T13:59:16
|
CONTRIBUTOR
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37233",
"html_url": "https://github.com/huggingface/transformers/pull/37233",
"diff_url": "https://github.com/huggingface/transformers/pull/37233.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37233.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37233/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37233/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37232
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37232/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37232/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37232/events
|
https://github.com/huggingface/transformers/pull/37232
| 2,968,067,971
|
PR_kwDOCUB6oc6RLei_
| 37,232
|
Remove deprecated size_divisibility
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T02:34:57
| 2025-09-01T05:02:22
| 2025-09-01T05:02:17
|
CONTRIBUTOR
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37232",
"html_url": "https://github.com/huggingface/transformers/pull/37232",
"diff_url": "https://github.com/huggingface/transformers/pull/37232.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37232.patch",
"merged_at": null
}
|
# What does this PR do?
Remove deprecated size_divisibility.
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37232/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37232/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37231
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37231/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37231/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37231/events
|
https://github.com/huggingface/transformers/pull/37231
| 2,968,029,097
|
PR_kwDOCUB6oc6RLWNO
| 37,231
|
Update model-card for Autoformer
|
{
"login": "AnupKulkarniVK",
"id": 154582627,
"node_id": "U_kgDOCTa-Yw",
"avatar_url": "https://avatars.githubusercontent.com/u/154582627?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/AnupKulkarniVK",
"html_url": "https://github.com/AnupKulkarniVK",
"followers_url": "https://api.github.com/users/AnupKulkarniVK/followers",
"following_url": "https://api.github.com/users/AnupKulkarniVK/following{/other_user}",
"gists_url": "https://api.github.com/users/AnupKulkarniVK/gists{/gist_id}",
"starred_url": "https://api.github.com/users/AnupKulkarniVK/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/AnupKulkarniVK/subscriptions",
"organizations_url": "https://api.github.com/users/AnupKulkarniVK/orgs",
"repos_url": "https://api.github.com/users/AnupKulkarniVK/repos",
"events_url": "https://api.github.com/users/AnupKulkarniVK/events{/privacy}",
"received_events_url": "https://api.github.com/users/AnupKulkarniVK/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-03T02:20:25
| 2025-04-18T15:04:17
| null |
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37231",
"html_url": "https://github.com/huggingface/transformers/pull/37231",
"diff_url": "https://github.com/huggingface/transformers/pull/37231.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37231.patch",
"merged_at": null
}
|
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37231/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37231/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37230
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37230/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37230/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37230/events
|
https://github.com/huggingface/transformers/pull/37230
| 2,967,966,321
|
PR_kwDOCUB6oc6RLJKs
| 37,230
|
Remove deprecated reduce_labels
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T01:42:42
| 2025-08-31T14:50:32
| 2025-08-31T14:50:26
|
CONTRIBUTOR
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37230",
"html_url": "https://github.com/huggingface/transformers/pull/37230",
"diff_url": "https://github.com/huggingface/transformers/pull/37230.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37230.patch",
"merged_at": null
}
|
# What does this PR do?
Remove deprecated ``reduce_labels`` from source code and tests.
|
{
"login": "cyyever",
"id": 17618148,
"node_id": "MDQ6VXNlcjE3NjE4MTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/17618148?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cyyever",
"html_url": "https://github.com/cyyever",
"followers_url": "https://api.github.com/users/cyyever/followers",
"following_url": "https://api.github.com/users/cyyever/following{/other_user}",
"gists_url": "https://api.github.com/users/cyyever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cyyever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cyyever/subscriptions",
"organizations_url": "https://api.github.com/users/cyyever/orgs",
"repos_url": "https://api.github.com/users/cyyever/repos",
"events_url": "https://api.github.com/users/cyyever/events{/privacy}",
"received_events_url": "https://api.github.com/users/cyyever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37230/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37230/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37229
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37229/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37229/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37229/events
|
https://github.com/huggingface/transformers/pull/37229
| 2,967,939,989
|
PR_kwDOCUB6oc6RLDPk
| 37,229
|
Fix static cache export
|
{
"login": "guangy10",
"id": 42389959,
"node_id": "MDQ6VXNlcjQyMzg5OTU5",
"avatar_url": "https://avatars.githubusercontent.com/u/42389959?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/guangy10",
"html_url": "https://github.com/guangy10",
"followers_url": "https://api.github.com/users/guangy10/followers",
"following_url": "https://api.github.com/users/guangy10/following{/other_user}",
"gists_url": "https://api.github.com/users/guangy10/gists{/gist_id}",
"starred_url": "https://api.github.com/users/guangy10/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/guangy10/subscriptions",
"organizations_url": "https://api.github.com/users/guangy10/orgs",
"repos_url": "https://api.github.com/users/guangy10/repos",
"events_url": "https://api.github.com/users/guangy10/events{/privacy}",
"received_events_url": "https://api.github.com/users/guangy10/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T01:29:13
| 2025-04-03T05:05:57
| 2025-04-03T05:05:57
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37229",
"html_url": "https://github.com/huggingface/transformers/pull/37229",
"diff_url": "https://github.com/huggingface/transformers/pull/37229.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37229.patch",
"merged_at": "2025-04-03T05:05:57"
}
|
# What does this PR do?
Fixed a test regression due to old arg deprecation.
```
RUN_SLOW=1 pytest tests/utils/test_cache_utils.py -k test_static_cache_exportability
```
## Before submitting
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@ArthurZucker @qubvel
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37229/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37229/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37228
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37228/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37228/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37228/events
|
https://github.com/huggingface/transformers/pull/37228
| 2,967,930,810
|
PR_kwDOCUB6oc6RLBLV
| 37,228
|
Updated model card for Qwen2 (#37192)
|
{
"login": "guangy10",
"id": 42389959,
"node_id": "MDQ6VXNlcjQyMzg5OTU5",
"avatar_url": "https://avatars.githubusercontent.com/u/42389959?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/guangy10",
"html_url": "https://github.com/guangy10",
"followers_url": "https://api.github.com/users/guangy10/followers",
"following_url": "https://api.github.com/users/guangy10/following{/other_user}",
"gists_url": "https://api.github.com/users/guangy10/gists{/gist_id}",
"starred_url": "https://api.github.com/users/guangy10/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/guangy10/subscriptions",
"organizations_url": "https://api.github.com/users/guangy10/orgs",
"repos_url": "https://api.github.com/users/guangy10/repos",
"events_url": "https://api.github.com/users/guangy10/events{/privacy}",
"received_events_url": "https://api.github.com/users/guangy10/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T01:24:27
| 2025-04-03T01:24:40
| 2025-04-03T01:24:40
|
CONTRIBUTOR
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37228",
"html_url": "https://github.com/huggingface/transformers/pull/37228",
"diff_url": "https://github.com/huggingface/transformers/pull/37228.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37228.patch",
"merged_at": null
}
|
* Update qwen2.md
* Update docs/source/en/model_doc/qwen2.md
---------
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
|
{
"login": "guangy10",
"id": 42389959,
"node_id": "MDQ6VXNlcjQyMzg5OTU5",
"avatar_url": "https://avatars.githubusercontent.com/u/42389959?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/guangy10",
"html_url": "https://github.com/guangy10",
"followers_url": "https://api.github.com/users/guangy10/followers",
"following_url": "https://api.github.com/users/guangy10/following{/other_user}",
"gists_url": "https://api.github.com/users/guangy10/gists{/gist_id}",
"starred_url": "https://api.github.com/users/guangy10/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/guangy10/subscriptions",
"organizations_url": "https://api.github.com/users/guangy10/orgs",
"repos_url": "https://api.github.com/users/guangy10/repos",
"events_url": "https://api.github.com/users/guangy10/events{/privacy}",
"received_events_url": "https://api.github.com/users/guangy10/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37228/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37228/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37227
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37227/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37227/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37227/events
|
https://github.com/huggingface/transformers/issues/37227
| 2,967,919,199
|
I_kwDOCUB6oc6w5tpf
| 37,227
|
Why does `transformers` load FA2 when it's not asked to do so?
|
{
"login": "sfc-gh-sbekman",
"id": 196988264,
"node_id": "U_kgDOC73NaA",
"avatar_url": "https://avatars.githubusercontent.com/u/196988264?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sfc-gh-sbekman",
"html_url": "https://github.com/sfc-gh-sbekman",
"followers_url": "https://api.github.com/users/sfc-gh-sbekman/followers",
"following_url": "https://api.github.com/users/sfc-gh-sbekman/following{/other_user}",
"gists_url": "https://api.github.com/users/sfc-gh-sbekman/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sfc-gh-sbekman/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sfc-gh-sbekman/subscriptions",
"organizations_url": "https://api.github.com/users/sfc-gh-sbekman/orgs",
"repos_url": "https://api.github.com/users/sfc-gh-sbekman/repos",
"events_url": "https://api.github.com/users/sfc-gh-sbekman/events{/privacy}",
"received_events_url": "https://api.github.com/users/sfc-gh-sbekman/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-03T01:18:07
| 2025-05-11T08:02:59
| 2025-05-11T08:02:59
|
CONTRIBUTOR
| null | null | null | null |
Why does `transformers` load FA2 when it's not asked to do so?
```
$ python -c "import transformers.modeling_utils"
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/code/users/stas/github/transformers/src/transformers/modeling_utils.py", line 55, in <module>
from .integrations.flash_attention import flash_attention_forward
File "/code/users/stas/github/transformers/src/transformers/integrations/flash_attention.py", line 5, in <module>
from ..modeling_flash_attention_utils import _flash_attention_forward
File "/code/users/stas/github/transformers/src/transformers/modeling_flash_attention_utils.py", line 29, in <module>
from flash_attn.bert_padding import index_first_axis, pad_input, unpad_input # noqa
File "/usr/local/lib/python3.10/dist-packages/flash_attn/__init__.py", line 3, in <module>
from flash_attn.flash_attn_interface import (
File "/usr/local/lib/python3.10/dist-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
import flash_attn_2_cuda as flash_attn_cuda
ImportError: /usr/local/lib/python3.10/dist-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c105ErrorC2ENS_14SourceLocationESs
```
Some years back Sylvain reworked everything to load lazily, so that only components that are actually going to be used get imported, avoiding unnecessary memory and startup-time overhead from loading things that will never be used.
I'm aware that the particular failure in the traceback I shared happens because I upgraded PyTorch and the FA2 library is now incompatible. I know how to fix that, but it has just shown me that FA2 is getting loaded when it shouldn't be, IMHO.
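A minimal sketch (names are illustrative, not the transformers internals) of the lazy-loading pattern in question: an optional backend is imported only on first attribute access, so a broken optional dependency never crashes a plain package import.

```python
import importlib
import sys
import types

def make_lazy_module(name, submodules):
    """Create a module whose listed attributes are imported only on first
    access (PEP 562 style), so an optional dependency never loads -- and
    never crashes -- unless it is actually used."""
    module = types.ModuleType(name)

    def __getattr__(attr):
        if attr in submodules:
            # The real import happens here, at first use, not at package import time.
            value = importlib.import_module(submodules[attr])
            setattr(module, attr, value)  # cache for subsequent lookups
            return value
        raise AttributeError(f"module {name!r} has no attribute {attr!r}")

    module.__getattr__ = __getattr__
    sys.modules[name] = module
    return module

# Hypothetical package where "fa2_backend" stands in for the flash-attn import;
# "math" is used here only as a stand-in importable module.
pkg = make_lazy_module("lazy_pkg", {"fa2_backend": "math"})
assert pkg.fa2_backend.sqrt(4.0) == 2.0
```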
Thank you.
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37227/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37227/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37226
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37226/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37226/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37226/events
|
https://github.com/huggingface/transformers/pull/37226
| 2,967,784,160
|
PR_kwDOCUB6oc6RKg7-
| 37,226
|
[RFC] Fix Gemma 3 FP16 with activation scaling
|
{
"login": "gau-nernst",
"id": 26946864,
"node_id": "MDQ6VXNlcjI2OTQ2ODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/26946864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gau-nernst",
"html_url": "https://github.com/gau-nernst",
"followers_url": "https://api.github.com/users/gau-nernst/followers",
"following_url": "https://api.github.com/users/gau-nernst/following{/other_user}",
"gists_url": "https://api.github.com/users/gau-nernst/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gau-nernst/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gau-nernst/subscriptions",
"organizations_url": "https://api.github.com/users/gau-nernst/orgs",
"repos_url": "https://api.github.com/users/gau-nernst/repos",
"events_url": "https://api.github.com/users/gau-nernst/events{/privacy}",
"received_events_url": "https://api.github.com/users/gau-nernst/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-02T23:37:23
| 2025-07-16T13:31:47
| null |
CONTRIBUTOR
| null | null | true
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37226",
"html_url": "https://github.com/huggingface/transformers/pull/37226",
"diff_url": "https://github.com/huggingface/transformers/pull/37226.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37226.patch",
"merged_at": null
}
|
# What does this PR do?
## The problem
Gemma 3 doesn't work with FP16 inference. The cause seems to be:
1. Embedding output is scaled by sqrt(hidden_size) -> large activations entering the transformer trunk
https://github.com/huggingface/transformers/blob/199d7adf1037e1e3a0a7a2ceb155e468abef5e26/src/transformers/models/gemma3/modular_gemma3.py#L575-L578
2. Hence, the weights of the post-norms are huge, to counteract the huge activations in the identity stream
https://github.com/huggingface/transformers/blob/199d7adf1037e1e3a0a7a2ceb155e468abef5e26/src/transformers/models/gemma3/modular_gemma3.py#L524-L528

This makes activations go beyond the FP16 range (max ~65,504).
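A toy numeric illustration of the overflow (the hidden_size is taken from the 4B text config; the activation magnitude is made up for illustration):

```python
import torch

hidden_size = 2560                 # Gemma 3 4B text config
scale = hidden_size ** 0.5         # ~50.6x embedding multiplier

fp16_max = torch.finfo(torch.float16).max                # 65504
x = torch.full((4,), 1500.0, dtype=torch.float16)        # plausible activation magnitude

# 1500 * ~50.6 ~= 75900 > 65504, so the product overflows to inf in FP16
assert torch.isinf(x * scale).all()

# Scaling activations down by 1/64 keeps them comfortably in range
assert torch.isfinite(x * (scale / 64)).all()
```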
## The fix
We want to scale the "main" activations down, roughly by sqrt(hidden_size). To do this, we only need to scale the following two:
- Embedding layer
- Weight of `post_xx_layernorm`
This is because any scaling applied before `pre_xx_layernorm` does not change the math, due to normalization. We need to scale the weight of `post_xx_layernorm` so that it stays consistent with the input to `pre_xx_layernorm`.
The final norm `model.norm` will remove the scaling, before LM head.
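That pre-norm scaling is free can be checked with a tiny RMSNorm (eps omitted here so the invariance is exact; real implementations add a small eps, making it only approximate):

```python
import math

def rms_norm(xs, ws):
    # Minimal RMSNorm (eps omitted): x / rms(x) * weight
    rms = math.sqrt(sum(v * v for v in xs) / len(xs))
    return [v / rms * w for v, w in zip(xs, ws)]

xs = [0.5, -1.25, 2.0, 0.75]
ws = [1.0, 2.0, 0.5, 1.5]

# Scaling the input by a constant (here an exact power of two) leaves the
# output unchanged, so only the embedding and the post-norm weights need
# rescaling -- anything feeding pre_xx_layernorm is already invariant.
c = 1 / 64
base = rms_norm(xs, ws)
scaled = rms_norm([c * v for v in xs], ws)
assert all(abs(a - b) < 1e-12 for a, b in zip(base, scaled))
```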
## Sanity check
```python
import torch
from transformers import pipeline, Gemma3ForCausalLM, AutoTokenizer

model_id = "google/gemma-3-4b-it"
model = Gemma3ForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    activation_scale=1/64,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer, device="cuda")

messages = [
    {"role": "user", "content": "Write a poem on Hugging Face, the company"},
]
print(pipe(messages, max_new_tokens=100))
```
- BF16, no activation scale (default to 1) -> normal
- FP16, no activation scale (default to 1) -> CUDA error due to inf/nan
- FP16, activation scale=1/64 -> normal
Thank you to X users for drawing my attention to this and coming up with the fix: https://x.com/SeunghyunSEO7/status/1907350826940805266
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@ArthurZucker
- Feel free to test with larger models. I tested with 4B and 12B. For larger models, you may need to use a smaller activation scale (1 / sqrt(hidden_size))
- This is an RFC since you may want a different design to introduce scaling
- Technically the scaling of `post_xx_norm` could be absorbed into the weights, but that's risky, since it requires modifying the weight-loading code, and things like `load_state_dict()` (or whatever method people use to load weights) would no longer work correctly
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface and @SunMarc
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37226/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37226/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37225
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37225/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37225/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37225/events
|
https://github.com/huggingface/transformers/pull/37225
| 2,967,741,113
|
PR_kwDOCUB6oc6RKXSy
| 37,225
|
Adding MLPSpeculator support for assisted generation
|
{
"login": "sahilsuneja1",
"id": 6835847,
"node_id": "MDQ6VXNlcjY4MzU4NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/6835847?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sahilsuneja1",
"html_url": "https://github.com/sahilsuneja1",
"followers_url": "https://api.github.com/users/sahilsuneja1/followers",
"following_url": "https://api.github.com/users/sahilsuneja1/following{/other_user}",
"gists_url": "https://api.github.com/users/sahilsuneja1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sahilsuneja1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sahilsuneja1/subscriptions",
"organizations_url": "https://api.github.com/users/sahilsuneja1/orgs",
"repos_url": "https://api.github.com/users/sahilsuneja1/repos",
"events_url": "https://api.github.com/users/sahilsuneja1/events{/privacy}",
"received_events_url": "https://api.github.com/users/sahilsuneja1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null |
[] | 2025-04-02T23:09:21
| 2025-04-03T14:31:03
| null |
NONE
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37225",
"html_url": "https://github.com/huggingface/transformers/pull/37225",
"diff_url": "https://github.com/huggingface/transformers/pull/37225.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37225.patch",
"merged_at": null
}
|
# What does this PR do?
This PR adds support for using [MLPSpeculator](https://pytorch.org/blog/hitchhikers-guide-speculative-decoding/) models for assisted generation, similar to its support in [TGI](https://github.com/huggingface/text-generation-inference/pull/1865) and [vLLM](https://github.com/vllm-project/vllm/pull/4947)
Model code [originally](https://github.com/foundation-model-stack/fms-extras/blob/main/fms_extras/models/speculator.py) authored by Davis Wertheimer @daviswer
List of already existing speculators [here](https://huggingface.co/collections/ibm-ai-platform/speculators-66a1b6838f0d2327e0a3a8c3) and [here](https://huggingface.co/collections/ibm-granite/granite-speculators-664b97a44ddc5640e8cd73ac)
Training recipes for new speculators [here](https://github.com/foundation-model-stack/fms-fsdp/tree/main/speculator) and [here](https://github.com/foundation-model-stack/fms-fsdp/tree/main/scripts)
Usage example:
```python
import torch
import time
from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed, MLPSpeculatorPreTrainedModel

device = "cuda"  # assumed; the original snippet did not define device

def compare_assisted_generation(prompts, checkpoint, assistant_checkpoint):
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    inputs = tokenizer(prompts, return_tensors="pt").to(device=device)
    model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device=device, dtype=torch.bfloat16)
    assistant_model = MLPSpeculatorPreTrainedModel.from_pretrained(assistant_checkpoint).to(device=device, dtype=torch.bfloat16)
    model.eval()
    assistant_model.eval()

    if model.generation_config.pad_token_id is None:
        model.generation_config.pad_token_id = model.generation_config.eos_token_id

    generate_kwargs = {
        "do_sample": False,
        "temperature": None,
        "max_new_tokens": 50,
        "output_hidden_states": True,
    }

    # warmup
    for _ in range(0, 2):
        model.generate(**inputs, **generate_kwargs)
        model.generate(**inputs, assistant_model=assistant_model, **generate_kwargs)

    start_time = time.time()
    outputs = model.generate(**inputs, **generate_kwargs)
    end_time = time.time()
    print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
    print(f"Generation without assistant; Time taken: {end_time-start_time} seconds")

    start_time = time.time()
    outputs = model.generate(**inputs, assistant_model=assistant_model, **generate_kwargs)
    end_time = time.time()
    print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
    print(f"Generation with assistant; Time taken: {end_time-start_time} seconds")

torch.set_grad_enabled(False)
prompt = "Alice and Bob"
checkpoint = "meta-llama/Meta-Llama-3-8B-Instruct"
speculator_checkpoint = "ibm-ai-platform/llama3-8b-accelerator"
compare_assisted_generation(prompt, checkpoint, speculator_checkpoint)
```
Output from the above example on A100:
```
['Alice and Bob are two friends who are trying to solve a puzzle. They are given a set of numbers, and they need to find the sum of the numbers that are multiples of 3 or 5.\n\nHere is the set of numbers: 1, ']
Generation without assistant; Time taken: 1.150806188583374 seconds
['Alice and Bob are two friends who are trying to solve a puzzle. They are given a set of numbers, and they need to find the sum of the numbers that are multiples of 3 or 5.\n\nHere is the set of numbers: 1, ']
Generation with assistant; Time taken: 0.6626832485198975 seconds
```
## Who can review?
@gante
| null |
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37225/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37225/timeline
| null | null | null | null | true
| false
|
https://api.github.com/repos/huggingface/transformers/issues/37224
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37224/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37224/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37224/events
|
https://github.com/huggingface/transformers/pull/37224
| 2,967,711,997
|
PR_kwDOCUB6oc6RKQ_g
| 37,224
|
Add image classifier donut & update loss calculation for all swins
|
{
"login": "eljandoubi",
"id": 78537694,
"node_id": "MDQ6VXNlcjc4NTM3Njk0",
"avatar_url": "https://avatars.githubusercontent.com/u/78537694?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eljandoubi",
"html_url": "https://github.com/eljandoubi",
"followers_url": "https://api.github.com/users/eljandoubi/followers",
"following_url": "https://api.github.com/users/eljandoubi/following{/other_user}",
"gists_url": "https://api.github.com/users/eljandoubi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eljandoubi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eljandoubi/subscriptions",
"organizations_url": "https://api.github.com/users/eljandoubi/orgs",
"repos_url": "https://api.github.com/users/eljandoubi/repos",
"events_url": "https://api.github.com/users/eljandoubi/events{/privacy}",
"received_events_url": "https://api.github.com/users/eljandoubi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-02T22:47:36
| 2025-04-10T13:00:43
| 2025-04-10T13:00:42
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37224",
"html_url": "https://github.com/huggingface/transformers/pull/37224",
"diff_url": "https://github.com/huggingface/transformers/pull/37224.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37224.patch",
"merged_at": "2025-04-10T13:00:42"
}
|
# What does this PR do?
- Add classifier head to Donut
- Add image classifier loss to `LOSS_MAPPING`
- Update classifier loss for all swin models
Models:
- vision models: @amyeroberts, @qubvel
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37224/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37224/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37223
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37223/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37223/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37223/events
|
https://github.com/huggingface/transformers/pull/37223
| 2,967,659,453
|
PR_kwDOCUB6oc6RKFUR
| 37,223
|
Introduce GradientCheckpointingLayer
|
{
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-02T22:14:16
| 2025-05-06T01:00:39
| 2025-04-22T10:33:32
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37223",
"html_url": "https://github.com/huggingface/transformers/pull/37223",
"diff_url": "https://github.com/huggingface/transformers/pull/37223.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37223.patch",
"merged_at": "2025-04-22T10:33:32"
}
|
# What does this PR do?
A super-minimal abstraction for a layer with gradient checkpointing that keeps the logic for enabling and disabling gradient checkpointing within `PreTrainedModel` for backward compatibility. It allows a gradual rollout of the feature by supporting both checkpointing mechanisms: the current wrapping with `_gradient_checkpointing_func`, and inheritance from `GradientCheckpointingLayer`.
I've applied this to Llama, but it's just a PoC for discussion. Perhaps it's better to start with a less popular model that has fewer dependent models, to see how it goes and check whether it could break custom code on the Hub.
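An illustrative sketch of the idea (class and attribute names are made up here, not the merged API): a base class whose `__call__` reroutes through `torch.utils.checkpoint` when enabled, so decoder layers opt in by inheritance instead of each model hand-wrapping calls with `_gradient_checkpointing_func`.

```python
import torch
from torch.utils.checkpoint import checkpoint

class CheckpointingLayer(torch.nn.Module):
    # Flag flipped by the owning model's gradient_checkpointing_enable().
    gradient_checkpointing = False

    def __call__(self, *args, **kwargs):
        if self.gradient_checkpointing and self.training:
            # Recompute activations in backward instead of storing them.
            return checkpoint(super().__call__, *args, use_reentrant=False, **kwargs)
        return super().__call__(*args, **kwargs)

class ToyDecoderLayer(CheckpointingLayer):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(8, 8)

    def forward(self, hidden_states):
        return torch.relu(self.fc(hidden_states))

layer = ToyDecoderLayer()
layer.gradient_checkpointing = True
layer.train()
x = torch.randn(2, 8, requires_grad=True)
layer(x).sum().backward()
assert x.grad is not None  # gradients still flow through the checkpoint
```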
## Who can review?
|
{
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37223/reactions",
"total_count": 5,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 5,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37223/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37222
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37222/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37222/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37222/events
|
https://github.com/huggingface/transformers/issues/37222
| 2,967,609,209
|
I_kwDOCUB6oc6w4h95
| 37,222
|
Qwen fails ungracefully when images are truncated
|
{
"login": "gbarello-uipath",
"id": 48561156,
"node_id": "MDQ6VXNlcjQ4NTYxMTU2",
"avatar_url": "https://avatars.githubusercontent.com/u/48561156?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gbarello-uipath",
"html_url": "https://github.com/gbarello-uipath",
"followers_url": "https://api.github.com/users/gbarello-uipath/followers",
"following_url": "https://api.github.com/users/gbarello-uipath/following{/other_user}",
"gists_url": "https://api.github.com/users/gbarello-uipath/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gbarello-uipath/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gbarello-uipath/subscriptions",
"organizations_url": "https://api.github.com/users/gbarello-uipath/orgs",
"repos_url": "https://api.github.com/users/gbarello-uipath/repos",
"events_url": "https://api.github.com/users/gbarello-uipath/events{/privacy}",
"received_events_url": "https://api.github.com/users/gbarello-uipath/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null |
[] | 2025-04-02T21:39:33
| 2025-04-16T18:49:21
| 2025-04-16T18:49:21
|
CONTRIBUTOR
| null | null | null | null |
### System Info
- `transformers` version: 4.49.0
- Platform: Linux-6.8.0-1025-gcp-x86_64-with-glibc2.39
- Python version: 3.11.10
- Huggingface_hub version: 0.29.3
- Safetensors version: 0.5.3
- Accelerate version: 0.34.2
- Accelerate config: - compute_environment: LOCAL_MACHINE
- distributed_type: MULTI_GPU
- mixed_precision: no
- use_cpu: False
- debug: False
- num_processes: 8
- machine_rank: 0
- num_machines: 1
- gpu_ids: all
- rdzv_backend: static
- same_network: True
- main_training_function: main
- enable_cpu_affinity: False
- downcast_bf16: no
- tpu_use_cluster: False
- tpu_use_sudo: False
- tpu_env: []
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA H100 80GB HBM3
### Who can help?
@qubvel
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
The following fails when `forward` is called. If you increase `MAX_LENGTH` to 30, it succeeds:
```python
import torch
from transformers import AutoProcessor, Qwen2VLForConditionalGeneration
from PIL import Image

device = "cuda:0"
MAX_LENGTH = 15

# Load model and tokenizer
model_name = "Qwen/Qwen2-VL-7B"
model = Qwen2VLForConditionalGeneration.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map=None,
)
processor = AutoProcessor.from_pretrained(model_name)

# Prepare model with FSDP
model = model.to(device)

text = "test this image <|vision_start|><|image_pad|><|vision_end|>"
image = [Image.new('RGB', (100, 100), color='red')]

# Prepare inputs
inputs = processor(
    text=text,
    images=image,
    return_tensors="pt",
    max_length=MAX_LENGTH,
    truncation="longest_first",
    padding=True,
)

# Move inputs to device
inputs = {k: v.to(device) for k, v in inputs.items()}
inputs["labels"] = inputs["input_ids"].clone()

outputs = model(**inputs)
```
### Expected behavior
I expect the script to either:
1. fail gracefully at tokenization time, raising an error informing the user that the image tokens are being truncated and that this is untenable, possibly with another kwarg telling it to ignore the error and return the broken tokens, or
2. Truncate the image tokens, grid, and pixel values in a compatible way that works with the model forward. This is complicated by the other issue I raised: #37186
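Option 1 could look roughly like the following post-tokenization guard (the token id, argument names, and the grid/merge arithmetic are illustrative, not the actual processor internals):

```python
def check_image_tokens_intact(input_ids, image_grid_thw, merge_size=2, image_token_id=151655):
    """Raise if truncation removed image placeholder tokens.

    Each image contributes t*h*w patches; every merge_size**2 patches merge
    into one placeholder token, so after truncation the placeholder count in
    input_ids must still match what the grid implies.
    """
    expected = sum((t * h * w) // merge_size**2 for t, h, w in image_grid_thw)
    found = sum(1 for tok in input_ids if tok == image_token_id)
    if found != expected:
        raise ValueError(
            f"max_length truncation removed image tokens: expected {expected} "
            f"placeholder tokens but found {found}; increase max_length or "
            f"drop/resize the image"
        )

# 1x4x4 grid of patches -> 4 placeholder tokens after the 2x2 merge
grid = [(1, 4, 4)]
check_image_tokens_intact([1, 2, 151655, 151655, 151655, 151655, 3], grid)  # passes silently
```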
|
{
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37222/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37222/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37221
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37221/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37221/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37221/events
|
https://github.com/huggingface/transformers/issues/37221
| 2,967,518,124
|
I_kwDOCUB6oc6w4Lus
| 37,221
|
RWKV6-Finch-7B-HF crashes during inference
|
{
"login": "assafbk",
"id": 38749017,
"node_id": "MDQ6VXNlcjM4NzQ5MDE3",
"avatar_url": "https://avatars.githubusercontent.com/u/38749017?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/assafbk",
"html_url": "https://github.com/assafbk",
"followers_url": "https://api.github.com/users/assafbk/followers",
"following_url": "https://api.github.com/users/assafbk/following{/other_user}",
"gists_url": "https://api.github.com/users/assafbk/gists{/gist_id}",
"starred_url": "https://api.github.com/users/assafbk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/assafbk/subscriptions",
"organizations_url": "https://api.github.com/users/assafbk/orgs",
"repos_url": "https://api.github.com/users/assafbk/repos",
"events_url": "https://api.github.com/users/assafbk/events{/privacy}",
"received_events_url": "https://api.github.com/users/assafbk/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-02T20:49:34
| 2025-05-11T08:03:01
| 2025-05-11T08:03:01
|
NONE
| null | null | null | null |
### System Info
- `transformers` version: 4.50.3
- Platform: Linux-6.8.0-52-generic-x86_64-with-glibc2.35
- Python version: 3.10.16
- Huggingface_hub version: 0.30.1
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.5.1+cu118 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
- Using GPU in script?: yes
- GPU type: NVIDIA RTX A6000
### Who can help?
@ArthurZucker, @gante
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Code snippet for reproduction:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
def generate_prompt(instruction, input=""):
instruction = instruction.strip().replace('\r\n','\n').replace('\n\n','\n')
input = input.strip().replace('\r\n','\n').replace('\n\n','\n')
if input:
return f"""Instruction: {instruction}
Input: {input}
Response:"""
else:
return f"""User: hi
Assistant: Hi. I am your assistant and I will provide expert full response in full details. Please feel free to ask any question and I will always answer it.
User: {instruction}
Assistant:"""
model = AutoModelForCausalLM.from_pretrained("RWKV/v6-Finch-7B-HF", trust_remote_code=True, torch_dtype=torch.float16).to('cuda')
tokenizer = AutoTokenizer.from_pretrained("RWKV/v6-Finch-7B-HF", trust_remote_code=True)
text = "Hello how are you?"
prompt = generate_prompt(text)
inputs = tokenizer(prompt, return_tensors="pt").to('cuda')
output = model.generate(inputs["input_ids"], max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output[0].tolist(), skip_special_tokens=True))
```
Error message:
> Traceback (most recent call last):
> File "/data2/assaf/tmp/rwkv_test.py", line 30, in <module>
> output = model.generate(inputs["input_ids"], max_new_tokens=128, do_sample=False)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
> return func(*args, **kwargs)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/transformers/generation/utils.py", line 2326, in generate
> result = self._sample(
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/transformers/generation/utils.py", line 3286, in _sample
> outputs = self(**model_inputs, return_dict=True)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
> return self._call_impl(*args, **kwargs)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
> return forward_call(*args, **kwargs)
> File "/home/assaf/.cache/huggingface/modules/transformers_modules/RWKV/v6-Finch-7B-HF/02946e470978a56fc96bc8c47cb942c50bc9c71a/modeling_rwkv6.py", line 716, in forward
> outputs = self.rwkv(
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
> return self._call_impl(*args, **kwargs)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
> return forward_call(*args, **kwargs)
> File "/home/assaf/.cache/huggingface/modules/transformers_modules/RWKV/v6-Finch-7B-HF/02946e470978a56fc96bc8c47cb942c50bc9c71a/modeling_rwkv6.py", line 574, in forward
> hidden_states, state, attentions = block(
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
> return self._call_impl(*args, **kwargs)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
> return forward_call(*args, **kwargs)
> File "/home/assaf/.cache/huggingface/modules/transformers_modules/RWKV/v6-Finch-7B-HF/02946e470978a56fc96bc8c47cb942c50bc9c71a/modeling_rwkv6.py", line 275, in forward
> attention, state = self.attention(self.ln1(hidden), state=state, use_cache=use_cache, seq_mode=seq_mode)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
> return self._call_impl(*args, **kwargs)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
> return forward_call(*args, **kwargs)
> File "/home/assaf/.cache/huggingface/modules/transformers_modules/RWKV/v6-Finch-7B-HF/02946e470978a56fc96bc8c47cb942c50bc9c71a/modeling_rwkv6.py", line 192, in forward
> receptance, key, value, gate, time_decay, state = self.extract_key_value(hidden, state=state)
> File "/home/assaf/.cache/huggingface/modules/transformers_modules/RWKV/v6-Finch-7B-HF/02946e470978a56fc96bc8c47cb942c50bc9c71a/modeling_rwkv6.py", line 183, in extract_key_value
> time_decay = torch.tanh(time_decay @ self.time_decay_w1) @ self.time_decay_w2
> RuntimeError: expected mat1 and mat2 to have the same dtype, but got: c10::Half != float
### Expected behavior
Regular inference is expected, yet the script crashes with the error message above.
The bug was seen in transformers v4.50.3.
It does not reproduce on earlier transformers versions, such as v4.42.4.
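The failure can be reproduced in isolation with plain tensors; the shapes below are illustrative stand-ins, not the model's actual dimensions:

```python
import torch

# Minimal sketch of the failure mode: torch matmul requires both operands
# to share a dtype, so a float16 activation against a float32 weight raises.
hidden = torch.randn(2, 8, dtype=torch.float16)  # stand-in for `hidden`
w1 = torch.randn(8, 4, dtype=torch.float32)      # stand-in for `time_decay_w1`

raised = False
try:
    _ = hidden @ w1  # mixed dtypes -> RuntimeError
except RuntimeError:
    raised = True

# Upcasting one operand to a common dtype (or loading the whole model in a
# single dtype) sidesteps the mismatch.
out = hidden.to(w1.dtype) @ w1
```

This only illustrates the error; the actual fix presumably belongs in the modeling code or the checkpoint-loading path.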
|
{
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37221/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37221/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37220
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37220/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37220/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37220/events
|
https://github.com/huggingface/transformers/pull/37220
| 2,967,482,920
|
PR_kwDOCUB6oc6RJehD
| 37,220
|
feat: support indivisible shards for TP model loading and TPlizing.
|
{
"login": "kmehant",
"id": 15800200,
"node_id": "MDQ6VXNlcjE1ODAwMjAw",
"avatar_url": "https://avatars.githubusercontent.com/u/15800200?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kmehant",
"html_url": "https://github.com/kmehant",
"followers_url": "https://api.github.com/users/kmehant/followers",
"following_url": "https://api.github.com/users/kmehant/following{/other_user}",
"gists_url": "https://api.github.com/users/kmehant/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kmehant/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kmehant/subscriptions",
"organizations_url": "https://api.github.com/users/kmehant/orgs",
"repos_url": "https://api.github.com/users/kmehant/repos",
"events_url": "https://api.github.com/users/kmehant/events{/privacy}",
"received_events_url": "https://api.github.com/users/kmehant/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-02T20:33:01
| 2025-07-01T10:03:22
| 2025-07-01T10:03:22
|
CONTRIBUTOR
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37220",
"html_url": "https://github.com/huggingface/transformers/pull/37220",
"diff_url": "https://github.com/huggingface/transformers/pull/37220.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37220.patch",
"merged_at": "2025-07-01T10:03:22"
}
|
# What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/37051
The approach is to support uneven sharding by seeking segments of data that mimic `torch.chunk`, since `torch.chunk` is the sharding style adopted by the torch `Shard` placement API for both even and uneven sharding. Finally, we pass the stride and shape to `from_local` to allow for uneven sharding.
```python
from transformers import AutoModelForCausalLM
import torch
m2 = AutoModelForCausalLM.from_pretrained("TinyLlama/TinyLlama-1.1B-Chat-v1.0", tp_plan=None)
m = AutoModelForCausalLM.from_pretrained("TinyLlama/TinyLlama-1.1B-Chat-v1.0", tp_plan="auto")
print(m.model.layers[0].self_attn.q_proj.weight)
print(m.model.layers[0].self_attn.q_proj.weight.shape)
ft = m.model.layers[0].self_attn.q_proj.weight.full_tensor().to("cpu")
assert torch.equal(ft, m2.model.layers[0].self_attn.q_proj.weight.to("cpu"))
# assert should pass
```
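The uneven ("indivisible") case this PR targets can be seen directly in `torch.chunk`, which pads nothing and simply leaves a smaller last piece (a small illustrative sketch, not the PR's code):

```python
import torch

# torch.chunk splits a dimension into ceil(n / chunks)-sized pieces, so with
# 10 rows over 4 shards the last shard is smaller than the others.
weight = torch.arange(10)
shards = torch.chunk(weight, chunks=4)
sizes = [s.numel() for s in shards]  # uneven: the last piece has 1 element
```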
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case. https://github.com/huggingface/transformers/issues/37051
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@ArthurZucker @SunMarc @muellerzr
|
{
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37220/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37220/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37219
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37219/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37219/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37219/events
|
https://github.com/huggingface/transformers/issues/37219
| 2,967,425,685
|
I_kwDOCUB6oc6w31KV
| 37,219
|
RecurrentGemma crashes during inference for inputs longer than sliding window width
|
{
"login": "assafbk",
"id": 38749017,
"node_id": "MDQ6VXNlcjM4NzQ5MDE3",
"avatar_url": "https://avatars.githubusercontent.com/u/38749017?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/assafbk",
"html_url": "https://github.com/assafbk",
"followers_url": "https://api.github.com/users/assafbk/followers",
"following_url": "https://api.github.com/users/assafbk/following{/other_user}",
"gists_url": "https://api.github.com/users/assafbk/gists{/gist_id}",
"starred_url": "https://api.github.com/users/assafbk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/assafbk/subscriptions",
"organizations_url": "https://api.github.com/users/assafbk/orgs",
"repos_url": "https://api.github.com/users/assafbk/repos",
"events_url": "https://api.github.com/users/assafbk/events{/privacy}",
"received_events_url": "https://api.github.com/users/assafbk/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] |
closed
| false
| null |
[] | null |
[] | 2025-04-02T20:04:58
| 2025-04-22T10:21:17
| 2025-04-22T10:21:17
|
NONE
| null | null | null | null |
### System Info
**System Info:**
- `transformers` version: 4.50.3
- Platform: Linux-6.8.0-52-generic-x86_64-with-glibc2.35
- Python version: 3.10.16
- Huggingface_hub version: 0.30.1
- Safetensors version: 0.5.3
- Accelerate version: 1.6.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.5.1+cu118 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
- Using GPU in script?: yes
- GPU type: NVIDIA RTX A6000
### Who can help?
@ArthurZucker, @gante
### Information
- [x] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
Code snippet for reproduction:
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("google/recurrentgemma-9b-it")
model = AutoModelForCausalLM.from_pretrained("google/recurrentgemma-9b-it", device_map="cuda", torch_dtype=torch.float16)
input_text = "Write me a poem about Machine Learning." * 300 # This string is 2402 tokens long, which is larger than 2048, the sliding window attention width
input_ids = tokenizer(input_text, return_tensors="pt").to("cuda")
outputs = model.generate(**input_ids, max_new_tokens=20)
print(tokenizer.decode(outputs[0]))
```
Error message:
> Traceback (most recent call last):
> File "/data2/assaf/tmp/test_rg.py", line 13, in <module>
> outputs = model.generate(**input_ids, max_new_tokens=20)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
> return func(*args, **kwargs)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/transformers/generation/utils.py", line 2326, in generate
> result = self._sample(
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/transformers/generation/utils.py", line 3289, in _sample
> outputs = model_forward(**model_inputs, return_dict=True)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
> return self._call_impl(*args, **kwargs)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
> return forward_call(*args, **kwargs)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/transformers/models/recurrent_gemma/modeling_recurrent_gemma.py", line 852, in forward
> outputs = self.model(
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
> return self._call_impl(*args, **kwargs)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
> return forward_call(*args, **kwargs)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/transformers/models/recurrent_gemma/modeling_recurrent_gemma.py", line 717, in forward
> causal_mask = self._update_causal_mask(attention_mask, inputs_embeds, cache_position)
> File "/data2/assaf/conda_envs/recurrent_gemma_tmp/lib/python3.10/site-packages/transformers/models/recurrent_gemma/modeling_recurrent_gemma.py", line 764, in _update_causal_mask
> padding_mask = causal_mask[..., :mask_length].eq(0.0) * attention_mask[:, None, None, :].eq(0.0)
> RuntimeError: The size of tensor a (2048) must match the size of tensor b (2402) at non-singleton dimension 3
### Expected behavior
If the sequence is longer than the sliding window width (like it is now in the script) then the script crashes with the error message above.
If the sequence is shorter than the sliding window width (e.g. replace *300 by *200) then the script runs fine.
The bug was seen in transformers v4.50.3.
It does not reproduce on earlier transformers versions, such as v4.42.4.
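The shape clash can be reproduced with plain tensors, using the window width and prompt length from the traceback (no model involved; a sketch of the failing line only):

```python
import torch

# The cached causal mask covers only the sliding-window width (2048), while
# the attention mask covers the full prompt (2402 tokens), so broadcasting
# the last dimensions fails exactly as in the traceback.
window, prompt_len = 2048, 2402
causal_mask = torch.zeros(1, 1, 1, window)
attention_mask = torch.ones(1, prompt_len)

raised = False
try:
    # slicing past the end clamps to 2048, leaving 2048 vs 2402 to broadcast
    causal_mask[..., :prompt_len].eq(0.0) * attention_mask[:, None, None, :].eq(0.0)
except RuntimeError:
    raised = True
```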
|
{
"login": "manueldeprada",
"id": 6536835,
"node_id": "MDQ6VXNlcjY1MzY4MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6536835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/manueldeprada",
"html_url": "https://github.com/manueldeprada",
"followers_url": "https://api.github.com/users/manueldeprada/followers",
"following_url": "https://api.github.com/users/manueldeprada/following{/other_user}",
"gists_url": "https://api.github.com/users/manueldeprada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/manueldeprada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/manueldeprada/subscriptions",
"organizations_url": "https://api.github.com/users/manueldeprada/orgs",
"repos_url": "https://api.github.com/users/manueldeprada/repos",
"events_url": "https://api.github.com/users/manueldeprada/events{/privacy}",
"received_events_url": "https://api.github.com/users/manueldeprada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37219/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37219/timeline
| null |
completed
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
|
{
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
}
| false
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37218
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37218/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37218/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37218/events
|
https://github.com/huggingface/transformers/pull/37218
| 2,967,235,670
|
PR_kwDOCUB6oc6RIoz1
| 37,218
|
[CI] lazy loading external datasets
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-02T18:35:51
| 2025-04-03T08:57:48
| 2025-04-03T08:57:45
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37218",
"html_url": "https://github.com/huggingface/transformers/pull/37218",
"diff_url": "https://github.com/huggingface/transformers/pull/37218.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37218.patch",
"merged_at": "2025-04-03T08:57:45"
}
|
# What does this PR do?
Follow-up to #37202: add lazy loading to the class-level test dataset 🙌
With recent changes, any test command using a tester that inherits from `PipelineTesterMixin` will trigger the dataset loading for all pipeline tester classes, even if those datasets are never used. For instance, `py.test tests/models/llama/test_modeling_llama.py` now loads all datasets, causing the test time to rise substantially (from 17 to 24 seconds on my machine).
This PR adds lazy loading to the datasets, i.e. load a dataset only when a related test is run. It's still loaded once per pytest process, keeping the benefits from the previous PRs 🤗
Note: `setUpClass` is often used for this, but the pipeline tests have a complex inheritance system (related PR: https://github.com/huggingface/transformers/pull/37214)
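The pattern amounts to a class-level cache that defers the expensive load until first use (a sketch under illustrative names, not the PR's actual code):

```python
# Sketch of class-level lazy loading: the dataset is materialized the first
# time a test asks for it, then reused for the rest of the pytest process.
class PipelineTesterMixin:
    _dataset = None  # shared per class, per process

    @classmethod
    def get_dataset(cls):
        if cls._dataset is None:
            # the expensive load (e.g. datasets.load_dataset) would run here
            cls._dataset = ["example 1", "example 2"]
        return cls._dataset
```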
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37218/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37218/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37217
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37217/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37217/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37217/events
|
https://github.com/huggingface/transformers/pull/37217
| 2,967,107,272
|
PR_kwDOCUB6oc6RINZ7
| 37,217
|
[generate] prepare `attention_mask` outside `forward` whenever possible
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-02T17:39:20
| 2025-04-03T08:58:15
| 2025-04-03T08:58:15
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37217",
"html_url": "https://github.com/huggingface/transformers/pull/37217",
"diff_url": "https://github.com/huggingface/transformers/pull/37217.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37217.patch",
"merged_at": null
}
|
# What does this PR do?
WIP
For better use of `generate` + `flex_attention`
|
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37217/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37217/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37216
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37216/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37216/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37216/events
|
https://github.com/huggingface/transformers/pull/37216
| 2,967,061,369
|
PR_kwDOCUB6oc6RIDhB
| 37,216
|
Detect and use device context manager or global device in `from_pretrained`
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-02T17:15:05
| 2025-04-16T17:11:09
| 2025-04-15T07:59:20
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37216",
"html_url": "https://github.com/huggingface/transformers/pull/37216",
"diff_url": "https://github.com/huggingface/transformers/pull/37216.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37216.patch",
"merged_at": "2025-04-15T07:59:20"
}
|
# What does this PR do?
As per the title. See https://github.com/huggingface/transformers/pull/37213 as well.
Note that this will only ever be used when no `device_map` is provided; otherwise the `device_map` always takes precedence.
This behavior is aligned with what PyTorch does with the context manager/globally setting the default device. It is also what happens to the buffers (they are not initialized on meta during loading, so they go to whatever default device torch knows based on the context/global device), so it makes total sense IMO (see the example).
This script:
```python
import torch
from transformers import AutoModelForCausalLM
with torch.device(2):
model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-8B-Instruct", torch_dtype=torch.float16)
unique_devices = {v.device for v in model.parameters()} | {v.device for v in model.buffers()}
print(unique_devices)
```
returns
```python
>>> {device(type='cuda', index=2)}
```
And before, only the non-persistent buffers would be moved (because we entirely control the rest)
```python
>>> {device(type='cuda', index=2), device(type='cpu')}
```
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37216/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37216/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37215
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37215/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37215/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37215/events
|
https://github.com/huggingface/transformers/pull/37215
| 2,967,039,997
|
PR_kwDOCUB6oc6RH-33
| 37,215
|
A bit of cleaning 🧹🧹
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-02T17:04:18
| 2025-04-08T12:33:59
| 2025-04-08T12:33:58
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37215",
"html_url": "https://github.com/huggingface/transformers/pull/37215",
"diff_url": "https://github.com/huggingface/transformers/pull/37215.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37215.patch",
"merged_at": "2025-04-08T12:33:58"
}
|
# What does this PR do?
I stumbled upon 2 functions in `from_pretrained` that are not used anywhere in the codebase.
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37215/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37215/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37214
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37214/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37214/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37214/events
|
https://github.com/huggingface/transformers/pull/37214
| 2,966,958,056
|
PR_kwDOCUB6oc6RHtKy
| 37,214
|
:rotating_light::rotating_light::rotating_light: Replace SetUp with SetUpClass
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-02T16:26:07
| 2025-04-04T14:38:27
| 2025-04-04T14:38:26
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37214",
"html_url": "https://github.com/huggingface/transformers/pull/37214",
"diff_url": "https://github.com/huggingface/transformers/pull/37214.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37214.patch",
"merged_at": null
}
|
A lot of our test suites use `setUp` and `tearDown` methods. These are called **before and after every test**. In a lot of cases, this is unnecessary because the contents do not change between tests. This is particularly important when the setup has expensive steps, like initializing models or processors, or loading test files/images from the Hub.
I made a PR at #37209 to change this for Fuyu, which was frequently timing out or being throttled because it kept loading the same image by URL from the Hub. This PR reduced the runtime of the Fuyu test suite on my local machine to 30% of the previous amount, so I figured I'd try it for other classes too.
This is a repo-wide search-and-replace turning `setUp` into `setUpClass`. It will break **everything**, but I'm hoping that with some tweaks we can make the CI pass.
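A minimal stdlib sketch of the difference this PR exploits (the class and resource names are illustrative, not from the PR): `setUp` re-runs before every test method, while `setUpClass` runs once per class, so an expensive load happens only once:

```python
import unittest

class ExpensiveResource:
    """Stands in for a costly load (model, processor, Hub download) -- hypothetical."""
    load_count = 0

    def __init__(self):
        ExpensiveResource.load_count += 1

class PerTestSetup(unittest.TestCase):
    def setUp(self):  # runs before EVERY test method
        self.resource = ExpensiveResource()

    def test_a(self):
        self.assertIsNotNone(self.resource)

    def test_b(self):
        self.assertIsNotNone(self.resource)

class PerClassSetup(unittest.TestCase):
    @classmethod
    def setUpClass(cls):  # runs ONCE for the whole class
        cls.resource = ExpensiveResource()

    def test_a(self):
        self.assertIsNotNone(self.resource)

    def test_b(self):
        self.assertIsNotNone(self.resource)

loader = unittest.TestLoader()
runner = unittest.TextTestRunner(verbosity=0)

ExpensiveResource.load_count = 0
runner.run(loader.loadTestsFromTestCase(PerTestSetup))
per_test = ExpensiveResource.load_count   # loaded once per test

ExpensiveResource.load_count = 0
runner.run(loader.loadTestsFromTestCase(PerClassSetup))
per_class = ExpensiveResource.load_count  # loaded once per class
print(per_test, per_class)
```

One caveat of the switch: state set in `setUpClass` is shared by all tests in the class, so tests must not mutate it (which is presumably why the PR "will break everything" before per-case tweaks).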
|
{
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37214/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37214/timeline
| null | null | null | null | true
| true
|
https://api.github.com/repos/huggingface/transformers/issues/37213
|
https://api.github.com/repos/huggingface/transformers
|
https://api.github.com/repos/huggingface/transformers/issues/37213/labels{/name}
|
https://api.github.com/repos/huggingface/transformers/issues/37213/comments
|
https://api.github.com/repos/huggingface/transformers/issues/37213/events
|
https://github.com/huggingface/transformers/pull/37213
| 2,966,920,628
|
PR_kwDOCUB6oc6RHlJG
| 37,213
|
Fix test
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null |
[] | 2025-04-02T16:08:38
| 2025-04-03T08:24:36
| 2025-04-03T08:24:34
|
MEMBER
| null | null | false
|
{
"url": "https://api.github.com/repos/huggingface/transformers/pulls/37213",
"html_url": "https://github.com/huggingface/transformers/pull/37213",
"diff_url": "https://github.com/huggingface/transformers/pull/37213.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/37213.patch",
"merged_at": "2025-04-03T08:24:34"
}
|
# What does this PR do?
The device context manager is not the intended API to use with `from_pretrained`.
It used to work before, BUT ONLY with `low_cpu_mem_usage=False`, so it was inconsistent.
I will try to investigate whether we can easily detect the context and turn it into a simple `device_map`, however.
|
{
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/huggingface/transformers/issues/37213/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/huggingface/transformers/issues/37213/timeline
| null | null | null | null | true
| true
|