url string | repository_url string | labels_url string | comments_url string | events_url string | html_url string | id int64 | node_id string | number int64 | title string | user dict | labels list | state string | locked bool | assignee dict | assignees list | milestone null | comments list | created_at timestamp[ms] | updated_at timestamp[ms] | closed_at timestamp[ms] | author_association string | type dict | active_lock_reason null | draft bool | pull_request dict | body string | closed_by dict | reactions dict | timeline_url string | performed_via_github_app null | state_reason string | sub_issues_summary dict | issue_dependencies_summary dict | is_pull_request bool | is_closed bool |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/huggingface/transformers/issues/39637 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39637/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39637/comments | https://api.github.com/repos/huggingface/transformers/issues/39637/events | https://github.com/huggingface/transformers/issues/39637 | 3,260,241,704 | I_kwDOCUB6oc7CU1co | 39,637 | [BUG] Run 111B+ Teacher distributed inference and 8B Student distributed training on multi-node H200 GPUs using the Transformers Trainer without encountering OOM errors? | {
"login": "seona21",
"id": 169985436,
"node_id": "U_kgDOCiHFnA",
"avatar_url": "https://avatars.githubusercontent.com/u/169985436?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/seona21",
"html_url": "https://github.com/seona21",
"followers_url": "https://api.github.com/users/seona21/followers",
"following_url": "https://api.github.com/users/seona21/following{/other_user}",
"gists_url": "https://api.github.com/users/seona21/gists{/gist_id}",
"starred_url": "https://api.github.com/users/seona21/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/seona21/subscriptions",
"organizations_url": "https://api.github.com/users/seona21/orgs",
"repos_url": "https://api.github.com/users/seona21/repos",
"events_url": "https://api.github.com/users/seona21/events{/privacy}",
"received_events_url": "https://api.github.com/users/seona21/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-24T15:05:38 | 2025-09-01T08:03:18 | 2025-09-01T08:03:18 | NONE | null | null | null | null | Hello, first off, apologies if this information is already available elsewhere. I've searched through the documentation and existing issues but haven't found a clear answer to my question.
I have access to 2 to 4 nodes (16 to 32 GPUs in total), each equipped with 8x140GB H200 GPUs. My objective is to perform large-scale distributed inference using a massive 111B-parameter Teacher model (CohereLabs/c4ai-command-a-03-2025) and simultaneously conduct online knowledge distillation (soft-logit based) from this 111B Teacher model to a smaller 8B Student model (CohereLabs/c4ai-command-r7b-12-2024).
Is there a way to simultaneously run distributed inference for Teacher models larger than 111B and distributed training for Student models in a multi-node setup, utilizing Hugging Face Transformers' Trainer?
The Transformers version I'm using is v4.51.3. I've observed the use of `model = deepspeed.tp_model_init` within the `deepspeed_init` function in `src/transformers/integrations/deepspeed.py`. I attempted to apply this code, but it resulted in a `torch.distributed.DistBackendError`.
I would be very grateful if someone could explain what would be most suitable for my use case. A minimal working example would be the icing on the cake. Surely, if the Open LLM Leaderboard shows that online knowledge distillation (soft-logit) is possible with large models exceeding 111B, there must be a straightforward way to achieve what I want, but I'm unsure how everyone else does it.
For reference, below is the script I'm currently working with:
```shell
deepspeed --num_nodes 2 --num_gpus 8 \
    --hostfile $HOSTFILE \
    --master_addr $MASTER_ADDR \
    --master_port=62535 \
    train.py \
    --teacher CohereLabs/c4ai-command-a-03-2025 \
    --student CohereLabs/c4ai-command-r7b-12-2024 \
    --epochs 1 --batch_size 1 --seq_len 4096 --temperature 1.0 --max_samples 150 --lr 1e-6 2>&1 | tee -a "./train.log"
```
```python
import deepspeed
import torch.distributed as dist
import os, math, argparse, warnings, torch, random, multiprocessing as mp
from datasets import load_dataset, concatenate_datasets
from transformers import (AutoTokenizer, AutoModelForCausalLM,
                          PreTrainedTokenizerBase)
from torch.nn.utils.rnn import pad_sequence
import torch.nn.functional as F
from datetime import timedelta
from deepspeed.runtime.utils import see_memory_usage

os.environ["TOKENIZERS_PARALLELISM"] = "false"
os.environ.setdefault("NCCL_ASYNC_ERROR_HANDLING", "1")
warnings.filterwarnings("ignore", category=UserWarning)
mp.set_start_method("spawn", force=True)


def get_args():
    p = argparse.ArgumentParser()
    p.add_argument("--teacher", default="")
    p.add_argument("--student", default="")
    p.add_argument("--dataset", default="")
    p.add_argument("--split", default="train")
    p.add_argument("--epochs", type=int, default=1)
    p.add_argument("--batch_size", type=int, default=1,
                   help="per-GPU micro-batch")
    p.add_argument("--seq_len", type=int, default=4096)
    p.add_argument("--temperature", type=float, default=1.0)
    p.add_argument("--lr", type=float, default=1e-6)
    p.add_argument("--max_samples", type=int, default=0,
                   help="0 = use all samples")
    p.add_argument("--local_rank", type=int, default=-1,
                   help="deepspeed/torch launcher GPU index")
    p.add_argument("--cache_path", default="")
    p.add_argument("--hf_token", default="")
    p = deepspeed.add_config_arguments(p)
    return p.parse_args()


def main():
    timeout_seconds = 3600
    timeout_duration = timedelta(seconds=timeout_seconds)
    dist.init_process_group(
        backend="nccl",
        timeout=timeout_duration
    )
    args = get_args()
    deepspeed.init_distributed()
    rank, world = deepspeed.comm.get_rank(), deepspeed.comm.get_world_size()
    device = torch.device("cuda", deepspeed.comm.get_local_rank())

    # Tokenizer
    tokenizer = AutoTokenizer.from_pretrained(args.student,
                                              use_fast=True, trust_remote_code=True)
    if tokenizer.pad_token is None:
        tokenizer.pad_token = tokenizer.eos_token
    # align special-token ids with the tokens
    tokenizer.eos_token_id = tokenizer.convert_tokens_to_ids(tokenizer.eos_token)
    tokenizer.pad_token_id = tokenizer.convert_tokens_to_ids(tokenizer.pad_token)

    # Teacher (inference only)
    teacher_model = AutoModelForCausalLM.from_pretrained(
        args.teacher, torch_dtype=torch.bfloat16,
        low_cpu_mem_usage=True,
        trust_remote_code=True, device_map=None,
        cache_dir=args.cache_path, token=args.hf_token)
    see_memory_usage("After load model", force=True)
    teacher_model.config.eos_token_id = tokenizer.eos_token_id
    teacher_model.config.pad_token_id = tokenizer.pad_token_id
    teacher_engine = deepspeed.init_inference(
        teacher_model,
        mp_size=world,
        dtype=torch.bfloat16,
        replace_with_kernel_inject=True,
        replace_method="auto")
    see_memory_usage("After DS-inference init", force=True)
    teacher_engine.module.eval()
    teacher_engine.optimizer = None

    # Student
    student_model = AutoModelForCausalLM.from_pretrained(
        args.student, torch_dtype=torch.bfloat16,
        attn_implementation="flash_attention_2",
        trust_remote_code=True, cache_dir=args.cache_path, token=args.hf_token)
    student_model.config.eos_token_id = tokenizer.eos_token_id
    student_model.config.pad_token_id = tokenizer.pad_token_id

    # Dataset
    ds = [
        load_dataset("Raphael21/LogicKor_Aug_small_v2", split="train", data_dir="v0.1.1", streaming=False)
    ]
    ds = concatenate_datasets(ds).select_columns(["messages"])
    total_samples = args.max_samples or len(ds)
    total_steps = args.epochs * math.ceil(total_samples / args.batch_size)

    # DeepSpeed config
    ds_cfg = {
        "train_batch_size": args.batch_size * world,
        "gradient_accumulation_steps": 1,
        "bf16": {"enabled": True},
        "zero_optimization": {
            "stage": 3,
            "stage3_max_live_parameters": 1e9,
            "stage3_prefetch_bucket_size": 5e8,
            "stage3_param_persistence_threshold": 1e4,
            "overlap_comm": True,
            "contiguous_gradients": True,
            "allgather_bucket_size": 5e8,
            "reduce_bucket_size": 5e8,
            "offload_optimizer": {
                "device": "cpu",
                "pin_memory": True
            },
            "offload_param": {
                "device": "cpu",
                "pin_memory": True
            },
        },
        "activation_checkpointing": {
            "partition_activations": True,
            "contiguous_memory_optimization": True
        },
        "optimizer": {
            "type": "AdamW",
            "params": {
                "lr": args.lr,
                "betas": [0.9, 0.999],
                "eps": 1e-8,
                "weight_decay": 0.01
            }
        },
        "scheduler": {
            "type": "WarmupLR",
            "params": {
                "warmup_min_lr": 0,
                "warmup_max_lr": args.lr,
                "warmup_num_steps": int(0.1 * total_steps)
            }
        }
    }

    student_model = deepspeed.tp_model_init(
        model=student_model,
        tp_size=16,  # tensor-parallel degree
        dtype=torch.bfloat16
    )
    # NOTE: the training loop below uses `student_engine`, which the script
    # never created; a ZeRO engine is built here so the loop can run.
    student_engine, _, _, _ = deepspeed.initialize(
        model=student_model,
        model_parameters=student_model.parameters(),
        config=ds_cfg)
    if not hasattr(student_engine, "optimizer"):
        student_engine.optimizer = None

    # Debug messages
    if rank == 0:
        print("Configured with ZeRO-3, total_steps:", total_steps)

    # Data preprocessing
    def preprocess_batch(examples):
        prompt_key = "prompt"
        messages_key = "messages"
        ignore_index = -100
        max_length = min(tokenizer.model_max_length, 4096)

        def get_tokens_from_chat_template(messages_dict_or_str, add_gen_prompt, max_len=max_length):
            tokens = tokenizer.apply_chat_template(
                messages_dict_or_str,
                tokenize=True,
                add_generation_prompt=add_gen_prompt,
                truncation=True,
                max_length=max_len,
                padding="do_not_pad",
                return_tensors=None,
            )
            return tokens, [1] * len(tokens)

        results = {
            "input_ids": [],
            "attention_mask": [],
            "labels": [],
            "prompts": [],
            "prompt_attention_mask": [],
        }
        prompts = examples.get(prompt_key, [None] * len(examples[messages_key]))
        for message, prompt in zip(examples[messages_key], prompts):
            if prompt is None:
                prompt_messages = message[:-1]
                prompt_ids, prompt_attn = get_tokens_from_chat_template(prompt_messages, add_gen_prompt=True)
            else:
                # fixed: the original referenced an undefined `prompt_text`
                prompt_ids, prompt_attn = get_tokens_from_chat_template(prompt, add_gen_prompt=True)
            input_ids, attn_mask = get_tokens_from_chat_template(message, add_gen_prompt=False)
            label = [ignore_index] * len(input_ids)
            start_idx = len(prompt_ids)
            if start_idx < len(input_ids):
                for i in range(start_idx, len(input_ids)):
                    label[i] = input_ids[i]
            results["input_ids"].append(input_ids)
            results["attention_mask"].append(attn_mask)
            results["labels"].append(label)
            results["prompts"].append(prompt_ids)
            results["prompt_attention_mask"].append(prompt_attn)
        return results

    def chatml_collate_fn(batch, pad_token_id=0, ignore_index=-100, max_length=4096):
        def pad_and_truncate(seqs, pad_val, max_length):
            return torch.stack([
                torch.tensor(seq[:max_length] + [pad_val] * (max_length - len(seq)), dtype=torch.long)
                for seq in seqs
            ])
        fields = ["input_ids", "attention_mask", "labels", "prompts", "prompt_attention_mask"]
        pad_values = [pad_token_id, 0, ignore_index, pad_token_id, 0]
        return {
            field: pad_and_truncate([ex[field] for ex in batch], pad_val, max_length)
            for field, pad_val in zip(fields, pad_values)
        }

    if args.max_samples:
        ds = ds.select(range(args.max_samples))
    ds = ds.map(preprocess_batch, batched=True)
    loader = torch.utils.data.DataLoader(
        ds,
        batch_size=args.batch_size,
        shuffle=True,
        pin_memory=True,
        collate_fn=lambda x: chatml_collate_fn(
            x,
            pad_token_id=tokenizer.pad_token_id,
            ignore_index=-100,
            max_length=args.seq_len
        )
    )

    T = args.temperature
    for epoch in range(args.epochs):
        for step, batch in enumerate(loader):
            prompt_lengths_batch = batch["prompt_attention_mask"].sum(dim=1).cpu().tolist()
            prompt_lengths_tensor = torch.tensor(prompt_lengths_batch, device=device, dtype=torch.long)
            input_ids = batch["input_ids"].to(device)
            attn = batch["attention_mask"].to(device)
            labels_batch = batch["labels"].to(device)
            with torch.no_grad():
                teacher_logits = teacher_engine.module(
                    input_ids=input_ids,
                    attention_mask=attn,
                    use_cache=False
                ).logits
            student_logits = student_engine(
                input_ids=input_ids,
                attention_mask=attn,
                use_cache=False
            ).logits
            if rank == 0:
                sample_idx_to_inspect = 0
                original_input_ids = batch["input_ids"][sample_idx_to_inspect].cpu().tolist()
                original_labels_list = batch["labels"][sample_idx_to_inspect].cpu().tolist()
                # Student model predictions (argmax)
                student_predictions_ids = student_logits[sample_idx_to_inspect].argmax(dim=-1).cpu().tolist()
                decoded_student_predictions = [tokenizer.decode([t], skip_special_tokens=False) for t in student_predictions_ids]
                # Teacher model predictions (argmax)
                teacher_predictions_ids = teacher_logits[sample_idx_to_inspect].argmax(dim=-1).cpu().tolist()
                decoded_teacher_predictions = [tokenizer.decode([t], skip_special_tokens=False) for t in teacher_predictions_ids]
                print(f"Decoded Student Predictions: {''.join(decoded_student_predictions[:100])} ...")
                print(f"Decoded Teacher Predictions: {''.join(decoded_teacher_predictions[:100])} ...")
            shifted_student_logits = student_logits[:, :-1, :]
            shifted_teacher_logits = teacher_logits[:, :-1, :]
            shifted_labels = labels_batch[:, 1:]
            shifted_attention_mask = attn[:, 1:]
            current_seq_len = shifted_labels.size(1)
            response_mask = torch.zeros_like(shifted_labels, dtype=torch.bool)
            for i in range(args.batch_size):
                start_response_idx_in_shifted = prompt_lengths_tensor[i] - 1
                start_response_idx_in_shifted = max(0, start_response_idx_in_shifted)
                if start_response_idx_in_shifted < current_seq_len:
                    response_mask[i, start_response_idx_in_shifted:] = True
            shifted_attention_mask = shifted_attention_mask & response_mask
            # Apply temperature scaling
            student_logits_scaled = shifted_student_logits / args.temperature
            teacher_logits_scaled = shifted_teacher_logits / args.temperature
            # Compute log probabilities for student and teacher
            student_log_probs = F.log_softmax(student_logits_scaled, dim=-1)
            teacher_log_probs = F.log_softmax(teacher_logits_scaled, dim=-1)
            kd_loss = F.kl_div(student_log_probs, teacher_log_probs, reduction="none", log_target=True)
            kd_loss_per_token = kd_loss.sum(dim=-1)
            valid_labels_mask = (shifted_labels != -100)
            combined_mask = shifted_attention_mask & valid_labels_mask
            masked_kd_loss = kd_loss_per_token * combined_mask
            num_valid_tokens = combined_mask.sum()
            if num_valid_tokens > 0:
                kd_loss = masked_kd_loss.sum() / num_valid_tokens
            else:
                kd_loss = torch.tensor(0.0, device=device, requires_grad=True)
            # Cross-entropy loss
            ce_loss = F.cross_entropy(
                shifted_student_logits.view(-1, shifted_student_logits.size(-1)),  # (B*S, V)
                shifted_labels.view(-1),
                ignore_index=-100
            )
            if ce_loss.numel() == 0 or torch.isnan(ce_loss):
                ce_loss = torch.tensor(0.0, device=device, requires_grad=True)
            alpha = 0.5
            total_loss = alpha * ce_loss + (1 - alpha) * kd_loss
            student_engine.backward(total_loss)
            student_engine.step()
            # empty cache
            torch.cuda.empty_cache()
            if rank == 0 and step % 1 == 0:
                print(f"[Epoch {epoch}][{step}/{len(loader)}] total_loss = {total_loss.item():.4f}, ce_loss = {ce_loss.item():.4f}, kd_loss = {kd_loss.item():.4f}")

    # Save checkpoint
    student_engine.save_checkpoint("./save_checkpoint")


if __name__ == "__main__":
    main()
```
| {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39637/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39637/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39636 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39636/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39636/comments | https://api.github.com/repos/huggingface/transformers/issues/39636/events | https://github.com/huggingface/transformers/pull/39636 | 3,260,187,915 | PR_kwDOCUB6oc6geL8g | 39,636 | Fixes the BC | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-24T14:50:44 | 2025-07-25T16:41:23 | 2025-07-25T16:41:21 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39636",
"html_url": "https://github.com/huggingface/transformers/pull/39636",
"diff_url": "https://github.com/huggingface/transformers/pull/39636.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39636.patch",
"merged_at": "2025-07-25T16:41:21"
} | # What does this PR do?
Fix #39558
The bug fix now shows:
```python
Number of returned hidden states: 17
hidden state 0 values (slice): tensor([0.0045, 0.0166, 0.0210])
hidden state 1 values (slice): tensor([-0.0010, 0.0339, 0.0018])
hidden state 2 values (slice): tensor([ 0.2473, -0.5012, 1.5765])
hidden state 3 values (slice): tensor([ 0.2444, -0.5238, 1.5824])
hidden state 4 values (slice): tensor([ 0.2463, -0.5339, 1.5775])
hidden state 5 values (slice): tensor([ 0.2797, -0.5208, 1.6037])
hidden state 6 values (slice): tensor([ 0.2424, -0.4786, 1.6114])
hidden state 7 values (slice): tensor([ 0.2291, -0.4361, 1.5910])
hidden state 8 values (slice): tensor([ 0.2474, -0.3626, 1.5842])
hidden state 9 values (slice): tensor([ 0.2487, -0.3277, 1.4880])
hidden state 10 values (slice): tensor([ 0.1552, -0.2746, 1.4607])
hidden state 11 values (slice): tensor([ 0.1235, -0.2442, 1.3794])
hidden state 12 values (slice): tensor([ 0.1107, -0.2346, 1.3300])
hidden state 13 values (slice): tensor([ 0.0789, -0.2590, 1.3076])
hidden state 14 values (slice): tensor([ 0.3292, -0.3711, 1.1052])
hidden state 15 values (slice): tensor([ 0.4628, -0.6839, 1.1164])
hidden state 16 values (slice): tensor([ 0.2918, -0.4089, 1.8382])
```
This will not work for Gemma. For Gemma, the approach would be to record the hidden states before the forward call; it is a hack, but it can be done if this change breaks anyone's workflow:
```python
from functools import wraps

def make_capture_wrapper(module, orig_forward, key, index):
    @wraps(orig_forward)
    def wrapped_forward(*args, **kwargs):
        if "hidden_states" in collected_outputs:
            collected_outputs[key] += (args[0],)
        output = orig_forward(*args, **kwargs)
        if not isinstance(output, tuple):
            collected_outputs[key] += (output,)
        elif output[index] is not None:
            collected_outputs[key] += (output[index],)
        return output
    return wrapped_forward
```
something like that 😉
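To make the wrapping pattern above concrete, here is a torch-free toy sketch (the names and the toy "layer" are illustrative, not the actual transformers internals): the wrapper records a layer's input before the original forward runs, then records the output.

```python
from functools import wraps

collected_outputs = {"hidden_states": ()}

def make_capture_wrapper(orig_forward, key, index=0):
    # Wrap a layer's forward so the input hidden state is recorded before
    # the call and the selected output is recorded after it.
    @wraps(orig_forward)
    def wrapped_forward(*args, **kwargs):
        collected_outputs[key] += (args[0],)          # pre-forward capture
        output = orig_forward(*args, **kwargs)
        out = output[index] if isinstance(output, tuple) else output
        if out is not None:
            collected_outputs[key] += (out,)          # post-forward capture
        return output
    return wrapped_forward

# toy "layer" whose forward doubles its input
wrapped = make_capture_wrapper(lambda x: x * 2, "hidden_states")
print(wrapped(3), collected_outputs["hidden_states"])  # 6 (3, 6)
```

In the real case, `orig_forward` would be a decoder layer's bound `forward` and the captured values would be hidden-state tensors.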
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39636/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39636/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39635 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39635/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39635/comments | https://api.github.com/repos/huggingface/transformers/issues/39635/events | https://github.com/huggingface/transformers/pull/39635 | 3,260,049,918 | PR_kwDOCUB6oc6gdtr8 | 39,635 | Make pytorch examples UV-compatible | {
"login": "lhoestq",
"id": 42851186,
"node_id": "MDQ6VXNlcjQyODUxMTg2",
"avatar_url": "https://avatars.githubusercontent.com/u/42851186?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lhoestq",
"html_url": "https://github.com/lhoestq",
"followers_url": "https://api.github.com/users/lhoestq/followers",
"following_url": "https://api.github.com/users/lhoestq/following{/other_user}",
"gists_url": "https://api.github.com/users/lhoestq/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lhoestq/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lhoestq/subscriptions",
"organizations_url": "https://api.github.com/users/lhoestq/orgs",
"repos_url": "https://api.github.com/users/lhoestq/repos",
"events_url": "https://api.github.com/users/lhoestq/events{/privacy}",
"received_events_url": "https://api.github.com/users/lhoestq/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-24T14:10:44 | 2025-07-25T08:46:24 | 2025-07-25T08:46:22 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39635",
"html_url": "https://github.com/huggingface/transformers/pull/39635",
"diff_url": "https://github.com/huggingface/transformers/pull/39635.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39635.patch",
"merged_at": "2025-07-25T08:46:22"
... by adding the dependency headers
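For reference, a PEP 723 inline-metadata header (which `uv run` reads) looks roughly like this at the top of a script; the exact dependency list per example is an assumption here:

```
# /// script
# dependencies = [
#     "transformers @ git+https://github.com/huggingface/transformers",
#     "datasets",
#     "torch",
# ]
# ///
```

With such a header in place, `uv run example_script.py` builds an ephemeral environment with those dependencies before executing the script.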
I'm adding them to all the pytorch examples except the "question-answering" folder since they're not single file scripts.
I also updated release.py to update the UV dependencies accordingly:
- for dev version, require `transformers @ git+https://github.com/huggingface/transformers`
- for release version, require `transformers==VERSION` | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39635/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39635/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39634 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39634/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39634/comments | https://api.github.com/repos/huggingface/transformers/issues/39634/events | https://github.com/huggingface/transformers/pull/39634 | 3,259,978,140 | PR_kwDOCUB6oc6gdd6n | 39,634 | Reorder serving docs | {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-24T13:49:43 | 2025-08-05T06:43:07 | 2025-08-05T06:43:06 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39634",
"html_url": "https://github.com/huggingface/transformers/pull/39634",
"diff_url": "https://github.com/huggingface/transformers/pull/39634.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39634.patch",
"merged_at": "2025-08-05T06:43:06"
} | Initial reorganization of serving docs to have clearer, standalone examples | {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39634/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39634/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39633 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39633/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39633/comments | https://api.github.com/repos/huggingface/transformers/issues/39633/events | https://github.com/huggingface/transformers/pull/39633 | 3,259,667,663 | PR_kwDOCUB6oc6gcZaz | 39,633 | Support `typing.Literal` as type of tool parameters or return value | {
"login": "grf53",
"id": 5508921,
"node_id": "MDQ6VXNlcjU1MDg5MjE=",
"avatar_url": "https://avatars.githubusercontent.com/u/5508921?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/grf53",
"html_url": "https://github.com/grf53",
"followers_url": "https://api.github.com/users/grf53/followers",
"following_url": "https://api.github.com/users/grf53/following{/other_user}",
"gists_url": "https://api.github.com/users/grf53/gists{/gist_id}",
"starred_url": "https://api.github.com/users/grf53/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/grf53/subscriptions",
"organizations_url": "https://api.github.com/users/grf53/orgs",
"repos_url": "https://api.github.com/users/grf53/repos",
"events_url": "https://api.github.com/users/grf53/events{/privacy}",
"received_events_url": "https://api.github.com/users/grf53/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-24T12:09:01 | 2025-07-25T17:51:56 | 2025-07-25T17:51:28 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39633",
"html_url": "https://github.com/huggingface/transformers/pull/39633",
"diff_url": "https://github.com/huggingface/transformers/pull/39633.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39633.patch",
"merged_at": "2025-07-25T17:51:28"
} | # What does this PR do?
This PR adds support for using `typing.Literal` as a type hint in tool functions written in Python.
I realized that using `typing.Literal` for an argument's type hint is currently not supported while trying to use tools written as Python functions (see the ['Tools and RAG' documentation](https://huggingface.co/docs/transformers/v4.51.3/en/chat_extras)).
The error I got is:
```
in _parse_type_hint
raise TypeHintParsingException("Couldn't parse this type hint, likely due to a custom class or object: ", hint)
transformers.utils.chat_template_utils.TypeHintParsingException: ("Couldn't parse this type hint, likely due to a custom class or object: ", typing.Literal[...])
```
Fortunately, there is already a mechanism to deal with 'complex' type hints like `list[str]` and `Union[List[int], Dict[int, str]]`.
So I added a case for `origin is Literal`: putting its `args` as the value of the `"enum"` key in the JSON schema is sufficient.
Including a `"type"` key there was not a good idea, because we could accidentally filter out a valid case when the types in `args` are heterogeneous. Still, rough type validation could be valuable when no static typing tool is in use.
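A minimal, hypothetical sketch of the mapping described above (this is not the actual `transformers` implementation; the function name and the set of handled types are illustrative):

```python
from typing import Literal, Union, get_args, get_origin

def parse_type_hint(hint):
    """Map a Python type hint to a JSON-schema fragment (simplified sketch)."""
    origin = get_origin(hint)
    if origin is Literal:
        # Emit the Literal's args as an "enum", without a "type" key: the
        # allowed values may be heterogeneous (e.g. Literal[1, "auto"]), so
        # guessing a single JSON type could reject valid inputs.
        return {"enum": list(get_args(hint))}
    if origin is Union:
        return {"anyOf": [parse_type_hint(a) for a in get_args(hint)]}
    basic = {str: "string", int: "integer", float: "number", bool: "boolean"}
    if hint in basic:
        return {"type": basic[hint]}
    raise ValueError(f"Couldn't parse this type hint: {hint}")
```

With this, `parse_type_hint(Literal["celsius", "fahrenheit"])` yields `{"enum": ["celsius", "fahrenheit"]}`.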
Additionally, I noticed a glitch in the related function `_get_json_schema_type(param_type: str)`: the type hint of the `param_type` argument needs to be `type` instead of `str`. This tiny change is also included in this PR.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
@ArthurZucker @Rocketknight1
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39633/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39633/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39632 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39632/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39632/comments | https://api.github.com/repos/huggingface/transformers/issues/39632/events | https://github.com/huggingface/transformers/pull/39632 | 3,259,437,917 | PR_kwDOCUB6oc6gbm-o | 39,632 | fix dead NVIDIA link | {
"login": "lidanserj",
"id": 158493830,
"node_id": "U_kgDOCXJshg",
"avatar_url": "https://avatars.githubusercontent.com/u/158493830?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lidanserj",
"html_url": "https://github.com/lidanserj",
"followers_url": "https://api.github.com/users/lidanserj/followers",
"following_url": "https://api.github.com/users/lidanserj/following{/other_user}",
"gists_url": "https://api.github.com/users/lidanserj/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lidanserj/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lidanserj/subscriptions",
"organizations_url": "https://api.github.com/users/lidanserj/orgs",
"repos_url": "https://api.github.com/users/lidanserj/repos",
"events_url": "https://api.github.com/users/lidanserj/events{/privacy}",
"received_events_url": "https://api.github.com/users/lidanserj/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-07-24T10:46:19 | 2025-07-25T17:16:40 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39632",
"html_url": "https://github.com/huggingface/transformers/pull/39632",
"diff_url": "https://github.com/huggingface/transformers/pull/39632.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39632.patch",
"merged_at": null
} | Hey team! Found and fixed a dead link in `src/transformers/models/mamba/modeling_mamba.py`:
https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py - old
https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/legacy/model/gpt_model.py - new | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39632/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39632/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39631 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39631/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39631/comments | https://api.github.com/repos/huggingface/transformers/issues/39631/events | https://github.com/huggingface/transformers/pull/39631 | 3,259,252,484 | PR_kwDOCUB6oc6ga-u6 | 39,631 | [serve] Add speech-to-text | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/followers",
"following_url": "https://api.github.com/users/gante/following{/other_user}",
"gists_url": "https://api.github.com/users/gante/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gante/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gante/subscriptions",
"organizations_url": "https://api.github.com/users/gante/orgs",
"repos_url": "https://api.github.com/users/gante/repos",
"events_url": "https://api.github.com/users/gante/events{/privacy}",
"received_events_url": "https://api.github.com/users/gante/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-07-24T09:46:39 | 2025-07-24T09:46:39 | null | MEMBER | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39631",
"html_url": "https://github.com/huggingface/transformers/pull/39631",
"diff_url": "https://github.com/huggingface/transformers/pull/39631.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39631.patch",
"merged_at": null
} | # What does this PR do?
WIP, needs #39454 to be merged first.
This PR adds speech-to-text (i.e. `v1/audio/speech`) to `transformers serve`.
⚠️ So far, I've only tested it with `sesame/csm-1b`. It probably won't work with other models at the moment, as each model has different processing abstractions | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39631/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39631/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39630 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39630/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39630/comments | https://api.github.com/repos/huggingface/transformers/issues/39630/events | https://github.com/huggingface/transformers/pull/39630 | 3,259,185,235 | PR_kwDOCUB6oc6gawSC | 39,630 | Rename huggingface_cli to hf | {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-24T09:27:29 | 2025-07-25T12:10:06 | 2025-07-25T12:10:04 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39630",
"html_url": "https://github.com/huggingface/transformers/pull/39630",
"diff_url": "https://github.com/huggingface/transformers/pull/39630.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39630.patch",
"merged_at": "2025-07-25T12:10:04"
} | Adapt the docs to `huggingface_hub` PR https://github.com/huggingface/huggingface_hub/pull/3229 | {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39630/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39630/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39629 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39629/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39629/comments | https://api.github.com/repos/huggingface/transformers/issues/39629/events | https://github.com/huggingface/transformers/pull/39629 | 3,259,165,893 | PR_kwDOCUB6oc6gasFU | 39,629 | [processors] add tests for helper fn | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-24T09:21:30 | 2025-07-28T09:41:59 | 2025-07-28T09:41:59 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39629",
"html_url": "https://github.com/huggingface/transformers/pull/39629",
"diff_url": "https://github.com/huggingface/transformers/pull/39629.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39629.patch",
"merged_at": "2025-07-28T09:41:59"
} | # What does this PR do?
As per the title, so no one deletes it accidentally 😄 | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39629/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39629/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39628 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39628/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39628/comments | https://api.github.com/repos/huggingface/transformers/issues/39628/events | https://github.com/huggingface/transformers/pull/39628 | 3,259,111,885 | PR_kwDOCUB6oc6gagSZ | 39,628 | 🌐 [i18n-KO] Translated '<text-to-speech>.md' to Korean | {
"login": "taemincode",
"id": 187865781,
"node_id": "U_kgDOCzKatQ",
"avatar_url": "https://avatars.githubusercontent.com/u/187865781?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taemincode",
"html_url": "https://github.com/taemincode",
"followers_url": "https://api.github.com/users/taemincode/followers",
"following_url": "https://api.github.com/users/taemincode/following{/other_user}",
"gists_url": "https://api.github.com/users/taemincode/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taemincode/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taemincode/subscriptions",
"organizations_url": "https://api.github.com/users/taemincode/orgs",
"repos_url": "https://api.github.com/users/taemincode/repos",
"events_url": "https://api.github.com/users/taemincode/events{/privacy}",
"received_events_url": "https://api.github.com/users/taemincode/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-24T09:05:18 | 2025-07-29T08:16:48 | 2025-07-29T08:16:48 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39628",
"html_url": "https://github.com/huggingface/transformers/pull/39628",
"diff_url": "https://github.com/huggingface/transformers/pull/39628.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39628.patch",
"merged_at": null
} | <!-- Please title the PR "🌐 [i18n-KO] Translated `<your_file>.md` to Korean"! -->
# What does this PR do?
Translated the `<text-to-speech>.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Once all the checks above are complete, mention the team members you'd like to request a review from below! -->
Could you please review this PR?
@4N3MONE, @yijun-lee, @jungnerd , @harheem
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. Only after the team review is finished, reveal the comment below that requests a review from Hugging Face staff! --> | {
"login": "taemincode",
"id": 187865781,
"node_id": "U_kgDOCzKatQ",
"avatar_url": "https://avatars.githubusercontent.com/u/187865781?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taemincode",
"html_url": "https://github.com/taemincode",
"followers_url": "https://api.github.com/users/taemincode/followers",
"following_url": "https://api.github.com/users/taemincode/following{/other_user}",
"gists_url": "https://api.github.com/users/taemincode/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taemincode/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taemincode/subscriptions",
"organizations_url": "https://api.github.com/users/taemincode/orgs",
"repos_url": "https://api.github.com/users/taemincode/repos",
"events_url": "https://api.github.com/users/taemincode/events{/privacy}",
"received_events_url": "https://api.github.com/users/taemincode/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39628/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39628/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39627 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39627/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39627/comments | https://api.github.com/repos/huggingface/transformers/issues/39627/events | https://github.com/huggingface/transformers/issues/39627 | 3,259,031,305 | I_kwDOCUB6oc7CQN8J | 39,627 | [XPU] Model get OOM when loading models | {
"login": "Stonepia",
"id": 12094956,
"node_id": "MDQ6VXNlcjEyMDk0OTU2",
"avatar_url": "https://avatars.githubusercontent.com/u/12094956?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Stonepia",
"html_url": "https://github.com/Stonepia",
"followers_url": "https://api.github.com/users/Stonepia/followers",
"following_url": "https://api.github.com/users/Stonepia/following{/other_user}",
"gists_url": "https://api.github.com/users/Stonepia/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Stonepia/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Stonepia/subscriptions",
"organizations_url": "https://api.github.com/users/Stonepia/orgs",
"repos_url": "https://api.github.com/users/Stonepia/repos",
"events_url": "https://api.github.com/users/Stonepia/events{/privacy}",
"received_events_url": "https://api.github.com/users/Stonepia/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-24T08:39:59 | 2025-07-29T14:06:53 | 2025-07-29T14:06:53 | CONTRIBUTOR | null | null | null | null | When `caching_allocator_warmup()` tries to allocate memory, XPU lacks logic to query the available device memory, so it hits an OOM here:
https://github.com/huggingface/transformers/blob/947a37e8f5bc50bc0e9a77c0d16b038adcb056d0/src/transformers/modeling_utils.py#L6136-L6152
We need to add a similar check for XPU as well.
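As a rough illustration of the kind of guard this would need, here is a hypothetical, device-agnostic sketch with the free-memory query stubbed out as a plain integer (the real fix would query the device, e.g. via `torch.xpu.mem_get_info()`; the function name, the safety margin, and the return-value contract are all assumptions for illustration):

```python
def warmup_byte_count(requested_bytes, device_free_bytes, factor=2):
    """Cap the caching-allocator warmup allocation to what the device can
    actually hold, instead of allocating requested_bytes // factor
    unconditionally (which OOMs when the model is larger than VRAM)."""
    planned = requested_bytes // factor
    # Leave a small safety margin rather than filling the device exactly.
    ceiling = int(device_free_bytes * 0.9)
    return min(planned, ceiling)
```

For the numbers in the log below (`byte_count` of ~29.5 GB on a ~11.6 GiB card), such a cap would keep the warmup allocation within the device's capacity instead of raising an OOM.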
### Example Error Msg
For example, when running
```
allocate byte_count: 29540067328 factor: 4 byte_count//factor: 7385016832
Traceback (most recent call last):
File "D:\mint\modelzoo\models_v2\pytorch\llama\inference\cpu\inductor\run_generation.py", line 149, in <module>
model = model_class[0].from_pretrained(
File "C:\Users\ai01\miniforge3\envs\ao\lib\site-packages\transformers\models\auto\auto_factory.py", line 571, in from_pretrained
return model_class.from_pretrained(
File "C:\Users\ai01\miniforge3\envs\ao\lib\site-packages\transformers\modeling_utils.py", line 279, in _wrapper
return func(*args, **kwargs)
File "C:\Users\ai01\miniforge3\envs\ao\lib\site-packages\transformers\modeling_utils.py", line 4399, in from_pretrained
) = cls._load_pretrained_model(
File "C:\Users\ai01\miniforge3\envs\ao\lib\site-packages\transformers\modeling_utils.py", line 4793, in _load_pretrained_model
caching_allocator_warmup(model_to_load, expanded_device_map, factor=2 if hf_quantizer is None else 4)
File "C:\Users\ai01\miniforge3\envs\ao\lib\site-packages\transformers\modeling_utils.py", line 5805, in caching_allocator_warmup
_ = torch.empty(byte_count // factor, dtype=torch.float16, device=device, requires_grad=False)
torch.OutOfMemoryError: XPU out of memory. Tried to allocate 13.76 GiB. GPU 0 has a total capacity of 11.60 GiB. Of the allocated memory 0 bytes is allocated by PyTorch, and 0 bytes is reserved by PyTorch but unallocated. Please use `empty_cache` to release all unoccupied cached memory.
```
### Additional Info:
1. Currently, for BMG, there is a known bug in `torch.xpu.mem_get_info()`: https://github.com/pytorch/pytorch/issues/157989
2. There is an ongoing PR to unify the query API as `torch.accelerator.mem_get_info()`: https://github.com/pytorch/pytorch/pull/156812 | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/followers",
"following_url": "https://api.github.com/users/ydshieh/following{/other_user}",
"gists_url": "https://api.github.com/users/ydshieh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ydshieh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ydshieh/subscriptions",
"organizations_url": "https://api.github.com/users/ydshieh/orgs",
"repos_url": "https://api.github.com/users/ydshieh/repos",
"events_url": "https://api.github.com/users/ydshieh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ydshieh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39627/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39627/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39626 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39626/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39626/comments | https://api.github.com/repos/huggingface/transformers/issues/39626/events | https://github.com/huggingface/transformers/pull/39626 | 3,259,026,278 | PR_kwDOCUB6oc6gaNjH | 39,626 | 🌐 [i18n-KO] Translated `text-to-speech.md` to Korean | {
"login": "taemincode",
"id": 187865781,
"node_id": "U_kgDOCzKatQ",
"avatar_url": "https://avatars.githubusercontent.com/u/187865781?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taemincode",
"html_url": "https://github.com/taemincode",
"followers_url": "https://api.github.com/users/taemincode/followers",
"following_url": "https://api.github.com/users/taemincode/following{/other_user}",
"gists_url": "https://api.github.com/users/taemincode/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taemincode/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taemincode/subscriptions",
"organizations_url": "https://api.github.com/users/taemincode/orgs",
"repos_url": "https://api.github.com/users/taemincode/repos",
"events_url": "https://api.github.com/users/taemincode/events{/privacy}",
"received_events_url": "https://api.github.com/users/taemincode/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-24T08:38:17 | 2025-07-24T08:48:51 | 2025-07-24T08:48:51 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39626",
"html_url": "https://github.com/huggingface/transformers/pull/39626",
"diff_url": "https://github.com/huggingface/transformers/pull/39626.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39626.patch",
"merged_at": null
} | <!-- Please title the PR "🌐 [i18n-KO] Translated `<your_file>.md` to Korean"! -->
# What does this PR do?
Translated the `<text-to-speech>.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Once all the checks above are complete, mention the team members you'd like to request a review from below! -->
Could you please review this PR?
@4N3MONE, @yijun-lee, @jungnerd , @harheem
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. Only after the team review is finished, reveal the comment below that requests a review from Hugging Face staff! --> | {
"login": "taemincode",
"id": 187865781,
"node_id": "U_kgDOCzKatQ",
"avatar_url": "https://avatars.githubusercontent.com/u/187865781?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taemincode",
"html_url": "https://github.com/taemincode",
"followers_url": "https://api.github.com/users/taemincode/followers",
"following_url": "https://api.github.com/users/taemincode/following{/other_user}",
"gists_url": "https://api.github.com/users/taemincode/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taemincode/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taemincode/subscriptions",
"organizations_url": "https://api.github.com/users/taemincode/orgs",
"repos_url": "https://api.github.com/users/taemincode/repos",
"events_url": "https://api.github.com/users/taemincode/events{/privacy}",
"received_events_url": "https://api.github.com/users/taemincode/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39626/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39626/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39625 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39625/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39625/comments | https://api.github.com/repos/huggingface/transformers/issues/39625/events | https://github.com/huggingface/transformers/pull/39625 | 3,259,021,643 | PR_kwDOCUB6oc6gaMkU | 39,625 | Fix: allow Union[str, dict, None] fields like deepspeed to be passed via CLI | {
"login": "JH-debug",
"id": 56110972,
"node_id": "MDQ6VXNlcjU2MTEwOTcy",
"avatar_url": "https://avatars.githubusercontent.com/u/56110972?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JH-debug",
"html_url": "https://github.com/JH-debug",
"followers_url": "https://api.github.com/users/JH-debug/followers",
"following_url": "https://api.github.com/users/JH-debug/following{/other_user}",
"gists_url": "https://api.github.com/users/JH-debug/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JH-debug/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JH-debug/subscriptions",
"organizations_url": "https://api.github.com/users/JH-debug/orgs",
"repos_url": "https://api.github.com/users/JH-debug/repos",
"events_url": "https://api.github.com/users/JH-debug/events{/privacy}",
"received_events_url": "https://api.github.com/users/JH-debug/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-07-24T08:36:34 | 2025-07-24T14:51:05 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39625",
"html_url": "https://github.com/huggingface/transformers/pull/39625",
"diff_url": "https://github.com/huggingface/transformers/pull/39625.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39625.patch",
"merged_at": null
} | This PR fixes an issue in `HfArgumentParser` that causes a `ValueError` when parsing fields of type `Union[str, dict, None]` from the command line, such as the `deepspeed` argument in `TrainingArguments`.
The original parser rejected this valid type signature, even though it is commonly used to support both JSON config files (via path strings) and dictionary input. This PR adds a specific check to treat such fields as `str` when parsed via CLI, while still allowing JSON/dict parsing from file-based or programmatic inputs.
### Motivation & Context
When passing `--deepspeed path/to/ds_config.json` from the CLI, the parser raised the following error:
```ValueError: Only Union[X, NoneType] (i.e., Optional[X]) is allowed for Union ...```
This was because `Union[str, dict, None]` didn't meet the parser’s strict rule of "only one type allowed" in unions. However, this pattern is standard in transformers for flexible inputs (e.g., both string path and dict configs).
This patch modifies `HfArgumentParser._parse_dataclass_field()` to detect this specific case and allow it.
### Fixes
Closes a CLI parsing limitation for common use cases involving fields like:
`deepspeed: Optional[Union[str, dict]] = field(default=None, ...)`
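To illustrate the intended CLI behavior, here is a minimal, self-contained sketch using plain `argparse` and `typing` (not the actual `HfArgumentParser` code) of accepting a `Union[str, dict, None]` field as a `str` on the command line:

```python
import argparse
from typing import Union, get_args, get_origin

def cli_type_for(field_type):
    """Map a dataclass field annotation to an argparse-friendly type.

    Union[str, dict, None] is accepted and parsed as str, mirroring the
    behavior this PR proposes; other multi-type unions are still rejected.
    """
    if get_origin(field_type) is Union:
        members = set(get_args(field_type)) - {type(None)}
        if members == {str, dict}:
            return str  # CLI input is a config-file path; dicts come from code
        if len(members) == 1:
            return members.pop()
        raise ValueError(f"Unsupported Union: {field_type}")
    return field_type

parser = argparse.ArgumentParser()
parser.add_argument("--deepspeed", type=cli_type_for(Union[str, dict, None]), default=None)
args = parser.parse_args(["--deepspeed", "path/to/ds_config.json"])
print(args.deepspeed)  # path/to/ds_config.json
```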
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39625/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39625/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39624 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39624/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39624/comments | https://api.github.com/repos/huggingface/transformers/issues/39624/events | https://github.com/huggingface/transformers/pull/39624 | 3,258,987,227 | PR_kwDOCUB6oc6gaFJq | 39,624 | Translated text-to-speech into Korean | {
"login": "taemincode",
"id": 187865781,
"node_id": "U_kgDOCzKatQ",
"avatar_url": "https://avatars.githubusercontent.com/u/187865781?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taemincode",
"html_url": "https://github.com/taemincode",
"followers_url": "https://api.github.com/users/taemincode/followers",
"following_url": "https://api.github.com/users/taemincode/following{/other_user}",
"gists_url": "https://api.github.com/users/taemincode/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taemincode/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taemincode/subscriptions",
"organizations_url": "https://api.github.com/users/taemincode/orgs",
"repos_url": "https://api.github.com/users/taemincode/repos",
"events_url": "https://api.github.com/users/taemincode/events{/privacy}",
"received_events_url": "https://api.github.com/users/taemincode/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-24T08:24:18 | 2025-07-24T08:25:12 | 2025-07-24T08:25:12 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39624",
"html_url": "https://github.com/huggingface/transformers/pull/39624",
"diff_url": "https://github.com/huggingface/transformers/pull/39624.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39624.patch",
"merged_at": null
} | <!-- Please title the PR "🌐 [i18n-KO] Translated `<your_file>.md` to Korean"! -->
# What does this PR do?
Translated the `<text-to-speech>.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Once all the checks above are complete, please mention the team members you would like to request a review from below! -->
May you please review this PR?
@4N3MONE, @yijun-lee, @jungnerd , @harheem
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. Please reveal the comment below requesting a review from Hugging Face staff only after the review with your team members is complete! --> | {
"login": "taemincode",
"id": 187865781,
"node_id": "U_kgDOCzKatQ",
"avatar_url": "https://avatars.githubusercontent.com/u/187865781?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taemincode",
"html_url": "https://github.com/taemincode",
"followers_url": "https://api.github.com/users/taemincode/followers",
"following_url": "https://api.github.com/users/taemincode/following{/other_user}",
"gists_url": "https://api.github.com/users/taemincode/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taemincode/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taemincode/subscriptions",
"organizations_url": "https://api.github.com/users/taemincode/orgs",
"repos_url": "https://api.github.com/users/taemincode/repos",
"events_url": "https://api.github.com/users/taemincode/events{/privacy}",
"received_events_url": "https://api.github.com/users/taemincode/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39624/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39624/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39623 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39623/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39623/comments | https://api.github.com/repos/huggingface/transformers/issues/39623/events | https://github.com/huggingface/transformers/pull/39623 | 3,258,773,772 | PR_kwDOCUB6oc6gZWxm | 39,623 | fix tensor device when loading state dict | {
"login": "jiqing-feng",
"id": 107918818,
"node_id": "U_kgDOBm614g",
"avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jiqing-feng",
"html_url": "https://github.com/jiqing-feng",
"followers_url": "https://api.github.com/users/jiqing-feng/followers",
"following_url": "https://api.github.com/users/jiqing-feng/following{/other_user}",
"gists_url": "https://api.github.com/users/jiqing-feng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jiqing-feng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jiqing-feng/subscriptions",
"organizations_url": "https://api.github.com/users/jiqing-feng/orgs",
"repos_url": "https://api.github.com/users/jiqing-feng/repos",
"events_url": "https://api.github.com/users/jiqing-feng/events{/privacy}",
"received_events_url": "https://api.github.com/users/jiqing-feng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-24T07:02:41 | 2025-08-14T06:46:44 | 2025-08-14T06:46:44 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39623",
"html_url": "https://github.com/huggingface/transformers/pull/39623",
"diff_url": "https://github.com/huggingface/transformers/pull/39623.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39623.patch",
"merged_at": null
} | The tensor device is wrong when `device_map="auto"`
```python
from transformers import AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("hugging-quants/Meta-Llama-3.1-8B-Instruct-BNB-NF4", device_map="auto")
```
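For reference, here is a hypothetical, stdlib-only sketch (not the actual `transformers` code) of how a parameter's expected device can be resolved from a `device_map` by longest-prefix match on the module name, which is the semantics Accelerate-style device maps assume:

```python
def resolve_param_device(param_name: str, device_map: dict):
    """Return the device for a parameter by longest matching module prefix.

    Assumed semantics: device_map keys are module paths ("" matches
    everything), and the most specific prefix wins.
    """
    best, best_len = None, -1
    for prefix, device in device_map.items():
        if prefix == "" or param_name == prefix or param_name.startswith(prefix + "."):
            if len(prefix) > best_len:
                best, best_len = device, len(prefix)
    if best is None:
        raise ValueError(f"No device found for {param_name}")
    return best

device_map = {"": "cpu", "model.layers.0": 0, "lm_head": 0}
print(resolve_param_device("model.layers.0.self_attn.q_proj.weight", device_map))  # 0
print(resolve_param_device("model.embed_tokens.weight", device_map))  # cpu
```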
The [tensor_device](https://github.com/huggingface/transformers/blob/v4.53.3/src/transformers/modeling_utils.py#L759) turns out to be `"cpu"`, which is not expected. This PR fixes the tensor device and param device resolution based on the device_map. | {
"login": "jiqing-feng",
"id": 107918818,
"node_id": "U_kgDOBm614g",
"avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jiqing-feng",
"html_url": "https://github.com/jiqing-feng",
"followers_url": "https://api.github.com/users/jiqing-feng/followers",
"following_url": "https://api.github.com/users/jiqing-feng/following{/other_user}",
"gists_url": "https://api.github.com/users/jiqing-feng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jiqing-feng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jiqing-feng/subscriptions",
"organizations_url": "https://api.github.com/users/jiqing-feng/orgs",
"repos_url": "https://api.github.com/users/jiqing-feng/repos",
"events_url": "https://api.github.com/users/jiqing-feng/events{/privacy}",
"received_events_url": "https://api.github.com/users/jiqing-feng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39623/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39623/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39622 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39622/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39622/comments | https://api.github.com/repos/huggingface/transformers/issues/39622/events | https://github.com/huggingface/transformers/pull/39622 | 3,257,948,957 | PR_kwDOCUB6oc6gWl95 | 39,622 | revert behavior of _prepare_from_posids | {
"login": "winglian",
"id": 381258,
"node_id": "MDQ6VXNlcjM4MTI1OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/winglian",
"html_url": "https://github.com/winglian",
"followers_url": "https://api.github.com/users/winglian/followers",
"following_url": "https://api.github.com/users/winglian/following{/other_user}",
"gists_url": "https://api.github.com/users/winglian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/winglian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/winglian/subscriptions",
"organizations_url": "https://api.github.com/users/winglian/orgs",
"repos_url": "https://api.github.com/users/winglian/repos",
"events_url": "https://api.github.com/users/winglian/events{/privacy}",
"received_events_url": "https://api.github.com/users/winglian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 6202871275,
"node_id": "LA_kwDOCUB6oc8AAAABcbhN6w",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Flash%20Attention",
"name": "Flash Attention",
"color": "201FF8",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-07-23T23:04:47 | 2025-07-24T18:31:01 | 2025-07-24T18:31:01 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39622",
"html_url": "https://github.com/huggingface/transformers/pull/39622",
"diff_url": "https://github.com/huggingface/transformers/pull/39622.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39622.patch",
"merged_at": "2025-07-24T18:31:00"
} | # What does this PR do?
#39474 causes the grad norm to become NaN during training. We don't need to revert the whole thing, but reverting just this function seems to fix the regression.
@ArthurZucker @SunMarc
pre-#39474 loss
<img width="957" height="496" alt="Screenshot 2025-07-23 at 5 28 24 PM" src="https://github.com/user-attachments/assets/f319791b-4f3b-489a-b7dd-dc0e2eb3112f" />
post-#39474 loss
<img width="1290" height="282" alt="Screenshot 2025-07-23 at 5 31 19 PM" src="https://github.com/user-attachments/assets/ee5fe4f9-5c42-4c7e-8f78-c6e14f960737" />
with this fix
<img width="1000" height="301" alt="Screenshot 2025-07-23 at 7 07 01 PM" src="https://github.com/user-attachments/assets/1dec04d5-5378-4f05-b57d-8a81a403ec28" />
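For background, `_prepare_from_posids` derives flash-attention varlen metadata (cumulative sequence lengths and the max sequence length) from packed `position_ids`, where each reset of the position counter to 0 marks the start of a new sequence. An illustrative, stdlib-only sketch of that computation (a hypothetical helper, not the function being reverted):

```python
def cu_seqlens_from_position_ids(position_ids):
    """Compute cumulative sequence lengths and the max sequence length
    for packed sequences, given flattened position_ids.

    A new sequence starts wherever the position counter resets to 0.
    """
    starts = [i for i, p in enumerate(position_ids) if p == 0]
    cu_seqlens = starts + [len(position_ids)]
    max_seqlen = max(b - a for a, b in zip(cu_seqlens, cu_seqlens[1:]))
    return cu_seqlens, max_seqlen

# Three packed sequences of lengths 3, 2 and 4:
print(cu_seqlens_from_position_ids([0, 1, 2, 0, 1, 0, 1, 2, 3]))
# ([0, 3, 5, 9], 4)
```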
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39622/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39622/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39621 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39621/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39621/comments | https://api.github.com/repos/huggingface/transformers/issues/39621/events | https://github.com/huggingface/transformers/pull/39621 | 3,257,650,042 | PR_kwDOCUB6oc6gVj-I | 39,621 | Fix EfficientLoFTR model id in tests | {
"login": "sbucaille",
"id": 24275548,
"node_id": "MDQ6VXNlcjI0Mjc1NTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/24275548?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sbucaille",
"html_url": "https://github.com/sbucaille",
"followers_url": "https://api.github.com/users/sbucaille/followers",
"following_url": "https://api.github.com/users/sbucaille/following{/other_user}",
"gists_url": "https://api.github.com/users/sbucaille/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sbucaille/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sbucaille/subscriptions",
"organizations_url": "https://api.github.com/users/sbucaille/orgs",
"repos_url": "https://api.github.com/users/sbucaille/repos",
"events_url": "https://api.github.com/users/sbucaille/events{/privacy}",
"received_events_url": "https://api.github.com/users/sbucaille/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5769473378,
"node_id": "LA_kwDOCUB6oc8AAAABV-MtYg",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Vision",
"name": "Vision",
"color": "C079EF",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-07-23T20:38:28 | 2025-07-24T16:12:18 | 2025-07-24T09:41:06 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39621",
"html_url": "https://github.com/huggingface/transformers/pull/39621",
"diff_url": "https://github.com/huggingface/transformers/pull/39621.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39621.patch",
"merged_at": "2025-07-24T09:41:06"
} | # What does this PR do?
Fix the incorrect model id in the tests
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
@qubvel | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39621/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39621/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39620 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39620/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39620/comments | https://api.github.com/repos/huggingface/transformers/issues/39620/events | https://github.com/huggingface/transformers/pull/39620 | 3,257,551,507 | PR_kwDOCUB6oc6gVOH1 | 39,620 | docs: Update EfficientLoFTR documentation | {
"login": "sbucaille",
"id": 24275548,
"node_id": "MDQ6VXNlcjI0Mjc1NTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/24275548?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sbucaille",
"html_url": "https://github.com/sbucaille",
"followers_url": "https://api.github.com/users/sbucaille/followers",
"following_url": "https://api.github.com/users/sbucaille/following{/other_user}",
"gists_url": "https://api.github.com/users/sbucaille/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sbucaille/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sbucaille/subscriptions",
"organizations_url": "https://api.github.com/users/sbucaille/orgs",
"repos_url": "https://api.github.com/users/sbucaille/repos",
"events_url": "https://api.github.com/users/sbucaille/events{/privacy}",
"received_events_url": "https://api.github.com/users/sbucaille/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T20:01:51 | 2025-07-29T20:55:23 | 2025-07-29T20:54:44 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39620",
"html_url": "https://github.com/huggingface/transformers/pull/39620",
"diff_url": "https://github.com/huggingface/transformers/pull/39620.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39620.patch",
"merged_at": "2025-07-29T20:54:44"
} | # What does this PR do?
Updates EfficientLoFTR model card for #36979
Also added a fix for the model id in the tests; let me know if that should be a separate PR or not.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
## Who can review?
@stevhliu
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39620/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39620/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39619 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39619/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39619/comments | https://api.github.com/repos/huggingface/transformers/issues/39619/events | https://github.com/huggingface/transformers/issues/39619 | 3,257,434,020 | I_kwDOCUB6oc7CKH-k | 39,619 | FSDP v1 bug: trainer incorrectly uses an unwrapped model | {
"login": "YanjunChen329",
"id": 26952823,
"node_id": "MDQ6VXNlcjI2OTUyODIz",
"avatar_url": "https://avatars.githubusercontent.com/u/26952823?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/YanjunChen329",
"html_url": "https://github.com/YanjunChen329",
"followers_url": "https://api.github.com/users/YanjunChen329/followers",
"following_url": "https://api.github.com/users/YanjunChen329/following{/other_user}",
"gists_url": "https://api.github.com/users/YanjunChen329/gists{/gist_id}",
"starred_url": "https://api.github.com/users/YanjunChen329/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/YanjunChen329/subscriptions",
"organizations_url": "https://api.github.com/users/YanjunChen329/orgs",
"repos_url": "https://api.github.com/users/YanjunChen329/repos",
"events_url": "https://api.github.com/users/YanjunChen329/events{/privacy}",
"received_events_url": "https://api.github.com/users/YanjunChen329/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-07-23T19:13:47 | 2025-08-31T08:02:47 | 2025-08-31T08:02:47 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.53.0
- Platform: Linux-6.8.0-52-generic-x86_64-with-glibc2.35
- Python version: 3.12.11
- Huggingface_hub version: 0.31.2
- Safetensors version: 0.5.3
- Accelerate version: 1.7.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1+cu126 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA H100 80GB HBM3
### Who can help?
Trainer: @zach-huggingface @SunMarc
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
This problem happens when we do distributed training with accelerate and trainer. First, create a simple script with standard training logic. Change the FSDP config to use version 1.
```python
import torch
from transformers import (
AutoTokenizer,
AutoModelForCausalLM,
TrainingArguments,
Trainer,
DataCollatorForLanguageModeling
)
from datasets import Dataset
import numpy as np
from accelerate import Accelerator
# Initialize accelerator for FSDP
accelerator = Accelerator()
# Load Qwen model and tokenizer
model_name = "Qwen/Qwen2.5-0.5B" # Using smaller model for reproduction
print(f"Loading model: {model_name}")
tokenizer = AutoTokenizer.from_pretrained(model_name)
if tokenizer.pad_token is None:
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(
model_name,
torch_dtype=torch.bfloat16,
device_map=None, # Let FSDP handle device placement
trust_remote_code=True
)
# Create dummy dataset
def create_dummy_dataset(size=1000, max_length=512):
"""Create dummy text data for training"""
dummy_texts = []
for i in range(size):
# Generate random text sequences
text = f"This is dummy training text number {i}. " * (np.random.randint(5, 20))
dummy_texts.append(text[:max_length])
return dummy_texts
print("Creating dummy datasets...")
train_texts = create_dummy_dataset(800, 256)
eval_texts = create_dummy_dataset(200, 256)
# Tokenize datasets
def tokenize_function(examples):
return tokenizer(
examples["text"],
truncation=True,
padding=True,
max_length=256,
return_tensors="pt"
)
train_dataset = Dataset.from_dict({"text": train_texts})
eval_dataset = Dataset.from_dict({"text": eval_texts})
train_dataset = train_dataset.map(tokenize_function, batched=True, remove_columns=["text"])
eval_dataset = eval_dataset.map(tokenize_function, batched=True, remove_columns=["text"])
# Data collator
data_collator = DataCollatorForLanguageModeling(
tokenizer=tokenizer,
mlm=False, # We're doing causal language modeling, not masked LM
)
# Compute metrics function (optional for reproduction)
def compute_metrics(eval_pred):
predictions, labels = eval_pred
# Simple perplexity calculation
predictions = predictions.reshape(-1, predictions.shape[-1])
labels = labels.reshape(-1)
# Filter out padding tokens
mask = labels != -100
predictions = predictions[mask]
labels = labels[mask]
loss = torch.nn.functional.cross_entropy(
torch.tensor(predictions),
torch.tensor(labels)
)
perplexity = torch.exp(loss).item()
return {"perplexity": perplexity}
# FSDP configuration dictionary (extracted from fsdp_config.yaml)
fsdp_config = {
"fsdp_sharding_strategy": "SHARD_GRAD_OP",
"fsdp_activation_checkpointing": False,
"fsdp_auto_wrap_policy": "TRANSFORMER_BASED_WRAP",
"fsdp_cpu_ram_efficient_loading": True,
"fsdp_offload_params": False,
"fsdp_reshard_after_forward": False,
"fsdp_state_dict_type": "SHARDED_STATE_DICT",
"fsdp_transformer_layer_cls_to_wrap": "Qwen2DecoderLayer", # Updated for Qwen2.5
"fsdp_version": 1,
}
# Training arguments with FSDP configuration
training_args = TrainingArguments(
output_dir="output",
learning_rate=2e-5,
per_device_train_batch_size=1,
per_device_eval_batch_size=1,
num_train_epochs=2,
fsdp_config=fsdp_config, # Use dictionary instead of file path
fsdp="full_shard",
weight_decay=0.01,
eval_strategy="epoch",
save_strategy="epoch",
load_best_model_at_end=True,
push_to_hub=False, # Set to False for reproduction
logging_steps=10,
eval_steps=50,
warmup_steps=100,
dataloader_num_workers=0, # Avoid multiprocessing issues in reproduction
remove_unused_columns=False,
report_to="none", # Disable wandb/tensorboard for minimal repro
bf16=True, # Enable bf16 mixed precision
)
print("Initializing Trainer...")
trainer = Trainer(
model=model,
args=training_args,
train_dataset=train_dataset,
eval_dataset=eval_dataset,
processing_class=tokenizer,
data_collator=data_collator,
compute_metrics=compute_metrics,
)
print("Starting training...")
trainer.train()
```
Run the training script with a simple command:
```
accelerate launch \
--num_processes 8 \
--mixed_precision bf16 \
--use_fsdp \
repro.py
```
An error is thrown when the forward pass reaches the embedding layer:
```
RuntimeError: 'weight' must be 2-D
```
This error is caused by the Trainer incorrectly using the unwrapped model during training: the all-gather calls of FSDP v1 are never triggered properly, so the weight shapes are unexpected.
### Expected behavior
We expect the trainer to use the FSDP-wrapped model consistently, so that the error above does not happen.
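One way to confirm which instance the Trainer is about to call is a small runtime guard before the forward pass. This is only an illustrative sketch: the class-name check keeps the example torch-free, but in a real script you would use `isinstance(model, torch.distributed.fsdp.FullyShardedDataParallel)`:

```python
def assert_fsdp_wrapped(model):
    # FSDP v1 wraps the root module in a FullyShardedDataParallel instance;
    # if the Trainer hands us the bare module instead, forward() sees the
    # flattened shard parameters and fails with "'weight' must be 2-D".
    if type(model).__name__ != "FullyShardedDataParallel":
        raise RuntimeError(
            f"expected an FSDP-wrapped model, got {type(model).__name__}"
        )
```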
If we print the model right before we call model.forward, we will see that the model is unwrapped:
```
Qwen2ForCausalLM(
  (model): Qwen2Model(
    (embed_tokens): Embedding(151936, 896)
    (layers): ModuleList(
      (0-23): 24 x Qwen2DecoderLayer(
        (self_attn): Qwen2Attention(
          (q_proj): Linear(in_features=896, out_features=896, bias=True)
          (k_proj): Linear(in_features=896, out_features=128, bias=True)
          (v_proj): Linear(in_features=896, out_features=128, bias=True)
          (o_proj): Linear(in_features=896, out_features=896, bias=False)
        )
        (mlp): Qwen2MLP(
          (gate_proj): Linear(in_features=896, out_features=4864, bias=False)
          (up_proj): Linear(in_features=896, out_features=4864, bias=False)
          (down_proj): Linear(in_features=4864, out_features=896, bias=False)
          (act_fn): SiLU()
        )
        (input_layernorm): Qwen2RMSNorm((0,), eps=1e-06)
        (post_attention_layernorm): Qwen2RMSNorm((0,), eps=1e-06)
      )
    )
    (norm): Qwen2RMSNorm((0,), eps=1e-06)
    (rotary_emb): Qwen2RotaryEmbedding()
  )
  (lm_head): Linear(in_features=896, out_features=151936, bias=False)
)
```
We expect it to be wrapped. After applying the fix in https://github.com/huggingface/transformers/pull/39617, we now use the wrapped model correctly:
```
FullyShardedDataParallel(
  (_fsdp_wrapped_module): Qwen2ForCausalLM(
    (model): Qwen2Model(
      (embed_tokens): Embedding(151936, 896)
      (layers): ModuleList(
        (0-23): 24 x Qwen2DecoderLayer(
          (self_attn): Qwen2Attention(
            (q_proj): Linear(in_features=896, out_features=896, bias=True)
            (k_proj): Linear(in_features=896, out_features=128, bias=True)
            (v_proj): Linear(in_features=896, out_features=128, bias=True)
            (o_proj): Linear(in_features=896, out_features=896, bias=False)
          )
          (mlp): Qwen2MLP(
            (gate_proj): Linear(in_features=896, out_features=4864, bias=False)
            (up_proj): Linear(in_features=896, out_features=4864, bias=False)
            (down_proj): Linear(in_features=4864, out_features=896, bias=False)
            (act_fn): SiLU()
          )
          (input_layernorm): Qwen2RMSNorm((0,), eps=1e-06)
          (post_attention_layernorm): Qwen2RMSNorm((0,), eps=1e-06)
        )
      )
      (norm): Qwen2RMSNorm((0,), eps=1e-06)
      (rotary_emb): Qwen2RotaryEmbedding()
    )
    (lm_head): Linear(in_features=896, out_features=151936, bias=False)
  )
)
``` | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39619/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39619/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39618 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39618/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39618/comments | https://api.github.com/repos/huggingface/transformers/issues/39618/events | https://github.com/huggingface/transformers/issues/39618 | 3,257,427,318 | I_kwDOCUB6oc7CKGV2 | 39,618 | SageAttention for attention implementation? | {
"login": "Many0therFunctions",
"id": 138616329,
"node_id": "U_kgDOCEMeCQ",
"avatar_url": "https://avatars.githubusercontent.com/u/138616329?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Many0therFunctions",
"html_url": "https://github.com/Many0therFunctions",
"followers_url": "https://api.github.com/users/Many0therFunctions/followers",
"following_url": "https://api.github.com/users/Many0therFunctions/following{/other_user}",
"gists_url": "https://api.github.com/users/Many0therFunctions/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Many0therFunctions/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Many0therFunctions/subscriptions",
"organizations_url": "https://api.github.com/users/Many0therFunctions/orgs",
"repos_url": "https://api.github.com/users/Many0therFunctions/repos",
"events_url": "https://api.github.com/users/Many0therFunctions/events{/privacy}",
"received_events_url": "https://api.github.com/users/Many0therFunctions/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | null | [] | 2025-07-23T19:10:47 | 2025-07-25T12:30:37 | null | NONE | null | null | null | null | ### Feature request
I've noticed it's been a while now, but transformers still only has flash attention as the fastest attention backend for calls like these:
<img width="1307" height="780" alt="Image" src="https://github.com/user-attachments/assets/3f3d62f6-a166-4ca6-97a0-49263fd93299" />
Are there any plans to add sageattention as well?
### Motivation
It's become increasingly involved to have to monkey-patch SageAttention support into every new model that comes out, and for older models pinned to older versions of transformers, I've had to do unholy things like this:
<img width="1296" height="705" alt="Image" src="https://github.com/user-attachments/assets/c5f4ff6a-094a-48f4-9339-17de1ece43d0" />
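What the request boils down to is a pluggable backend registry instead of per-model monkey-patching. Below is a minimal, purely illustrative sketch of that pattern; the names (`ATTENTION_BACKENDS`, `register_backend`) are hypothetical and not the transformers API:

```python
ATTENTION_BACKENDS = {}

def register_backend(name):
    """Decorator registering an attention implementation under a string key."""
    def decorator(fn):
        ATTENTION_BACKENDS[name] = fn
        return fn
    return decorator

@register_backend("eager")
def eager_attention(q, k, v):
    # placeholder standing in for the default implementation
    return q

# A third-party kernel such as sageattention could then plug in the same
# way, and models would dispatch on a configured implementation name
# instead of needing a patch per model class.
@register_backend("sage_attention")
def sage_attention(q, k, v):
    return q  # would call sageattn(q, k, v) in practice
```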
### Your contribution
I have an example of a patch I had to do so I will upload that here
[llama_nar.py.txt](https://github.com/user-attachments/files/21393926/llama_nar.py.txt) | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39618/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39618/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/39617 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39617/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39617/comments | https://api.github.com/repos/huggingface/transformers/issues/39617/events | https://github.com/huggingface/transformers/pull/39617 | 3,257,424,441 | PR_kwDOCUB6oc6gUyXQ | 39,617 | Fix FSDP v1 bug: trainer incorrectly uses an unwrapped model | {
"login": "YanjunChen329",
"id": 26952823,
"node_id": "MDQ6VXNlcjI2OTUyODIz",
"avatar_url": "https://avatars.githubusercontent.com/u/26952823?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/YanjunChen329",
"html_url": "https://github.com/YanjunChen329",
"followers_url": "https://api.github.com/users/YanjunChen329/followers",
"following_url": "https://api.github.com/users/YanjunChen329/following{/other_user}",
"gists_url": "https://api.github.com/users/YanjunChen329/gists{/gist_id}",
"starred_url": "https://api.github.com/users/YanjunChen329/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/YanjunChen329/subscriptions",
"organizations_url": "https://api.github.com/users/YanjunChen329/orgs",
"repos_url": "https://api.github.com/users/YanjunChen329/repos",
"events_url": "https://api.github.com/users/YanjunChen329/events{/privacy}",
"received_events_url": "https://api.github.com/users/YanjunChen329/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-07-23T19:09:32 | 2025-07-23T19:15:16 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39617",
"html_url": "https://github.com/huggingface/transformers/pull/39617",
"diff_url": "https://github.com/huggingface/transformers/pull/39617.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39617.patch",
"merged_at": null
} | # What does this PR do?
Fixes https://github.com/huggingface/transformers/issues/39619
Currently, the HF Trainer is not backward compatible with FSDP v1.
The bug happens at line 2378, on a code path where both `delay_optimizer_creation` and `use_accelerator_prepare` are true. This seems to be the case only when FSDP v1 is used.
At line 2378, we set `self.model` to be the FSDPv1-wrapped model. However, at line 2403, we set `self.model` again to be `model`, which is the unwrapped model instance initialized at line 2361.
This bug makes the Trainer use the unwrapped model in the subsequent forward calls, which causes weight shape mismatches and uninitialized-weight errors, because the all-gather calls of FSDP are not properly triggered.
This error can be fixed by replacing `self.model` with `model`, which makes it consistent with the other code paths.
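The rebinding pattern described above can be reduced to a few lines. This is a hedged sketch with hypothetical names, not the actual trainer code:

```python
class TinyTrainer:
    def __init__(self, model):
        self.model = model

    def train_buggy(self, wrap):
        model = self.model             # local alias to the unwrapped model
        self.model = wrap(self.model)  # ~line 2378: wrap the model (e.g. with FSDP v1)
        self.model = model             # ~line 2403: BUG - reverts to the unwrapped model
        return self.model

    def train_fixed(self, wrap):
        model = wrap(self.model)       # the fix: keep using the wrapped instance
        self.model = model
        return self.model
```

Running `train_buggy` leaves the trainer holding the unwrapped model, which is exactly why the FSDP all-gathers never fire in the forward pass.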
Example error:
[rank0]: RuntimeError: 'weight' must be 2-D
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@zach-huggingface, @SunMarc and @qgallouedec
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39617/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39617/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39616 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39616/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39616/comments | https://api.github.com/repos/huggingface/transformers/issues/39616/events | https://github.com/huggingface/transformers/issues/39616 | 3,257,330,653 | I_kwDOCUB6oc7CJuvd | 39,616 | Trainer: Error when folded metrics are saved | {
"login": "l-uuz",
"id": 56924246,
"node_id": "MDQ6VXNlcjU2OTI0MjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/56924246?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/l-uuz",
"html_url": "https://github.com/l-uuz",
"followers_url": "https://api.github.com/users/l-uuz/followers",
"following_url": "https://api.github.com/users/l-uuz/following{/other_user}",
"gists_url": "https://api.github.com/users/l-uuz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/l-uuz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/l-uuz/subscriptions",
"organizations_url": "https://api.github.com/users/l-uuz/orgs",
"repos_url": "https://api.github.com/users/l-uuz/repos",
"events_url": "https://api.github.com/users/l-uuz/events{/privacy}",
"received_events_url": "https://api.github.com/users/l-uuz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-07-23T18:32:17 | 2025-08-31T08:02:50 | 2025-08-31T08:02:50 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.53.2
- Platform: Windows-10-10.0.19045-SP0
- Python version: 3.12.4
- Huggingface_hub version: 0.32.4
- Safetensors version: 0.5.3
- Accelerate version: 1.9.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1+cu128 (CUDA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: No
- Using GPU in script?: Yes+No
- GPU type: NVIDIA GeForce GTX 1050 Ti
### Who can help?
@zach-huggingface @SunMarc
### Information
- [x] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Steps to reproduce:
1. Use a standard training script with a text model and a trainer.
2. Use a `compute_metrics` function where a single metric returns multiple values, e.g. `f1` with `average=None` which produces values per class like `{'f1': array([0.75, 0.5 , 1. ])}`.
3. Call `trainer.train()` with a `save_strategy` other than `"no"`
### Error trace
(confidential paths are omitted)
<details>
```
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
Cell In[3], [line 41](vscode-notebook-cell:?execution_count=3&line=41)
32 # https://huggingface.co/docs/transformers/v4.53.2/en/main_classes/trainer#transformers.TrainingArguments
33 trainer = Trainer(
34 model,
35 TrainingArguments(per_device_train_batch_size=1, per_device_eval_batch_size=1, save_total_limit=1, eval_strategy="epoch"),
(...) 38 compute_metrics=compute_metrics,
39 )
---> [41](vscode-notebook-cell:?execution_count=3&line=41) trainer.train()
File (omitted)\.venv\Lib\site-packages\transformers\trainer.py:2206, in Trainer.train(self, resume_from_checkpoint, trial, ignore_keys_for_eval, **kwargs)
2204 hf_hub_utils.enable_progress_bars()
2205 else:
-> [2206](file:///(omitted)/.venv/Lib/site-packages/transformers/trainer.py:2206) return inner_training_loop(
2207 args=args,
2208 resume_from_checkpoint=resume_from_checkpoint,
2209 trial=trial,
2210 ignore_keys_for_eval=ignore_keys_for_eval,
2211 )
File (omitted)\.venv\Lib\site-packages\transformers\trainer.py:2623, in Trainer._inner_training_loop(self, batch_size, args, resume_from_checkpoint, trial, ignore_keys_for_eval)
2621 self.state.epoch = epoch + (step + 1 + steps_skipped) / steps_in_epoch
2622 self.control = self.callback_handler.on_step_end(args, self.state, self.control)
-> [2623](file:///(omitted)/.venv/Lib/site-packages/transformers/trainer.py:2623) self._maybe_log_save_evaluate(
2624 tr_loss,
2625 grad_norm,
2626 model,
2627 trial,
2628 epoch,
2629 ignore_keys_for_eval,
2630 start_time,
2631 learning_rate=learning_rate,
2632 )
2633 else:
2634 self.control = self.callback_handler.on_substep_end(args, self.state, self.control)
File (omitted)\.venv\Lib\site-packages\transformers\trainer.py:3103, in Trainer._maybe_log_save_evaluate(self, tr_loss, grad_norm, model, trial, epoch, ignore_keys_for_eval, start_time, learning_rate)
3100 self.control.should_save = is_new_best_metric
3102 if self.control.should_save:
-> [3103](file:///(omitted)/.venv/Lib/site-packages/transformers/trainer.py:3103) self._save_checkpoint(model, trial)
3104 self.control = self.callback_handler.on_save(self.args, self.state, self.control)
File (omitted)\.venv\Lib\site-packages\transformers\trainer.py:3228, in Trainer._save_checkpoint(self, model, trial)
3226 else:
3227 self.state.stateful_callbacks[cb_name] = cb_state
-> [3228](file:///(omitted)/.venv/Lib/site-packages/transformers/trainer.py:3228) self.state.save_to_json(os.path.join(output_dir, TRAINER_STATE_NAME))
3230 if self.args.push_to_hub:
3231 self._push_from_checkpoint(output_dir)
File (omitted)\.venv\Lib\site-packages\transformers\trainer_callback.py:146, in TrainerState.save_to_json(self, json_path)
144 def save_to_json(self, json_path: str):
145 """Save the content of this instance in JSON format inside `json_path`."""
--> [146](file:///(omitted)/.venv/Lib/site-packages/transformers/trainer_callback.py:146) json_string = json.dumps(dataclasses.asdict(self), indent=2, sort_keys=True) + "\n"
147 with open(json_path, "w", encoding="utf-8") as f:
148 f.write(json_string)
File (omitted)\Python312\Lib\json\__init__.py:238, in dumps(obj, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw)
232 if cls is None:
233 cls = JSONEncoder
234 return cls(
235 skipkeys=skipkeys, ensure_ascii=ensure_ascii,
236 check_circular=check_circular, allow_nan=allow_nan, indent=indent,
237 separators=separators, default=default, sort_keys=sort_keys,
--> [238](file:///(omitted)/Python312/Lib/json/__init__.py:238) **kw).encode(obj)
File (omitted)\Python312\Lib\json\encoder.py:202, in JSONEncoder.encode(self, o)
200 chunks = self.iterencode(o, _one_shot=True)
201 if not isinstance(chunks, (list, tuple)):
--> [202](file://(omitted)/Python312/Lib/json/encoder.py:202) chunks = list(chunks)
203 return ''.join(chunks)
File (omitted)\Python312\Lib\json\encoder.py:432, in _make_iterencode.<locals>._iterencode(o, _current_indent_level)
430 yield from _iterencode_list(o, _current_indent_level)
431 elif isinstance(o, dict):
--> [432](file:///(omitted)/Python312/Lib/json/encoder.py:432) yield from _iterencode_dict(o, _current_indent_level)
433 else:
434 if markers is not None:
File (omitted)\Python312\Lib\json\encoder.py:406, in _make_iterencode.<locals>._iterencode_dict(dct, _current_indent_level)
404 else:
405 chunks = _iterencode(value, _current_indent_level)
--> [406](file:///(omitted)/Python312/Lib/json/encoder.py:406) yield from chunks
407 if newline_indent is not None:
408 _current_indent_level -= 1
File (omitted)\Python312\Lib\json\encoder.py:326, in _make_iterencode.<locals>._iterencode_list(lst, _current_indent_level)
324 else:
325 chunks = _iterencode(value, _current_indent_level)
--> [326](file:///(omitted)/Python312/Lib/json/encoder.py:326) yield from chunks
327 if newline_indent is not None:
328 _current_indent_level -= 1
File (omitted)\Python312\Lib\json\encoder.py:406, in _make_iterencode.<locals>._iterencode_dict(dct, _current_indent_level)
404 else:
405 chunks = _iterencode(value, _current_indent_level)
--> [406](file:///(omitted)/Python312/Lib/json/encoder.py:406) yield from chunks
407 if newline_indent is not None:
408 _current_indent_level -= 1
File (omitted)\Python312\Lib\json\encoder.py:439, in _make_iterencode.<locals>._iterencode(o, _current_indent_level)
437 raise ValueError("Circular reference detected")
438 markers[markerid] = o
--> [439](file:///(omitted)/Python312/Lib/json/encoder.py:439) o = _default(o)
440 yield from _iterencode(o, _current_indent_level)
441 if markers is not None:
File (omitted)\Python312\Lib\json\encoder.py:180, in JSONEncoder.default(self, o)
161 def default(self, o):
162 """Implement this method in a subclass such that it returns
163 a serializable object for ``o``, or calls the base implementation
164 (to raise a ``TypeError``).
(...) 178
179 """
--> [180](file:///(omitted)/Python312/Lib/json/encoder.py:180) raise TypeError(f'Object of type {o.__class__.__name__} '
181 f'is not JSON serializable')
TypeError: Object of type ndarray is not JSON serializable
```
</details>
### Example code
Besides the transformers library, the code also uses `datasets`, `evaluate` and `numpy`, but I think the problem lies in the Trainer implementation, specifically in the part that saves metrics to a file. When the metrics are not saved, the problem does not occur.
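Until the Trainer sanitizes these values itself, one workaround (a sketch, not an official API) is to convert array-valued metrics to plain Python lists before returning them from `compute_metrics`:

```python
import json

def to_jsonable(obj):
    # numpy arrays and scalars expose .tolist(); converting them to built-in
    # types keeps TrainerState.save_to_json from raising TypeError.
    if hasattr(obj, "tolist"):
        return obj.tolist()
    if isinstance(obj, dict):
        return {k: to_jsonable(v) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):
        return [to_jsonable(v) for v in obj]
    return obj

# e.g. inside compute_metrics:
# return to_jsonable(f1.compute(predictions=..., references=..., average=None))
```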
<details>
```python
from datasets import Dataset, Features, ClassLabel, Value
from transformers import AutoModelForSequenceClassification, AutoTokenizer, Trainer, TrainingArguments
from tqdm import tqdm
import evaluate
import numpy as np
model = AutoModelForSequenceClassification.from_pretrained("distilbert/distilbert-base-uncased", num_labels=3)
tokenizer = AutoTokenizer.from_pretrained("distilbert/distilbert-base-uncased")
train_dataset = Dataset.from_list([
{"text": "He often ignores what people say.", "label": 0},
{"text": "The place was dirty and smelled bad.", "label": 0},
{"text": "She went to the store to buy some milk.", "label": 1},
{"text": "The meeting starts at 10 o'clock.", "label": 1},
{"text": "She always helps others when they need it.", "label": 2},
{"text": "He worked hard and reached his goal.", "label": 2},
], Features({"text": Value("string"), "label": ClassLabel(names=["negative", "neutral", "positive"])}))
train_dataset = train_dataset.map(lambda x:tokenizer(x["text"]), batched=True)
test_dataset = Dataset.from_list([
{"text": "She speaks in a very rude way.", "label": 0},
{"text": "He lives in a small town near the river.", "label": 1},
{"text": "The room feels warm and welcoming.", "label": 2},
], Features({"text": Value("string"), "label": ClassLabel(names=["negative", "neutral", "positive"])}))
test_dataset = test_dataset.map(lambda x:tokenizer(x["text"]), batched=True)
f1 = evaluate.load("f1")
def compute_metrics(eval_preds):
logits, labels = eval_preds
predictions = np.argmax(logits, axis=-1)
return f1.compute(predictions=predictions, references=labels, average=None, labels=[0,1,2])
trainer = Trainer(
model,
TrainingArguments(per_device_train_batch_size=1, per_device_eval_batch_size=1, save_total_limit=1, eval_strategy="epoch"),
train_dataset=train_dataset,
eval_dataset=test_dataset,
compute_metrics=compute_metrics,
)
trainer.train()
```
</details>
### Expected behavior
No error. The metrics should be correctly processed. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39616/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39616/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39615 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39615/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39615/comments | https://api.github.com/repos/huggingface/transformers/issues/39615/events | https://github.com/huggingface/transformers/pull/39615 | 3,257,277,038 | PR_kwDOCUB6oc6gUSLF | 39,615 | Add unit test: text-classification pipeline handles empty string | {
"login": "ckakgun",
"id": 25173693,
"node_id": "MDQ6VXNlcjI1MTczNjkz",
"avatar_url": "https://avatars.githubusercontent.com/u/25173693?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ckakgun",
"html_url": "https://github.com/ckakgun",
"followers_url": "https://api.github.com/users/ckakgun/followers",
"following_url": "https://api.github.com/users/ckakgun/following{/other_user}",
"gists_url": "https://api.github.com/users/ckakgun/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ckakgun/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ckakgun/subscriptions",
"organizations_url": "https://api.github.com/users/ckakgun/orgs",
"repos_url": "https://api.github.com/users/ckakgun/repos",
"events_url": "https://api.github.com/users/ckakgun/events{/privacy}",
"received_events_url": "https://api.github.com/users/ckakgun/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T18:10:43 | 2025-07-25T17:22:15 | 2025-07-25T17:22:15 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39615",
"html_url": "https://github.com/huggingface/transformers/pull/39615",
"diff_url": "https://github.com/huggingface/transformers/pull/39615.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39615.patch",
"merged_at": null
} | ## What does this PR do?
This pull request adds a unit test to the `text-classification` pipeline to verify that it can handle an empty string input without raising an error, and returns an empty list. This improves test coverage for an edge case.
## Motivation
Sometimes a user passes an empty input to a pipeline by mistake. This test ensures that the pipeline handles an empty string gracefully.
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39615/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39615/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39614 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39614/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39614/comments | https://api.github.com/repos/huggingface/transformers/issues/39614/events | https://github.com/huggingface/transformers/pull/39614 | 3,257,234,054 | PR_kwDOCUB6oc6gUI3- | 39,614 | Export SmolvLM | {
"login": "guangy10",
"id": 42389959,
"node_id": "MDQ6VXNlcjQyMzg5OTU5",
"avatar_url": "https://avatars.githubusercontent.com/u/42389959?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/guangy10",
"html_url": "https://github.com/guangy10",
"followers_url": "https://api.github.com/users/guangy10/followers",
"following_url": "https://api.github.com/users/guangy10/following{/other_user}",
"gists_url": "https://api.github.com/users/guangy10/gists{/gist_id}",
"starred_url": "https://api.github.com/users/guangy10/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/guangy10/subscriptions",
"organizations_url": "https://api.github.com/users/guangy10/orgs",
"repos_url": "https://api.github.com/users/guangy10/repos",
"events_url": "https://api.github.com/users/guangy10/events{/privacy}",
"received_events_url": "https://api.github.com/users/guangy10/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T17:52:23 | 2025-08-05T17:18:55 | 2025-08-05T14:20:23 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39614",
"html_url": "https://github.com/huggingface/transformers/pull/39614",
"diff_url": "https://github.com/huggingface/transformers/pull/39614.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39614.patch",
"merged_at": "2025-08-05T14:20:23"
} | # What does this PR do?
Make SmolVLM exportable to ExecuTorch.
```
tests/models/smolvlm/test_modeling_smolvlm.py::SmolVLMForConditionalGenerationIntegrationTest::test_export_smolvlm_connector PASSED [ 33%]
tests/models/smolvlm/test_modeling_smolvlm.py::SmolVLMForConditionalGenerationIntegrationTest::test_export_smolvlm_text_decoder PASSED [ 66%]
tests/models/smolvlm/test_modeling_smolvlm.py::SmolVLMForConditionalGenerationIntegrationTest::test_export_smolvlm_vision_encoder PASSED [100%]
```
## Before submitting
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@ArthurZucker @qubvel
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39614/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39614/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39613 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39613/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39613/comments | https://api.github.com/repos/huggingface/transformers/issues/39613/events | https://github.com/huggingface/transformers/pull/39613 | 3,257,092,976 | PR_kwDOCUB6oc6gTp0Z | 39,613 | Move openai import | {
"login": "ebezzam",
"id": 4757445,
"node_id": "MDQ6VXNlcjQ3NTc0NDU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4757445?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ebezzam",
"html_url": "https://github.com/ebezzam",
"followers_url": "https://api.github.com/users/ebezzam/followers",
"following_url": "https://api.github.com/users/ebezzam/following{/other_user}",
"gists_url": "https://api.github.com/users/ebezzam/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ebezzam/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ebezzam/subscriptions",
"organizations_url": "https://api.github.com/users/ebezzam/orgs",
"repos_url": "https://api.github.com/users/ebezzam/repos",
"events_url": "https://api.github.com/users/ebezzam/events{/privacy}",
"received_events_url": "https://api.github.com/users/ebezzam/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T16:59:33 | 2025-07-23T17:13:18 | 2025-07-23T17:05:39 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39613",
"html_url": "https://github.com/huggingface/transformers/pull/39613",
"diff_url": "https://github.com/huggingface/transformers/pull/39613.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39613.patch",
"merged_at": "2025-07-23T17:05:39"
} | # What does this PR do?
Moving the `openai` import, which caused a [failing test](https://app.circleci.com/pipelines/github/huggingface/transformers/139153/workflows/81ebf2c1-8e6b-4250-b673-6008589eb88a/jobs/1843853?utm_campaign=vcs-integration-link&utm_medium=referral&utm_source=github-checks-link&utm_content=summary).
This broke due to https://github.com/huggingface/transformers/pull/39454
cc @ArthurZucker
| {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/users/LysandreJik/followers",
"following_url": "https://api.github.com/users/LysandreJik/following{/other_user}",
"gists_url": "https://api.github.com/users/LysandreJik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LysandreJik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LysandreJik/subscriptions",
"organizations_url": "https://api.github.com/users/LysandreJik/orgs",
"repos_url": "https://api.github.com/users/LysandreJik/repos",
"events_url": "https://api.github.com/users/LysandreJik/events{/privacy}",
"received_events_url": "https://api.github.com/users/LysandreJik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39613/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39613/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39612 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39612/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39612/comments | https://api.github.com/repos/huggingface/transformers/issues/39612/events | https://github.com/huggingface/transformers/pull/39612 | 3,257,077,675 | PR_kwDOCUB6oc6gTma2 | 39,612 | Rework add-new-model-like with modular and make test filenames coherent | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T16:53:52 | 2025-08-04T12:41:10 | 2025-08-04T12:41:09 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39612",
"html_url": "https://github.com/huggingface/transformers/pull/39612",
"diff_url": "https://github.com/huggingface/transformers/pull/39612.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39612.patch",
"merged_at": "2025-08-04T12:41:09"
} | # What does this PR do?
As per the title! Also fixes the image processing auto mapping the same way as we do with tokenizers for slow/fast ones -> it was not coherent before.
I basically rewrote everything, as it was very complicated and outdated.
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39612/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39612/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39611 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39611/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39611/comments | https://api.github.com/repos/huggingface/transformers/issues/39611/events | https://github.com/huggingface/transformers/issues/39611 | 3,257,030,371 | I_kwDOCUB6oc7CIlbj | 39,611 | VoxtralForConditionalGeneration import error | {
"login": "xinkez",
"id": 13937138,
"node_id": "MDQ6VXNlcjEzOTM3MTM4",
"avatar_url": "https://avatars.githubusercontent.com/u/13937138?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xinkez",
"html_url": "https://github.com/xinkez",
"followers_url": "https://api.github.com/users/xinkez/followers",
"following_url": "https://api.github.com/users/xinkez/following{/other_user}",
"gists_url": "https://api.github.com/users/xinkez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xinkez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xinkez/subscriptions",
"organizations_url": "https://api.github.com/users/xinkez/orgs",
"repos_url": "https://api.github.com/users/xinkez/repos",
"events_url": "https://api.github.com/users/xinkez/events{/privacy}",
"received_events_url": "https://api.github.com/users/xinkez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
},
{
"id": 6470596964,
"node_id": "LA_kwDOCUB6oc8AAAABga15ZA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Audio",
"name": "Audio",
"color": "760453",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-07-23T16:37:59 | 2025-07-24T12:34:07 | 2025-07-24T12:34:06 | NONE | null | null | null | null | ### System Info
Hi,
I use the latest code from the main branch (installed via `pip install git+https://github.com/huggingface/transformers`). When I try to import `VoxtralForConditionalGeneration` using `from transformers import VoxtralForConditionalGeneration`, it outputs the errors below:
```
from transformers import VoxtralForConditionalGeneration, AutoProcessor
ImportError: cannot import name 'VoxtralForConditionalGeneration' from 'transformers'
```
Thanks in advance.
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
1. `pip install git+https://github.com/huggingface/transformers`
2. `python3`
3. `from transformers import VoxtralForConditionalGeneration`
### Expected behavior
Success | {
"login": "eustlb",
"id": 94853470,
"node_id": "U_kgDOBadZXg",
"avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eustlb",
"html_url": "https://github.com/eustlb",
"followers_url": "https://api.github.com/users/eustlb/followers",
"following_url": "https://api.github.com/users/eustlb/following{/other_user}",
"gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eustlb/subscriptions",
"organizations_url": "https://api.github.com/users/eustlb/orgs",
"repos_url": "https://api.github.com/users/eustlb/repos",
"events_url": "https://api.github.com/users/eustlb/events{/privacy}",
"received_events_url": "https://api.github.com/users/eustlb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39611/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39611/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39610 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39610/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39610/comments | https://api.github.com/repos/huggingface/transformers/issues/39610/events | https://github.com/huggingface/transformers/pull/39610 | 3,256,803,393 | PR_kwDOCUB6oc6gSqLe | 39,610 | Fix return typehint for decoder and annotate inv_freq | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 8882772041,
"node_id": "LA_kwDOCUB6oc8AAAACEXRYSQ",
"url": "https://api.github.com/repos/huggingface/transformers/labels/typing",
"name": "typing",
"color": "DBA272",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-07-23T15:21:46 | 2025-08-07T13:10:22 | 2025-08-07T13:10:22 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39610",
"html_url": "https://github.com/huggingface/transformers/pull/39610",
"diff_url": "https://github.com/huggingface/transformers/pull/39610.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39610.patch",
"merged_at": "2025-08-07T13:10:22"
} | # What does this PR do?
Fix return type hint for decoder layer and annotate inv_freq | {
"login": "qubvel",
"id": 31920396,
"node_id": "MDQ6VXNlcjMxOTIwMzk2",
"avatar_url": "https://avatars.githubusercontent.com/u/31920396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qubvel",
"html_url": "https://github.com/qubvel",
"followers_url": "https://api.github.com/users/qubvel/followers",
"following_url": "https://api.github.com/users/qubvel/following{/other_user}",
"gists_url": "https://api.github.com/users/qubvel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qubvel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qubvel/subscriptions",
"organizations_url": "https://api.github.com/users/qubvel/orgs",
"repos_url": "https://api.github.com/users/qubvel/repos",
"events_url": "https://api.github.com/users/qubvel/events{/privacy}",
"received_events_url": "https://api.github.com/users/qubvel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39610/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39610/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39609 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39609/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39609/comments | https://api.github.com/repos/huggingface/transformers/issues/39609/events | https://github.com/huggingface/transformers/pull/39609 | 3,256,649,038 | PR_kwDOCUB6oc6gSIF3 | 39,609 | Chat schemas | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T14:36:54 | 2025-09-15T17:22:10 | 2025-09-15T17:10:53 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39609",
"html_url": "https://github.com/huggingface/transformers/pull/39609",
"diff_url": "https://github.com/huggingface/transformers/pull/39609.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39609.patch",
"merged_at": null
} | This is an experimental PR to get feedback on a new potential feature called **Chat Schemas**.
### What problem does this fix?
Since the arrival of chat templates, I've gotten a lot of requests for the same two features:
1) People want a way to detect which inputs a template supports. For example, does it support system messages or tools? Do the tools need special formatting, or is the default okay?
2) People want a way to parse model outputs, especially when the model calls a tool or has thinking blocks. Ideally, people want a way to turn an entire formatted conversation back into a list of messages, tool defs, tool calls, etc.
Right now, people handle these in hacky ways. For example, some code searches the template for references to "tools" to decide if it supports tools or not. Other frameworks use hardcoded functions to infer some common tool call formats and parse them.
### What's the solution?
Models can have a **chat schema** alongside the **chat template**. This is a pure JSON file, containing a JSON schema representing the model's input format. For example, a simple chat schema for a model that only supports messages, not tools, might look like this:
```json
{
"type": "array",
"items": {
"type": "object",
"properties": {
"role": {"type": "string"},
"content": {"type": "string"}
},
"required": ["role", "content"]
}
}
```
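As a sketch of what this schema enforces — a hand-rolled check rather than the actual implementation, which would presumably use a proper JSON Schema validator:

```python
def matches_schema(messages):
    # Mirrors the schema above: an array of objects, each with string
    # "role" and "content" fields (sketch only, not full JSON Schema).
    return isinstance(messages, list) and all(
        isinstance(m, dict)
        and isinstance(m.get("role"), str)
        and isinstance(m.get("content"), str)
        for m in messages
    )

assert matches_schema([{"role": "user", "content": "Hi!"}])
assert not matches_schema([{"role": "user"}])                  # missing "content"
assert not matches_schema({"role": "user", "content": "Hi!"})  # not an array
```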
There's a twist, though: We allow an [extra field](https://json-schema.org/blog/posts/custom-annotations-will-continue) in the schema: `x-regex`. This specifies the regex used to **extract this schema node**, optionally with named groups that indicate how to extract child nodes as well. For example, for a simple model with ChatML formatting, the regex could be:
```
r"<\|im_start\|>(?P<role>.*?)\n(?P<content>.*?)<\|im_end\|>\n"
```
Using this schema and the regex(es), we can walk the schema and formatted output recursively and reconstruct the original model inputs.
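For illustration, here is roughly how such a regex recovers messages from a ChatML-formatted string (the conversation text is made up; the pattern is the one quoted above):

```python
import re

formatted = (
    "<|im_start|>user\nWhat is 2+2?<|im_end|>\n"
    "<|im_start|>assistant\n4<|im_end|>\n"
)
pattern = re.compile(
    r"<\|im_start\|>(?P<role>.*?)\n(?P<content>.*?)<\|im_end\|>\n",
    re.DOTALL,  # content may span multiple lines
)
# Each match's named groups become one reconstructed message.
messages = [m.groupdict() for m in pattern.finditer(formatted)]
# messages == [{"role": "user", "content": "What is 2+2?"},
#              {"role": "assistant", "content": "4"}]
```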
### What do we get?
This resolves both of the long-standing demands: We now have a way to parse formatted chats back to lists of messages. We can also parse tool calls! This means that if models have chat schemas, they can be used in a **universal API that doesn't require any model-specific tool parsing**. This has been a major weakness in chat templates since I made them!
### What are the downsides?
The main downside is that, like with chat templates, someone has to actually add these schemas to models! The base schema is easy enough to write, but the regexes may be harder for complex tool calling models. However, in testing, they weren't too bad. I think writing a chat schema is a lot less work than writing a chat template, especially since you can usually copy the entire schema from another model and just tweak the regexes a little.
### Work still to do in this PR
- [ ] Add a lot more test coverage (50% done)
- [ ] Make sure we can parse tool defs as well as tool calls, and add more formats!
- [x] Add JSON parser
- [x] Add Python type / tool def parser
- [ ] Add code to tokenizers to load / save chat schemas
- [ ] Add code to pipelines for output / tool call parsing?
- [ ] Overhaul chat template docs to include chat schemas too
- [x] Add code for offset extraction
- [x] Use offsets for assistant turn masking
- [ ] Cleanup code TODOs and remove very insecure `eval` calls
Fixes #40776 | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39609/reactions",
"total_count": 9,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 3,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 6
} | https://api.github.com/repos/huggingface/transformers/issues/39609/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39608 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39608/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39608/comments | https://api.github.com/repos/huggingface/transformers/issues/39608/events | https://github.com/huggingface/transformers/issues/39608 | 3,256,547,440 | I_kwDOCUB6oc7CGvhw | 39,608 | Qwen3 Fails w/4D Attn Mask when using FA2 | {
"login": "ntenenz",
"id": 8411908,
"node_id": "MDQ6VXNlcjg0MTE5MDg=",
"avatar_url": "https://avatars.githubusercontent.com/u/8411908?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ntenenz",
"html_url": "https://github.com/ntenenz",
"followers_url": "https://api.github.com/users/ntenenz/followers",
"following_url": "https://api.github.com/users/ntenenz/following{/other_user}",
"gists_url": "https://api.github.com/users/ntenenz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ntenenz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ntenenz/subscriptions",
"organizations_url": "https://api.github.com/users/ntenenz/orgs",
"repos_url": "https://api.github.com/users/ntenenz/repos",
"events_url": "https://api.github.com/users/ntenenz/events{/privacy}",
"received_events_url": "https://api.github.com/users/ntenenz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-07-23T14:07:58 | 2025-08-31T08:02:52 | 2025-08-31T08:02:52 | NONE | null | null | null | null | ### System Info
* transformers: 4.53.3
* torch: 2.6.0
* flash-attn: 2.7.4.post1
### Who can help?
@ArthurZucker
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
MODEL_PATH="Qwen/Qwen3-4B-Base"
DEVICE = "cuda:0"
TOT_LEN = 2048
SEQ_LEN = 512
tok = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModelForCausalLM.from_pretrained(MODEL_PATH, torch_dtype=torch.bfloat16, attn_implementation="flash_attention_2")
_ = model.to(DEVICE).eval()
toks = torch.arange(TOT_LEN, device=DEVICE)[None, :]
pos_ids = torch.cat([torch.arange(SEQ_LEN, device=DEVICE) for _ in range(TOT_LEN // SEQ_LEN)])[None, :]
mask = torch.block_diag(*[torch.tril(torch.ones((SEQ_LEN, SEQ_LEN), dtype=torch.bool, device=DEVICE)) for _ in range(TOT_LEN // SEQ_LEN)])[None, None, ...]
print(f"token shape: {toks.shape}")
print(f"pos_id shape: {pos_ids.shape}")
print(f"mask shape: {mask.shape}")
with torch.no_grad():
model(toks, position_ids=pos_ids, attention_mask=mask)
```
output prior to error:
> token shape: torch.Size([1, 2048])
pos_id shape: torch.Size([1, 2048])
mask shape: torch.Size([1, 1, 2048, 2048])
error
> 147 @_torch_custom_op_wrapper("flash_attn::_flash_attn_varlen_forward", mutates_args=(), device_types="cuda")
148 def _flash_attn_varlen_forward(
149 q: torch.Tensor,
(...) 167 zero_tensors: bool = False,
168 ) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor, torch.Tensor]:
169 q, k, v = [maybe_contiguous(x) for x in (q, k, v)]
--> 170 out, softmax_lse, S_dmask, rng_state = flash_attn_gpu.varlen_fwd(
171 q,
172 k,
173 v,
174 None,
175 cu_seqlens_q,
176 cu_seqlens_k,
177 seqused_k,
178 leftpad_k,
179 block_table,
180 alibi_slopes,
181 max_seqlen_q,
182 max_seqlen_k,
183 dropout_p,
184 softmax_scale,
185 zero_tensors,
186 causal,
187 window_size_left,
188 window_size_right,
189 softcap,
190 return_softmax,
191 None,
192 )
193 # if out.isnan().any() or softmax_lse.isnan().any():
194 # breakpoint()
195 return out, softmax_lse, S_dmask, rng_state
RuntimeError: cu_seqlens_q must have shape (batch_size + 1)
### Expected behavior
Model should successfully perform a forward pass. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39608/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39608/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39607 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39607/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39607/comments | https://api.github.com/repos/huggingface/transformers/issues/39607/events | https://github.com/huggingface/transformers/issues/39607 | 3,256,417,423 | I_kwDOCUB6oc7CGPyP | 39,607 | ImageClassificationPipeline preprocess should accept numpy/tensor arrays | {
"login": "idantene",
"id": 12184618,
"node_id": "MDQ6VXNlcjEyMTg0NjE4",
"avatar_url": "https://avatars.githubusercontent.com/u/12184618?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/idantene",
"html_url": "https://github.com/idantene",
"followers_url": "https://api.github.com/users/idantene/followers",
"following_url": "https://api.github.com/users/idantene/following{/other_user}",
"gists_url": "https://api.github.com/users/idantene/gists{/gist_id}",
"starred_url": "https://api.github.com/users/idantene/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/idantene/subscriptions",
"organizations_url": "https://api.github.com/users/idantene/orgs",
"repos_url": "https://api.github.com/users/idantene/repos",
"events_url": "https://api.github.com/users/idantene/events{/privacy}",
"received_events_url": "https://api.github.com/users/idantene/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | null | [] | 2025-07-23T13:30:32 | 2025-07-25T17:15:13 | null | NONE | null | null | null | null | ### Feature request
Currently, `ImageClassificationPipeline` expects a PIL image or a string pointing to a URL.
This makes using existing datasets (e.g. from `torchvision`) a bit more difficult in the generic case. A non-transformers model would work with torch tensors, but not necessarily with a PIL image, whereas `ImageClassificationPipeline` won't work with torch tensors.
Under the hood, it seems to me that the image_processor should determine what inputs it supports -- I'm not sure why the `load_image` is hard-coded and/or does not support other well-known types.
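A minimal sketch of the kind of type dispatch the preprocess step could do instead of hard-coding `load_image` (the helper name `coerce_image` and the returned tags are hypothetical, not the transformers API):

```python
def coerce_image(inputs):
    """Hypothetical dispatcher: route strings to load_image, array-likes
    (numpy arrays, CPU torch tensors) to an array path, else assume a
    PIL.Image-like object."""
    if isinstance(inputs, str):
        # URL or local path -> existing load_image behavior
        return ("load_image", inputs)
    if hasattr(inputs, "__array__"):
        # numpy ndarrays and torch tensors both expose __array__
        return ("from_array", inputs)
    # Fall back to treating the input as an already-loaded PIL image
    return ("pil_passthrough", inputs)
```

With dispatch like this, `pipeline(...)` and a plain `torchvision` model could consume the same tensor-yielding dataset without a wrapper.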
### Motivation
I have a factory method to generate `pipelines` or use `torchvision` models, and the two are not currently interchangeable without having to monkey-patch that preprocess function (or provide a wrapper class for `ImageClassificationPipeline` that does not use the `load_image`) function.
### Your contribution
I could make the changes to the code if needed. Adding support is trivial. | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39607/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39607/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/39606 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39606/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39606/comments | https://api.github.com/repos/huggingface/transformers/issues/39606/events | https://github.com/huggingface/transformers/pull/39606 | 3,256,372,921 | PR_kwDOCUB6oc6gRLgS | 39,606 | HunYuan opensource | {
"login": "yjc9696",
"id": 32888676,
"node_id": "MDQ6VXNlcjMyODg4Njc2",
"avatar_url": "https://avatars.githubusercontent.com/u/32888676?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yjc9696",
"html_url": "https://github.com/yjc9696",
"followers_url": "https://api.github.com/users/yjc9696/followers",
"following_url": "https://api.github.com/users/yjc9696/following{/other_user}",
"gists_url": "https://api.github.com/users/yjc9696/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yjc9696/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yjc9696/subscriptions",
"organizations_url": "https://api.github.com/users/yjc9696/orgs",
"repos_url": "https://api.github.com/users/yjc9696/repos",
"events_url": "https://api.github.com/users/yjc9696/events{/privacy}",
"received_events_url": "https://api.github.com/users/yjc9696/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-07-23T13:18:37 | 2025-08-22T08:00:37 | 2025-08-22T07:59:58 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39606",
"html_url": "https://github.com/huggingface/transformers/pull/39606",
"diff_url": "https://github.com/huggingface/transformers/pull/39606.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39606.patch",
"merged_at": "2025-08-22T07:59:58"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
This PR primarily aims to add support for the Hunyuan series of models in inference. We noticed that previous Hunyuan models relied on trust_remote_code for inference, which makes version maintenance difficult and often leads to outdated inference code. To address this, we are integrating the inference code into the Transformers library to support continuous updates for future open-source releases.
The submitted code includes the inference implementations for both hunyuan_v1_dense and hunyuan_v1_moe, along with their corresponding configurations and tokenizers.
For unit testing, we added a single-sample test for the hunyuan_v1_moe model using tencent/Hunyuan-A13B-Instruct. Unfortunately, the hunyuan_v1_dense model is not yet officially open-sourced, so we currently lack a testable model for it; we will update upon model release.
This is my first PR submission. After carefully studying the Contribute to 🤗 Transformers guide, I've modified my code to pass all `make fixup` checks.
I'd greatly appreciate any feedback if additional changes or improvements are needed - please don't hesitate to point them out!
## Before submitting
- [Y] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [Y] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39606/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39606/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39605 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39605/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39605/comments | https://api.github.com/repos/huggingface/transformers/issues/39605/events | https://github.com/huggingface/transformers/pull/39605 | 3,256,252,379 | PR_kwDOCUB6oc6gQw81 | 39,605 | [Voxtral] values for A10 runners | {
"login": "eustlb",
"id": 94853470,
"node_id": "U_kgDOBadZXg",
"avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eustlb",
"html_url": "https://github.com/eustlb",
"followers_url": "https://api.github.com/users/eustlb/followers",
"following_url": "https://api.github.com/users/eustlb/following{/other_user}",
"gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eustlb/subscriptions",
"organizations_url": "https://api.github.com/users/eustlb/orgs",
"repos_url": "https://api.github.com/users/eustlb/repos",
"events_url": "https://api.github.com/users/eustlb/events{/privacy}",
"received_events_url": "https://api.github.com/users/eustlb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T12:44:51 | 2025-07-24T16:52:35 | 2025-07-24T16:52:35 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39605",
"html_url": "https://github.com/huggingface/transformers/pull/39605",
"diff_url": "https://github.com/huggingface/transformers/pull/39605.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39605.patch",
"merged_at": "2025-07-24T16:52:35"
} | # What does this PR do?
Values were computed on H100; here are the A10 ones for our CI runners!
"login": "eustlb",
"id": 94853470,
"node_id": "U_kgDOBadZXg",
"avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eustlb",
"html_url": "https://github.com/eustlb",
"followers_url": "https://api.github.com/users/eustlb/followers",
"following_url": "https://api.github.com/users/eustlb/following{/other_user}",
"gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eustlb/subscriptions",
"organizations_url": "https://api.github.com/users/eustlb/orgs",
"repos_url": "https://api.github.com/users/eustlb/repos",
"events_url": "https://api.github.com/users/eustlb/events{/privacy}",
"received_events_url": "https://api.github.com/users/eustlb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39605/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39605/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39604 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39604/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39604/comments | https://api.github.com/repos/huggingface/transformers/issues/39604/events | https://github.com/huggingface/transformers/pull/39604 | 3,256,149,907 | PR_kwDOCUB6oc6gQaIu | 39,604 | chore: update cohere2 (Command R7B) model card | {
"login": "arpon-kapuria",
"id": 83688431,
"node_id": "MDQ6VXNlcjgzNjg4NDMx",
"avatar_url": "https://avatars.githubusercontent.com/u/83688431?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/arpon-kapuria",
"html_url": "https://github.com/arpon-kapuria",
"followers_url": "https://api.github.com/users/arpon-kapuria/followers",
"following_url": "https://api.github.com/users/arpon-kapuria/following{/other_user}",
"gists_url": "https://api.github.com/users/arpon-kapuria/gists{/gist_id}",
"starred_url": "https://api.github.com/users/arpon-kapuria/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/arpon-kapuria/subscriptions",
"organizations_url": "https://api.github.com/users/arpon-kapuria/orgs",
"repos_url": "https://api.github.com/users/arpon-kapuria/repos",
"events_url": "https://api.github.com/users/arpon-kapuria/events{/privacy}",
"received_events_url": "https://api.github.com/users/arpon-kapuria/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T12:17:16 | 2025-07-31T16:47:31 | 2025-07-30T15:34:27 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39604",
"html_url": "https://github.com/huggingface/transformers/pull/39604",
"diff_url": "https://github.com/huggingface/transformers/pull/39604.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39604.patch",
"merged_at": "2025-07-30T15:34:27"
} | # What does this PR do?
This PR updates the model card for Cohere2, following the template outlined in the issue.
## Before submitting
- [x] This PR improves the docs.
## Who can review?
@stevhliu
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39604/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39604/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39603 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39603/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39603/comments | https://api.github.com/repos/huggingface/transformers/issues/39603/events | https://github.com/huggingface/transformers/pull/39603 | 3,255,990,361 | PR_kwDOCUB6oc6gP2vW | 39,603 | feat: add `is_fast` to ImageProcessor | {
"login": "MilkClouds",
"id": 26109705,
"node_id": "MDQ6VXNlcjI2MTA5NzA1",
"avatar_url": "https://avatars.githubusercontent.com/u/26109705?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MilkClouds",
"html_url": "https://github.com/MilkClouds",
"followers_url": "https://api.github.com/users/MilkClouds/followers",
"following_url": "https://api.github.com/users/MilkClouds/following{/other_user}",
"gists_url": "https://api.github.com/users/MilkClouds/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MilkClouds/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MilkClouds/subscriptions",
"organizations_url": "https://api.github.com/users/MilkClouds/orgs",
"repos_url": "https://api.github.com/users/MilkClouds/repos",
"events_url": "https://api.github.com/users/MilkClouds/events{/privacy}",
"received_events_url": "https://api.github.com/users/MilkClouds/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T11:27:25 | 2025-08-12T12:15:14 | 2025-08-12T12:14:57 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39603",
"html_url": "https://github.com/huggingface/transformers/pull/39603",
"diff_url": "https://github.com/huggingface/transformers/pull/39603.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39603.patch",
"merged_at": "2025-08-12T12:14:57"
} | # What does this PR do?
Tokenizer has `is_fast` but ImageProcessor does not. I added a simple `is_fast` property to ImageProcessor.
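Roughly the shape of the change, as a hedged sketch (class names below are stand-ins; the merged implementation may differ):

```python
class BaseImageProcessorSketch:
    """Stand-in for transformers' base image processor class."""

    @property
    def is_fast(self) -> bool:
        # Fast (torch/torchvision-backed) processors follow the
        # `*ImageProcessorFast` naming convention; slow ones do not.
        return type(self).__name__.endswith("Fast")


class ViTImageProcessorFast(BaseImageProcessorSketch):
    """Example fast subclass."""


class ViTImageProcessor(BaseImageProcessorSketch):
    """Example slow subclass."""
```

This mirrors how `PreTrainedTokenizerBase.is_fast` lets callers branch on the backend without `isinstance` checks.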
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review?
@zucchini-nlp may able to review this!
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39603/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39603/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39602 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39602/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39602/comments | https://api.github.com/repos/huggingface/transformers/issues/39602/events | https://github.com/huggingface/transformers/pull/39602 | 3,255,958,573 | PR_kwDOCUB6oc6gPvlh | 39,602 | feat: add `is_fast` to ImageProcessor | {
"login": "MilkClouds",
"id": 26109705,
"node_id": "MDQ6VXNlcjI2MTA5NzA1",
"avatar_url": "https://avatars.githubusercontent.com/u/26109705?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MilkClouds",
"html_url": "https://github.com/MilkClouds",
"followers_url": "https://api.github.com/users/MilkClouds/followers",
"following_url": "https://api.github.com/users/MilkClouds/following{/other_user}",
"gists_url": "https://api.github.com/users/MilkClouds/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MilkClouds/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MilkClouds/subscriptions",
"organizations_url": "https://api.github.com/users/MilkClouds/orgs",
"repos_url": "https://api.github.com/users/MilkClouds/repos",
"events_url": "https://api.github.com/users/MilkClouds/events{/privacy}",
"received_events_url": "https://api.github.com/users/MilkClouds/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T11:17:37 | 2025-07-23T11:25:26 | 2025-07-23T11:25:25 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39602",
"html_url": "https://github.com/huggingface/transformers/pull/39602",
"diff_url": "https://github.com/huggingface/transformers/pull/39602.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39602.patch",
"merged_at": null
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "MilkClouds",
"id": 26109705,
"node_id": "MDQ6VXNlcjI2MTA5NzA1",
"avatar_url": "https://avatars.githubusercontent.com/u/26109705?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MilkClouds",
"html_url": "https://github.com/MilkClouds",
"followers_url": "https://api.github.com/users/MilkClouds/followers",
"following_url": "https://api.github.com/users/MilkClouds/following{/other_user}",
"gists_url": "https://api.github.com/users/MilkClouds/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MilkClouds/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MilkClouds/subscriptions",
"organizations_url": "https://api.github.com/users/MilkClouds/orgs",
"repos_url": "https://api.github.com/users/MilkClouds/repos",
"events_url": "https://api.github.com/users/MilkClouds/events{/privacy}",
"received_events_url": "https://api.github.com/users/MilkClouds/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39602/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39602/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39601 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39601/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39601/comments | https://api.github.com/repos/huggingface/transformers/issues/39601/events | https://github.com/huggingface/transformers/pull/39601 | 3,255,888,169 | PR_kwDOCUB6oc6gPfyx | 39,601 | [`CI`] Add Eric to comment slow ci | {
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/followers",
"following_url": "https://api.github.com/users/vasqu/following{/other_user}",
"gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vasqu/subscriptions",
"organizations_url": "https://api.github.com/users/vasqu/orgs",
"repos_url": "https://api.github.com/users/vasqu/repos",
"events_url": "https://api.github.com/users/vasqu/events{/privacy}",
"received_events_url": "https://api.github.com/users/vasqu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T10:54:38 | 2025-07-28T13:24:01 | 2025-07-28T13:24:00 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39601",
"html_url": "https://github.com/huggingface/transformers/pull/39601",
"diff_url": "https://github.com/huggingface/transformers/pull/39601.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39601.patch",
"merged_at": "2025-07-28T13:24:00"
} | As per title cc @ebezzam @eustlb | {
"login": "eustlb",
"id": 94853470,
"node_id": "U_kgDOBadZXg",
"avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eustlb",
"html_url": "https://github.com/eustlb",
"followers_url": "https://api.github.com/users/eustlb/followers",
"following_url": "https://api.github.com/users/eustlb/following{/other_user}",
"gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eustlb/subscriptions",
"organizations_url": "https://api.github.com/users/eustlb/orgs",
"repos_url": "https://api.github.com/users/eustlb/repos",
"events_url": "https://api.github.com/users/eustlb/events{/privacy}",
"received_events_url": "https://api.github.com/users/eustlb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39601/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39601/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39600 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39600/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39600/comments | https://api.github.com/repos/huggingface/transformers/issues/39600/events | https://github.com/huggingface/transformers/pull/39600 | 3,255,714,164 | PR_kwDOCUB6oc6gO5PE | 39,600 | [video processors] decode only sampled videos -> less RAM and faster processing | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T09:55:59 | 2025-08-26T09:38:03 | 2025-08-26T09:38:03 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39600",
"html_url": "https://github.com/huggingface/transformers/pull/39600",
"diff_url": "https://github.com/huggingface/transformers/pull/39600.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39600.patch",
"merged_at": "2025-08-26T09:38:03"
} | # What does this PR do?
This PR moves the video decoding code entirely into the video processors, so that we can load only the necessary video frames into memory. To be consistent with the video processors, I also updated the image processors to accept `str` inputs and optionally load images.
The docs for video processors are also updated, explaining how frames are sampled and what users need to do to turn it on/off. Note that we'll use `torchcodec` by default and fall back to `torchvision`, and we won't support arbitrary video decoders within the video processor class. Otherwise we'd need to introduce more `kwargs` and handle differences between decoders, which would bloat the code even more.
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39600/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39600/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39599 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39599/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39599/comments | https://api.github.com/repos/huggingface/transformers/issues/39599/events | https://github.com/huggingface/transformers/pull/39599 | 3,255,669,720 | PR_kwDOCUB6oc6gOvaI | 39,599 | Fix: check TrainerState file exists before loading during resume | {
"login": "Petecheco",
"id": 92204815,
"node_id": "U_kgDOBX7vDw",
"avatar_url": "https://avatars.githubusercontent.com/u/92204815?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Petecheco",
"html_url": "https://github.com/Petecheco",
"followers_url": "https://api.github.com/users/Petecheco/followers",
"following_url": "https://api.github.com/users/Petecheco/following{/other_user}",
"gists_url": "https://api.github.com/users/Petecheco/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Petecheco/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Petecheco/subscriptions",
"organizations_url": "https://api.github.com/users/Petecheco/orgs",
"repos_url": "https://api.github.com/users/Petecheco/repos",
"events_url": "https://api.github.com/users/Petecheco/events{/privacy}",
"received_events_url": "https://api.github.com/users/Petecheco/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-07-23T09:40:54 | 2025-07-23T09:45:44 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39599",
"html_url": "https://github.com/huggingface/transformers/pull/39599",
"diff_url": "https://github.com/huggingface/transformers/pull/39599.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39599.patch",
"merged_at": null
} | # What does this PR do?
When resuming training from a checkpoint, the Trainer attempts to load `trainer_state.json` to recover the train batch size. However, if the file does not exist, a `FileNotFoundError` is raised, causing resume to fail.
This patch adds a check using `os.path.isfile` before loading the state, consistent with other parts of the codebase. If the file is missing, a warning is logged and batch size recovery is skipped, allowing training to continue.
Fixes a potential crash and improves robustness when resuming from incomplete or custom checkpoint directories.
Related PR #27568
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
trainer: @zach-huggingface, @SunMarc and @qgallouedec
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39599/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39599/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39598 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39598/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39598/comments | https://api.github.com/repos/huggingface/transformers/issues/39598/events | https://github.com/huggingface/transformers/pull/39598 | 3,255,414,519 | PR_kwDOCUB6oc6gN4RV | 39,598 | Fix typos and grammar issues in documentation and code | {
"login": "cluster2600",
"id": 69890511,
"node_id": "MDQ6VXNlcjY5ODkwNTEx",
"avatar_url": "https://avatars.githubusercontent.com/u/69890511?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cluster2600",
"html_url": "https://github.com/cluster2600",
"followers_url": "https://api.github.com/users/cluster2600/followers",
"following_url": "https://api.github.com/users/cluster2600/following{/other_user}",
"gists_url": "https://api.github.com/users/cluster2600/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cluster2600/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cluster2600/subscriptions",
"organizations_url": "https://api.github.com/users/cluster2600/orgs",
"repos_url": "https://api.github.com/users/cluster2600/repos",
"events_url": "https://api.github.com/users/cluster2600/events{/privacy}",
"received_events_url": "https://api.github.com/users/cluster2600/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T08:22:25 | 2025-07-23T12:43:47 | 2025-07-23T12:43:12 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39598",
"html_url": "https://github.com/huggingface/transformers/pull/39598",
"diff_url": "https://github.com/huggingface/transformers/pull/39598.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39598.patch",
"merged_at": "2025-07-23T12:43:12"
} | # What does this PR do?
Summary
- Fixed Cyrillic 'Р' to Latin 'P' in Portuguese language link (README.md)
- Fixed 'meanginful' to 'meaningful' in training documentation
- Fixed duplicate 'Cohere' reference in modular transformers documentation
- Fixed duplicate 'the the' in trainer and chat command comments
Test plan
- Verified all typos are corrected
- Ensured no functional changes to code behavior
- Maintained existing formatting and structure
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39598/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39598/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39597 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39597/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39597/comments | https://api.github.com/repos/huggingface/transformers/issues/39597/events | https://github.com/huggingface/transformers/issues/39597 | 3,254,785,900 | I_kwDOCUB6oc7CABds | 39,597 | The similarity between image and text in siglip2 is very low | {
"login": "sigma-alpha-beta",
"id": 5878481,
"node_id": "MDQ6VXNlcjU4Nzg0ODE=",
"avatar_url": "https://avatars.githubusercontent.com/u/5878481?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sigma-alpha-beta",
"html_url": "https://github.com/sigma-alpha-beta",
"followers_url": "https://api.github.com/users/sigma-alpha-beta/followers",
"following_url": "https://api.github.com/users/sigma-alpha-beta/following{/other_user}",
"gists_url": "https://api.github.com/users/sigma-alpha-beta/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sigma-alpha-beta/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigma-alpha-beta/subscriptions",
"organizations_url": "https://api.github.com/users/sigma-alpha-beta/orgs",
"repos_url": "https://api.github.com/users/sigma-alpha-beta/repos",
"events_url": "https://api.github.com/users/sigma-alpha-beta/events{/privacy}",
"received_events_url": "https://api.github.com/users/sigma-alpha-beta/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T04:12:41 | 2025-07-23T12:24:58 | 2025-07-23T12:24:58 | NONE | null | null | null | null | transformers:4.53.2
I use the following code to test the similarity between an image and text, but the similarity is only 0.37. The output is `tensor([[3.7129e-01, 4.2452e-04, 1.8075e-07]])`
code:
```python
from PIL import Image
import requests
from transformers import AutoProcessor, AutoModel
import torch
model = AutoModel.from_pretrained("google/siglip2-so400m-patch16-256")
processor = AutoProcessor.from_pretrained("google/siglip2-so400m-patch16-256")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
# image = Image.open("cat.png").convert('RGB')
text = ["two cats", "a cat", "a dog"]
inputs = processor(text=text, images=image, padding="max_length", max_length=64, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
logits_per_image = outputs.logits_per_image
probs = torch.sigmoid(logits_per_image)
print(probs)
``` | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39597/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39597/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39596 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39596/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39596/comments | https://api.github.com/repos/huggingface/transformers/issues/39596/events | https://github.com/huggingface/transformers/issues/39596 | 3,254,621,947 | I_kwDOCUB6oc7B_Zb7 | 39,596 | Does transformers support python3.13 -- disable-gil or python3.14 free threading? | {
"login": "SoulH-qqq",
"id": 54933169,
"node_id": "MDQ6VXNlcjU0OTMzMTY5",
"avatar_url": "https://avatars.githubusercontent.com/u/54933169?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SoulH-qqq",
"html_url": "https://github.com/SoulH-qqq",
"followers_url": "https://api.github.com/users/SoulH-qqq/followers",
"following_url": "https://api.github.com/users/SoulH-qqq/following{/other_user}",
"gists_url": "https://api.github.com/users/SoulH-qqq/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SoulH-qqq/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SoulH-qqq/subscriptions",
"organizations_url": "https://api.github.com/users/SoulH-qqq/orgs",
"repos_url": "https://api.github.com/users/SoulH-qqq/repos",
"events_url": "https://api.github.com/users/SoulH-qqq/events{/privacy}",
"received_events_url": "https://api.github.com/users/SoulH-qqq/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T02:34:03 | 2025-08-30T08:02:54 | 2025-08-30T08:02:54 | NONE | null | null | null | null | Does transformers support python3.13 -- disable-gil or python3.14 free threading?
I got an error when trying to install transformers on these two python versions. | {
"login": "github-actions[bot]",
"id": 41898282,
"node_id": "MDM6Qm90NDE4OTgyODI=",
"avatar_url": "https://avatars.githubusercontent.com/in/15368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/github-actions%5Bbot%5D",
"html_url": "https://github.com/apps/github-actions",
"followers_url": "https://api.github.com/users/github-actions%5Bbot%5D/followers",
"following_url": "https://api.github.com/users/github-actions%5Bbot%5D/following{/other_user}",
"gists_url": "https://api.github.com/users/github-actions%5Bbot%5D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/github-actions%5Bbot%5D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/github-actions%5Bbot%5D/subscriptions",
"organizations_url": "https://api.github.com/users/github-actions%5Bbot%5D/orgs",
"repos_url": "https://api.github.com/users/github-actions%5Bbot%5D/repos",
"events_url": "https://api.github.com/users/github-actions%5Bbot%5D/events{/privacy}",
"received_events_url": "https://api.github.com/users/github-actions%5Bbot%5D/received_events",
"type": "Bot",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39596/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39596/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39595 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39595/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39595/comments | https://api.github.com/repos/huggingface/transformers/issues/39595/events | https://github.com/huggingface/transformers/pull/39595 | 3,254,411,280 | PR_kwDOCUB6oc6gKhMz | 39,595 | [Trackio] Allow single-gpu training and monitor power | {
"login": "qgallouedec",
"id": 45557362,
"node_id": "MDQ6VXNlcjQ1NTU3MzYy",
"avatar_url": "https://avatars.githubusercontent.com/u/45557362?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qgallouedec",
"html_url": "https://github.com/qgallouedec",
"followers_url": "https://api.github.com/users/qgallouedec/followers",
"following_url": "https://api.github.com/users/qgallouedec/following{/other_user}",
"gists_url": "https://api.github.com/users/qgallouedec/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qgallouedec/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qgallouedec/subscriptions",
"organizations_url": "https://api.github.com/users/qgallouedec/orgs",
"repos_url": "https://api.github.com/users/qgallouedec/repos",
"events_url": "https://api.github.com/users/qgallouedec/events{/privacy}",
"received_events_url": "https://api.github.com/users/qgallouedec/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-23T00:11:19 | 2025-07-23T14:29:33 | 2025-07-23T09:22:50 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39595",
"html_url": "https://github.com/huggingface/transformers/pull/39595",
"diff_url": "https://github.com/huggingface/transformers/pull/39595.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39595.patch",
"merged_at": "2025-07-23T09:22:50"
} | # What does this PR do?
<img width="1146" height="962" alt="Screenshot 2025-07-22 at 5 18 42 PM" src="https://github.com/user-attachments/assets/45fd0d90-a150-4bf4-9a26-c30b49449b1a" />
| {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39595/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39595/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39594 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39594/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39594/comments | https://api.github.com/repos/huggingface/transformers/issues/39594/events | https://github.com/huggingface/transformers/pull/39594 | 3,254,118,648 | PR_kwDOCUB6oc6gJhpB | 39,594 | 🌐 [i18n-KO] Translated 'xclip.md' to Korean | {
"login": "ssum21",
"id": 116950962,
"node_id": "U_kgDOBviHsg",
"avatar_url": "https://avatars.githubusercontent.com/u/116950962?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ssum21",
"html_url": "https://github.com/ssum21",
"followers_url": "https://api.github.com/users/ssum21/followers",
"following_url": "https://api.github.com/users/ssum21/following{/other_user}",
"gists_url": "https://api.github.com/users/ssum21/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ssum21/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ssum21/subscriptions",
"organizations_url": "https://api.github.com/users/ssum21/orgs",
"repos_url": "https://api.github.com/users/ssum21/repos",
"events_url": "https://api.github.com/users/ssum21/events{/privacy}",
"received_events_url": "https://api.github.com/users/ssum21/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T21:25:09 | 2025-09-08T18:19:10 | 2025-09-08T18:19:10 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39594",
"html_url": "https://github.com/huggingface/transformers/pull/39594",
"diff_url": "https://github.com/huggingface/transformers/pull/39594.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39594.patch",
"merged_at": "2025-09-08T18:19:10"
} | # What does this PR do?
Translated the `xclip.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. 위 체크가 모두 완료된 뒤에만 KREW 팀원들에게 리뷰를 요청하는 아래 주석을 노출해주세요!-->
May you please review this PR?
@4N3MONE, @yijun-lee, @jungnerd , @harheem
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. KREW 팀원들의 리뷰가 끝난 후에 아래 주석을 노출해주세요! -->
<!-- @stevhliu May you please review this PR? --> | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39594/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39594/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39593 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39593/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39593/comments | https://api.github.com/repos/huggingface/transformers/issues/39593/events | https://github.com/huggingface/transformers/issues/39593 | 3,253,888,017 | I_kwDOCUB6oc7B8mQR | 39,593 | `gemma-3-1b-it` with `use_cache=True` and `past_key_values` throws `RuntimeError: CUDA error: device-side assert` error | {
"login": "nickeisenberg",
"id": 64921400,
"node_id": "MDQ6VXNlcjY0OTIxNDAw",
"avatar_url": "https://avatars.githubusercontent.com/u/64921400?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nickeisenberg",
"html_url": "https://github.com/nickeisenberg",
"followers_url": "https://api.github.com/users/nickeisenberg/followers",
"following_url": "https://api.github.com/users/nickeisenberg/following{/other_user}",
"gists_url": "https://api.github.com/users/nickeisenberg/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nickeisenberg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nickeisenberg/subscriptions",
"organizations_url": "https://api.github.com/users/nickeisenberg/orgs",
"repos_url": "https://api.github.com/users/nickeisenberg/repos",
"events_url": "https://api.github.com/users/nickeisenberg/events{/privacy}",
"received_events_url": "https://api.github.com/users/nickeisenberg/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-07-22T19:49:29 | 2025-07-24T03:19:04 | 2025-07-24T03:19:04 | NONE | null | null | null | null | ### System Info
(dev) nicholas@B306177:chatbot-utils(master)$ transformers env
Copy-and-paste the text below in your GitHub issue and FILL OUT the two last points.
- `transformers` version: 4.52.4
- Platform: Linux-5.15.167.4-microsoft-standard-WSL2-x86_64-with-glibc2.39
- Python version: 3.11.12
- Huggingface_hub version: 0.33.0
- Safetensors version: 0.5.3
- Accelerate version: 1.8.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.7.0+cu126 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA RTX 6000 Ada Generation
(dev) nicholas@B306177:chatbot-utils(master)$
### Who can help?
@ArthurZucker
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [x] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
I am trying to use `gemma-3-1b-it` with `use_cache=True`. The following snippet, which sets `use_cache=False`, runs perfectly fine.
```python
from transformers.cache_utils import HybridCache
from transformers.models.auto.modeling_auto import AutoModelForCausalLM
from transformers.models.auto.tokenization_auto import AutoTokenizer
from transformers.models.gemma.tokenization_gemma_fast import (
GemmaTokenizerFast,
)
from transformers.models.gemma3.modeling_gemma3 import Gemma3ForCausalLM
import torch
def stream(
model: Gemma3ForCausalLM, tokenizer: GemmaTokenizerFast, prompt: str
):
input_ids = tokenizer.encode(prompt)
input_ids = torch.tensor(
input_ids, device=model.device, dtype=torch.long
).unsqueeze(0)
attention_mask = torch.ones_like(
input_ids, device=model.device, dtype=torch.long
)
eos_token_id = [tokenizer.eos_token_id, 106]
for _ in range(100):
with torch.no_grad():
outputs = model.forward(
input_ids=input_ids, # type: ignore
attention_mask=attention_mask,
use_cache=False,
)
logits = outputs.logits
assert logits is not None
next_token = torch.argmax(logits[:, -1, :], dim=-1, keepdim=True)
token_id = next_token.item()
if eos_token_id is not None and token_id in eos_token_id:
break
print(tokenizer.decode(token_id), end="", flush=True)
while len(next_token.shape) < len(input_ids.shape):
next_token = next_token.unsqueeze(0)
input_ids = torch.concat((input_ids, next_token), dim=-1)
attention_mask = torch.ones_like(
input_ids, device=model.device, dtype=torch.long
)
print()
model = AutoModelForCausalLM.from_pretrained("google/gemma-3-1b-it").to("cuda:0")
tokenizer = AutoTokenizer.from_pretrained("google/gemma-3-1b-it")
stream(model, tokenizer, "How do I add two ints in python? Give a short answer.")
```

Here is the response:

```
'''python
a = 10
b = 20
sum = a + b
print(sum)
'''
Output:
30
The code adds the integers `a` and `b` and stores the result in the variable `sum`. Finally, it prints the value of `sum`.
```
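The decode loop above grows `input_ids` every step, so each forward pass re-encodes the entire sequence; this quadratic re-computation is exactly what `past_key_values` is meant to avoid. A toy count of tokens processed per generation (hypothetical numbers, no model involved) makes the difference concrete:

```python
# Toy arithmetic (no model needed): forward-pass token counts with vs. without a KV cache.
prompt_len, new_tokens = 10, 5

# Without a cache, step i re-encodes the prompt plus all tokens generated so far.
no_cache = sum(prompt_len + i for i in range(new_tokens))  # 10+11+12+13+14 = 60

# With a cache, the prompt is encoded once; each later step feeds only one new token.
with_cache = prompt_len + (new_tokens - 1)  # 10 + 4 = 14

print(no_cache, with_cache)  # 60 14
```

The gap widens with sequence length, which is why cached decoding matters for long generations.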
However, if I set `use_cache=True` and reuse the `past_key_values`, I get the following error.
```python
from transformers.cache_utils import HybridCache
from transformers.models.auto.modeling_auto import AutoModelForCausalLM
from transformers.models.auto.tokenization_auto import AutoTokenizer
from transformers.models.gemma.tokenization_gemma_fast import (
GemmaTokenizerFast,
)
from transformers.models.gemma3.modeling_gemma3 import Gemma3ForCausalLM
import torch
def stream_with_cache(
model: Gemma3ForCausalLM, tokenizer: GemmaTokenizerFast, prompt: str
):
input_ids = tokenizer.encode(prompt)
input_ids = torch.tensor(
input_ids, device=model.device, dtype=torch.long
).unsqueeze(0)
attention_mask = torch.ones_like(
input_ids, device=model.device, dtype=torch.long
)
past_key_values = None
eos_token_id = [tokenizer.eos_token_id, 106]
for _ in range(100):
with torch.no_grad():
outputs = model.forward(
input_ids=input_ids, # type: ignore
attention_mask=attention_mask,
use_cache=True,
past_key_values=past_key_values
)
logits = outputs.logits
assert logits is not None
past_key_values = outputs.past_key_values
assert isinstance(past_key_values, HybridCache)
next_token = torch.argmax(logits[:, -1, :], dim=-1, keepdim=True)
token_id = next_token.item()
if eos_token_id is not None and token_id in eos_token_id:
break
print(tokenizer.decode(token_id), end="", flush=True)
while len(next_token.shape) < len(input_ids.shape):
next_token = next_token.unsqueeze(0)
input_ids = next_token
attention_mask = None
print()
model = AutoModelForCausalLM.from_pretrained("google/gemma-3-1b-it").to("cuda:0")
tokenizer = AutoTokenizer.from_pretrained("google/gemma-3-1b-it")
stream_with_cache(model, tokenizer, "Hello how are you")
```

Here is the error:

```
/pytorch/aten/src/ATen/native/cuda/IndexKernel.cu:175: operator(): block: [0,0,0], thread: [64,0,0] Assertion `idx >= 0 && idx < self_dim_size && "index_copy_(): index out of bounds"` failed.
[... the same assertion repeated for block [0,0,0], threads [0,0,0] through [127,0,0] ...]
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 28, in stream_with_cache
RuntimeError: CUDA error: device-side assert triggered
CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
For debugging consider passing CUDA_LAUNCH_BLOCKING=1
Compile with `TORCH_USE_CUDA_DSA` to enable device-side assertions.
```
### Expected behavior
The expected behavior is that I can use the `past_key_values` so these do not need to be recalculated when using the model autoregressively. I have verified that the above works with "deepseek-ai/deepseek-coder-1.3b-instruct" but it does not work with "google/gemma-3-1b-it". | {
"login": "nickeisenberg",
"id": 64921400,
"node_id": "MDQ6VXNlcjY0OTIxNDAw",
"avatar_url": "https://avatars.githubusercontent.com/u/64921400?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nickeisenberg",
"html_url": "https://github.com/nickeisenberg",
"followers_url": "https://api.github.com/users/nickeisenberg/followers",
"following_url": "https://api.github.com/users/nickeisenberg/following{/other_user}",
"gists_url": "https://api.github.com/users/nickeisenberg/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nickeisenberg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nickeisenberg/subscriptions",
"organizations_url": "https://api.github.com/users/nickeisenberg/orgs",
"repos_url": "https://api.github.com/users/nickeisenberg/repos",
"events_url": "https://api.github.com/users/nickeisenberg/events{/privacy}",
"received_events_url": "https://api.github.com/users/nickeisenberg/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39593/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39593/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39592 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39592/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39592/comments | https://api.github.com/repos/huggingface/transformers/issues/39592/events | https://github.com/huggingface/transformers/pull/39592 | 3,253,751,542 | PR_kwDOCUB6oc6gIQ6B | 39,592 | Add Fast Image Processor for ImageGPT | {
"login": "agamjots05",
"id": 52331149,
"node_id": "MDQ6VXNlcjUyMzMxMTQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/52331149?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/agamjots05",
"html_url": "https://github.com/agamjots05",
"followers_url": "https://api.github.com/users/agamjots05/followers",
"following_url": "https://api.github.com/users/agamjots05/following{/other_user}",
"gists_url": "https://api.github.com/users/agamjots05/gists{/gist_id}",
"starred_url": "https://api.github.com/users/agamjots05/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/agamjots05/subscriptions",
"organizations_url": "https://api.github.com/users/agamjots05/orgs",
"repos_url": "https://api.github.com/users/agamjots05/repos",
"events_url": "https://api.github.com/users/agamjots05/events{/privacy}",
"received_events_url": "https://api.github.com/users/agamjots05/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T18:57:50 | 2025-09-04T22:45:42 | 2025-09-04T22:45:07 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39592",
"html_url": "https://github.com/huggingface/transformers/pull/39592",
"diff_url": "https://github.com/huggingface/transformers/pull/39592.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39592.patch",
"merged_at": "2025-09-04T22:45:07"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Related #36978
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [X] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [X] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [X] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39592/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39592/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39591 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39591/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39591/comments | https://api.github.com/repos/huggingface/transformers/issues/39591/events | https://github.com/huggingface/transformers/pull/39591 | 3,253,697,104 | PR_kwDOCUB6oc6gIE_o | 39,591 | 🚨[Fast Image Processor] Force Fast Image Processor for Qwen2_VL/2_5_VL + Refactor | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T18:37:38 | 2025-08-14T14:38:57 | 2025-07-25T15:11:28 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39591",
"html_url": "https://github.com/huggingface/transformers/pull/39591",
"diff_url": "https://github.com/huggingface/transformers/pull/39591.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39591.patch",
"merged_at": "2025-07-25T15:11:28"
} | # What does this PR do?
As discussed internally, this PR starts the process to make fast image processors the default in 🤗Transformers!
When instantiating a Processor or an Image Processor via `AutoProcessor.from_pretrained` or `AutoImageProcessor.from_pretrained` with a checkpoint using a `Qwen2VLImageProcessor`, the behavior will now be to load `Qwen2VLImageProcessorFast`, even if the processor was saved with a slow `Qwen2VLImageProcessor` originally.
For instance:
Old behavior:
```python
>> from transformers import AutoProcessor
>> processor = AutoProcessor.from_pretrained("Qwen/Qwen2.5-VL-3B-Instruct")
>> print(type(processor.image_processor))
<class 'transformers.models.qwen2_vl.image_processing_qwen2_vl.Qwen2VLImageProcessor'>
```
New behavior:
```python
>> from transformers import AutoProcessor
>> processor = AutoProcessor.from_pretrained("Qwen/Qwen2.5-VL-3B-Instruct")
>> print(type(processor.image_processor))
"""The image processor of type `Qwen2VLImageProcessor` is now loaded as a fast processor by default, even if the model checkpoint was saved with a slow processor. This is a breaking change and may produce slightly different outputs. To continue using the slow processor, instantiate this class with `use_fast=False`. Note that this behavior will be extended to all models in a future release."""
<class 'transformers.models.qwen2_vl.image_processing_qwen2_vl_fast.Qwen2VLImageProcessorFast'>
```
(The warning is a `warning_once`.)
This PR also comes with a long overdue refactor (which should be 100% compatible with the slow image processor of qwen2 vl, and fix some existing inconsistencies with the fast one). Cc @zucchini-nlp for that :)
🚨The processed image outputs of the slow and fast image processors are slightly different! This is expected, as torchvision and PIL image processing functions are not fully equivalent.
Users can still force the use of a slow processor by loading the processor with `use_fast=False`
```python
>> from transformers import AutoProcessor
>> processor = AutoProcessor.from_pretrained("Qwen/Qwen2.5-VL-3B-Instruct", use_fast=False)
>> print(type(processor.image_processor))
<class 'transformers.models.qwen2_vl.image_processing_qwen2_vl.Qwen2VLImageProcessor'>
```
Here are some comparison between fast and slow processors with this refactor.
"Mixed various" means images of different sizes are included in the input. The image used for these benchmarks is [this one](http://images.cocodataset.org/val2017/000000039769.jpg)
***Summary of the summary: up to 30x speedup, between 5e-8 and 3e-3 average output pixel differences depending on the processing parameters and input image sizes***
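As a rough illustration (this is not the actual benchmark script, and the names below are made up for the sketch), the "max" and "mean absolute" output differences reported in the tables boil down to element-wise absolute differences between the slow and fast processors' output pixel values:

```python
# Minimal sketch of the two difference metrics reported in the tables:
# the maximum and the mean of the element-wise absolute differences
# between the slow and fast processors' outputs. The inputs here are
# illustrative flat sequences, not real pixel tensors.

def output_diff_stats(slow, fast):
    """Return (max_abs_diff, mean_abs_diff) between two flat sequences."""
    diffs = [abs(s - f) for s, f in zip(slow, fast)]
    return max(diffs), sum(diffs) / len(diffs)

slow_pixels = [0.10, 0.20, 0.30, 0.40]
fast_pixels = [0.10, 0.21, 0.30, 0.38]
max_d, mean_d = output_diff_stats(slow_pixels, fast_pixels)
```

In the real benchmark the same reduction is applied over full processed pixel tensors per scenario and batch size.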
Summary: Max Output Difference vs. Slow processor
-
This table shows the maximum difference at any single point between the output tensors of the Fast processors and the Slow processor.
| Batch Size | 1 | 4 | 8 | 16 | 32 | 64 |
|:-----------------------------------------------------|----------:|----------:|----------:|----------:|----------:|----------:|
| ('mixed_various', 'Fast_cpu_grouping_disabled') | 2.384e-07 | 0.0292 | 0.0292 | 0.0292 | 0.0292 | 0.0292 |
| ('mixed_various', 'Fast_cpu_grouping_enabled') | 2.384e-07 | 0.0292 | 0.0292 | 0.0292 | 0.0292 | 0.0292 |
| ('uniform_1024x1024', 'Fast_cpu_grouping_disabled') | 0.0292 | 0.0292 | 0.0292 | 0.0292 | 0.0292 | 0.0292 |
| ('uniform_1024x1024', 'Fast_cpu_grouping_enabled') | 0.0292 | 0.0292 | 0.0292 | 0.0292 | 0.0292 | 0.0292 |
| ('uniform_224x224', 'Fast_cpu_grouping_disabled') | 2.384e-07 | 2.384e-07 | 2.384e-07 | 2.384e-07 | 2.384e-07 | 2.384e-07 |
| ('uniform_224x224', 'Fast_cpu_grouping_enabled') | 2.384e-07 | 2.384e-07 | 2.384e-07 | 2.384e-07 | 2.384e-07 | 2.384e-07 |
| ('uniform_512x512', 'Fast_cpu_grouping_disabled') | 0.01501 | 0.01501 | 0.01501 | 0.01501 | 0.01501 | 0.01501 |
| ('uniform_512x512', 'Fast_cpu_grouping_enabled') | 0.01501 | 0.01501 | 0.01501 | 0.01501 | 0.01501 | 0.01501 |
| ('mixed_various', 'Fast_cuda_grouping_disabled') | 2.384e-07 | 0.09005 | 0.09005 | 0.09005 | 0.09005 | 0.09005 |
| ('mixed_various', 'Fast_cuda_grouping_enabled') | 2.384e-07 | 0.09005 | 0.09005 | 0.09005 | 0.09005 | 0.09005 |
| ('uniform_1024x1024', 'Fast_cuda_grouping_disabled') | 0.04266 | 0.04266 | 0.04266 | 0.04266 | 0.04266 | 0.04266 |
| ('uniform_1024x1024', 'Fast_cuda_grouping_enabled') | 0.04266 | 0.04266 | 0.04266 | 0.04266 | 0.04266 | 0.04266 |
| ('uniform_224x224', 'Fast_cuda_grouping_disabled') | 2.384e-07 | 2.384e-07 | 2.384e-07 | 2.384e-07 | 2.384e-07 | 2.384e-07 |
| ('uniform_224x224', 'Fast_cuda_grouping_enabled') | 2.384e-07 | 2.384e-07 | 2.384e-07 | 2.384e-07 | 2.384e-07 | 2.384e-07 |
| ('uniform_512x512', 'Fast_cuda_grouping_disabled') | 0.09005 | 0.09005 | 0.09005 | 0.09005 | 0.09005 | 0.09005 |
| ('uniform_512x512', 'Fast_cuda_grouping_enabled') | 0.09005 | 0.09005 | 0.09005 | 0.09005 | 0.09005 | 0.09005 |
Summary: Mean Absolute Output Difference vs. Slow processor
-
This table shows the mean absolute difference between the output tensors of the Fast processors and the Slow processor for each configuration and image scenario.
| Batch Size | 1 | 4 | 8 | 16 | 32 | 64 |
|:-----------------------------------------------------|----------:|----------:|----------:|----------:|----------:|----------:|
| ('mixed_various', 'Fast_cpu_grouping_disabled') | 5.315e-08 | 7.732e-05 | 7.452e-05 | 7.956e-05 | 7.892e-05 | 8e-05 |
| ('mixed_various', 'Fast_cpu_grouping_enabled') | 5.315e-08 | 7.732e-05 | 7.452e-05 | 7.956e-05 | 7.892e-05 | 8e-05 |
| ('uniform_1024x1024', 'Fast_cpu_grouping_disabled') | 9.615e-05 | 9.615e-05 | 9.615e-05 | 9.615e-05 | 9.615e-05 | 9.615e-05 |
| ('uniform_1024x1024', 'Fast_cpu_grouping_enabled') | 9.615e-05 | 9.615e-05 | 9.615e-05 | 9.615e-05 | 9.615e-05 | 9.615e-05 |
| ('uniform_224x224', 'Fast_cpu_grouping_disabled') | 5.315e-08 | 5.315e-08 | 5.315e-08 | 5.315e-08 | 5.315e-08 | 5.315e-08 |
| ('uniform_224x224', 'Fast_cpu_grouping_enabled') | 5.315e-08 | 5.315e-08 | 5.315e-08 | 5.315e-08 | 5.315e-08 | 5.315e-08 |
| ('uniform_512x512', 'Fast_cpu_grouping_disabled') | 2.832e-05 | 2.832e-05 | 2.832e-05 | 2.832e-05 | 2.832e-05 | 2.832e-05 |
| ('uniform_512x512', 'Fast_cpu_grouping_enabled') | 2.832e-05 | 2.832e-05 | 2.832e-05 | 2.832e-05 | 2.832e-05 | 2.832e-05 |
| ('mixed_various', 'Fast_cuda_grouping_disabled') | 5.315e-08 | 0.002611 | 0.002679 | 0.002686 | 0.0027 | 0.002701 |
| ('mixed_various', 'Fast_cuda_grouping_enabled') | 5.315e-08 | 0.002611 | 0.002679 | 0.002686 | 0.0027 | 0.002701 |
| ('uniform_1024x1024', 'Fast_cuda_grouping_disabled') | 0.002783 | 0.002783 | 0.002783 | 0.002783 | 0.002783 | 0.002783 |
| ('uniform_1024x1024', 'Fast_cuda_grouping_enabled') | 0.002783 | 0.002783 | 0.002783 | 0.002783 | 0.002783 | 0.002783 |
| ('uniform_224x224', 'Fast_cuda_grouping_disabled') | 5.315e-08 | 5.315e-08 | 5.315e-08 | 5.315e-08 | 5.315e-08 | 5.315e-08 |
| ('uniform_224x224', 'Fast_cuda_grouping_enabled') | 5.315e-08 | 5.315e-08 | 5.315e-08 | 5.315e-08 | 5.315e-08 | 5.315e-08 |
| ('uniform_512x512', 'Fast_cuda_grouping_disabled') | 0.002913 | 0.002913 | 0.002913 | 0.002913 | 0.002913 | 0.002913 |
| ('uniform_512x512', 'Fast_cuda_grouping_enabled') | 0.002913 | 0.002913 | 0.002913 | 0.002913 | 0.002913 | 0.002913 |
Time per image:
-
<img width="4800" height="2700" alt="time_per_image_all_configs" src="https://github.com/user-attachments/assets/a8022f88-c164-4a05-abb7-265fd8837bf3" />
With different image sizes:
| | |
|:-------------------------:|:-------------------------:|
|<img width="4800" height="2700" alt="time_per_image_all_configs" src="https://github.com/user-attachments/assets/a8022f88-c164-4a05-abb7-265fd8837bf3" /> | <img width="4800" height="2700" alt="time_per_image_all_configs" src="https://github.com/user-attachments/assets/2e4d7f4a-95a0-462c-a941-eaa8da2d6140" />|
|<img width="4800" height="2700" alt="time_per_image_all_configs" src="https://github.com/user-attachments/assets/04a6cc82-3dbe-47a5-9931-a18aa88032ef" /> |<img width="4800" height="2700" alt="time_per_image_all_configs" src="https://github.com/user-attachments/assets/02e606db-41cb-4e53-b213-eb2d87a7e0b0" />|
Speedups:
-
<img width="4800" height="2700" alt="speedup_vs_slow" src="https://github.com/user-attachments/assets/d38b0354-9ebf-4958-9488-5110745d77bf" />
With different image sizes:
| | |
|:-------------------------:|:-------------------------:|
|<img width="4800" height="2700" alt="speedup_vs_slow" src="https://github.com/user-attachments/assets/d38b0354-9ebf-4958-9488-5110745d77bf" /> | <img width="4800" height="2700" alt="speedup_vs_slow" src="https://github.com/user-attachments/assets/3a6d68d8-9bd9-4dfb-97ab-1079e1235947" />|
|<img width="4800" height="2700" alt="speedup_vs_slow" src="https://github.com/user-attachments/assets/b758938b-e0e3-46e8-ad10-0e0ad249ff6d" /> |<img width="4800" height="2700" alt="speedup_vs_slow" src="https://github.com/user-attachments/assets/04c5536c-68fd-42ae-bb9c-f80ae5474ad2" />|
Cc @qubvel @ArthurZucker @Cyrilvallez
| {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39591/reactions",
"total_count": 7,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 7,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39591/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39590 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39590/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39590/comments | https://api.github.com/repos/huggingface/transformers/issues/39590/events | https://github.com/huggingface/transformers/pull/39590 | 3,253,432,452 | PR_kwDOCUB6oc6gHKwZ | 39,590 | Fix DynamicCache and simplify Cache classes a bit | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T17:01:52 | 2025-07-23T08:13:46 | 2025-07-23T08:13:45 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39590",
"html_url": "https://github.com/huggingface/transformers/pull/39590",
"diff_url": "https://github.com/huggingface/transformers/pull/39590.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39590.patch",
"merged_at": "2025-07-23T08:13:45"
} | # What does this PR do?
As per the title. DynamicCache was broken because `self.layers` needs to exist if we want to append to it -> `test_multi_gpu_data_parallel_forward` was failing
cc @manueldeprada for viz | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39590/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39590/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39589 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39589/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39589/comments | https://api.github.com/repos/huggingface/transformers/issues/39589/events | https://github.com/huggingface/transformers/pull/39589 | 3,253,341,281 | PR_kwDOCUB6oc6gG3Ds | 39,589 | Fix link in "Inference server backends" doc | {
"login": "hmellor",
"id": 19981378,
"node_id": "MDQ6VXNlcjE5OTgxMzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/19981378?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hmellor",
"html_url": "https://github.com/hmellor",
"followers_url": "https://api.github.com/users/hmellor/followers",
"following_url": "https://api.github.com/users/hmellor/following{/other_user}",
"gists_url": "https://api.github.com/users/hmellor/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hmellor/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hmellor/subscriptions",
"organizations_url": "https://api.github.com/users/hmellor/orgs",
"repos_url": "https://api.github.com/users/hmellor/repos",
"events_url": "https://api.github.com/users/hmellor/events{/privacy}",
"received_events_url": "https://api.github.com/users/hmellor/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T16:31:29 | 2025-07-22T16:45:31 | 2025-07-22T16:44:09 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39589",
"html_url": "https://github.com/huggingface/transformers/pull/39589",
"diff_url": "https://github.com/huggingface/transformers/pull/39589.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39589.patch",
"merged_at": "2025-07-22T16:44:09"
} | The link now points to the correct place in the vLLM docs | {
"login": "hmellor",
"id": 19981378,
"node_id": "MDQ6VXNlcjE5OTgxMzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/19981378?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hmellor",
"html_url": "https://github.com/hmellor",
"followers_url": "https://api.github.com/users/hmellor/followers",
"following_url": "https://api.github.com/users/hmellor/following{/other_user}",
"gists_url": "https://api.github.com/users/hmellor/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hmellor/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hmellor/subscriptions",
"organizations_url": "https://api.github.com/users/hmellor/orgs",
"repos_url": "https://api.github.com/users/hmellor/repos",
"events_url": "https://api.github.com/users/hmellor/events{/privacy}",
"received_events_url": "https://api.github.com/users/hmellor/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39589/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39589/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39588 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39588/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39588/comments | https://api.github.com/repos/huggingface/transformers/issues/39588/events | https://github.com/huggingface/transformers/pull/39588 | 3,253,327,551 | PR_kwDOCUB6oc6gGz-w | 39,588 | WIP, reference modeling | {
"login": "molbap",
"id": 39954772,
"node_id": "MDQ6VXNlcjM5OTU0Nzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/molbap",
"html_url": "https://github.com/molbap",
"followers_url": "https://api.github.com/users/molbap/followers",
"following_url": "https://api.github.com/users/molbap/following{/other_user}",
"gists_url": "https://api.github.com/users/molbap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/molbap/subscriptions",
"organizations_url": "https://api.github.com/users/molbap/orgs",
"repos_url": "https://api.github.com/users/molbap/repos",
"events_url": "https://api.github.com/users/molbap/events{/privacy}",
"received_events_url": "https://api.github.com/users/molbap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-07-22T16:27:11 | 2025-07-22T16:40:32 | null | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39588",
"html_url": "https://github.com/huggingface/transformers/pull/39588",
"diff_url": "https://github.com/huggingface/transformers/pull/39588.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39588.patch",
"merged_at": null
} | # What does this PR do?
WIP. There are too many divergences in code/arch between models that are very similar; the idea is to have a "dummy" model that is functional, based on a recent omni model, and then derive functional blocks from it through `modular`. Pushing this as a PR to commit to it and build it up
<img width="1008" height="874" alt="image" src="https://github.com/user-attachments/assets/0ecc714a-4b78-445f-94b5-c6babf96129a" />
Currently, VLMs+audio models do not exploit `modular` enough IMO
```bash
- qwen2_5_omni (4006 lines in modeling, 4307 in modular)
- gemma3n (2391 lines in modeling, 2659 in modular)
- d_fine (2191 lines in modeling, 1213 in modular)
- informer (2162 lines in modeling, 978 in modular)
- rt_detr_v2 (2001 lines in modeling, 628 in modular)
- wav2vec2_conformer (1934 lines in modeling, 730 in modular)
- phi4_multimodal (1897 lines in modeling, 1744 in modular)
- instructblipvideo (1792 lines in modeling, 657 in modular)
- zamba2 (1742 lines in modeling, 1151 in modular)
- qwen2_5_vl (1734 lines in modeling, 1061 in modular)
- plbart (1722 lines in modeling, 663 in modular)
- glm4v (1707 lines in modeling, 1724 in modular)
- falcon_h1 (1625 lines in modeling, 1377 in modular)
- emu3 (1624 lines in modeling, 1206 in modular)
- wav2vec2_bert (1517 lines in modeling, 1070 in modular)
- bamba (1511 lines in modeling, 1219 in modular)
- sam_hq (1497 lines in modeling, 653 in modular)
- janus (1422 lines in modeling, 1592 in modular)
- modernbert (1398 lines in modeling, 1527 in modular)
- t5gemma (1388 lines in modeling, 1257 in modular)
- gemma3 (1301 lines in modeling, 1184 in modular)
- siglip2 (1301 lines in modeling, 637 in modular)
- aria (1300 lines in modeling, 1665 in modular)
- minimax (1142 lines in modeling, 602 in modular)
- moonshine (1096 lines in modeling, 920 in modular)
- csm (1091 lines in modeling, 768 in modular)
```
+ many do not have a modular file (yet)
```bash
- seamless_m4t_v2 (4402 lines in modeling file)
- seamless_m4t (4073 lines in modeling file)
- perceiver (3406 lines in modeling file)
- speecht5 (3247 lines in modeling file)
- oneformer (3202 lines in modeling file)
- bigbird_pegasus (3031 lines in modeling file)
- big_bird (2960 lines in modeling file)
- reformer (2776 lines in modeling file)
- led (2536 lines in modeling file)
- moshi (2512 lines in modeling file)
- mask2former (2476 lines in modeling file)
- blip_2 (2459 lines in modeling file)
- mt5 (2457 lines in modeling file)
- musicgen (2439 lines in modeling file)
- t5 (2408 lines in modeling file)
- xlnet (2380 lines in modeling file)
- wav2vec2 (2360 lines in modeling file)
- tapas (2355 lines in modeling file)
- musicgen_melody (2273 lines in modeling file)
- auto (2241 lines in modeling file)
- longformer (2223 lines in modeling file)
- longt5 (2196 lines in modeling file)
- luke (2180 lines in modeling file)
- autoformer (2123 lines in modeling file)
- patchtsmixer (2122 lines in modeling file)
- prophetnet (2048 lines in modeling file)
- flava (2039 lines in modeling file)
- udop (2006 lines in modeling file)
``` | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39588/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39588/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39587 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39587/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39587/comments | https://api.github.com/repos/huggingface/transformers/issues/39587/events | https://github.com/huggingface/transformers/pull/39587 | 3,253,290,838 | PR_kwDOCUB6oc6gGr7s | 39,587 | fix(tokenization): check token.content for trie | {
"login": "pjo256",
"id": 2179163,
"node_id": "MDQ6VXNlcjIxNzkxNjM=",
"avatar_url": "https://avatars.githubusercontent.com/u/2179163?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pjo256",
"html_url": "https://github.com/pjo256",
"followers_url": "https://api.github.com/users/pjo256/followers",
"following_url": "https://api.github.com/users/pjo256/following{/other_user}",
"gists_url": "https://api.github.com/users/pjo256/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pjo256/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pjo256/subscriptions",
"organizations_url": "https://api.github.com/users/pjo256/orgs",
"repos_url": "https://api.github.com/users/pjo256/repos",
"events_url": "https://api.github.com/users/pjo256/events{/privacy}",
"received_events_url": "https://api.github.com/users/pjo256/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T16:14:53 | 2025-07-28T09:28:56 | 2025-07-28T09:28:56 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39587",
"html_url": "https://github.com/huggingface/transformers/pull/39587",
"diff_url": "https://github.com/huggingface/transformers/pull/39587.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39587.patch",
"merged_at": "2025-07-28T09:28:56"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes https://github.com/huggingface/transformers/issues/39586
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@ArthurZucker , @itazap
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39587/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39587/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39586 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39586/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39586/comments | https://api.github.com/repos/huggingface/transformers/issues/39586/events | https://github.com/huggingface/transformers/issues/39586 | 3,253,283,799 | I_kwDOCUB6oc7B6SvX | 39,586 | AddedToken should check content on `_update` | {
"login": "pjo256",
"id": 2179163,
"node_id": "MDQ6VXNlcjIxNzkxNjM=",
"avatar_url": "https://avatars.githubusercontent.com/u/2179163?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pjo256",
"html_url": "https://github.com/pjo256",
"followers_url": "https://api.github.com/users/pjo256/followers",
"following_url": "https://api.github.com/users/pjo256/following{/other_user}",
"gists_url": "https://api.github.com/users/pjo256/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pjo256/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pjo256/subscriptions",
"organizations_url": "https://api.github.com/users/pjo256/orgs",
"repos_url": "https://api.github.com/users/pjo256/repos",
"events_url": "https://api.github.com/users/pjo256/events{/privacy}",
"received_events_url": "https://api.github.com/users/pjo256/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-07-22T16:12:15 | 2025-07-28T09:28:57 | 2025-07-28T09:28:57 | CONTRIBUTOR | null | null | null | null | ### System Info
- `transformers` version: 4.54.0
- Platform: macOS-14.4-arm64-arm-64bit
- Python version: 3.11.12
- Huggingface_hub version: 0.33.4
- Safetensors version: 0.5.3
- Accelerate version: 1.9.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (accelerator?): 2.7.1 (NA)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: no
### Who can help?
@ArthurZucker, @itazap
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
Hey! While doing some benchmarking, I noticed that `_update_trie` will unnecessarily call `trie.add` for each token added to the tokenizer. When adding special tokens to large vocabularies, every `AddedToken` will be re-traversed in the trie despite just adding a few special tokens.
Compared to a run without the final `add_special_tokens` call, the simplified script below makes ~vocab_size additional `add` calls.
```python
import cProfile
import pstats
from transformers import BertTokenizer
tokenizer1 = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
tokenizer2 = BertTokenizer.from_pretrained("bert-base-cased")
tokens_to_add = set(tokenizer2.vocab).difference(tokenizer1.vocab)
def run():
    tokenizer1.add_tokens(list(tokens_to_add))  # len(tokenizer1.get_vocab()) ~= 130k
    tokenizer1.add_special_tokens({'additional_special_tokens': ['<|start_of_turn|>', '<|end_of_turn|>']})
profiler = cProfile.Profile()
profiler.enable()
run()
profiler.disable()
stats = pstats.Stats(profiler)
stats.sort_stats('cumtime') # Sort by cumulative time
stats.print_stats('Trie.add')
```
```
446404 function calls in 0.140 seconds #with add_special_tokens
335064 function calls in 0.101 seconds
```
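A minimal sketch of the idea behind the linked fix (the `Trie` below is a toy stand-in for the real tokenizer trie, and the content guard is the *proposed* behaviour, not the current implementation):

```python
class Trie:
    """Minimal character trie that counts how many times `add` is called."""
    def __init__(self):
        self.root = {}
        self.add_calls = 0

    def add(self, word):
        self.add_calls += 1
        node = self.root
        for ch in word:
            node = node.setdefault(ch, {})
        node[""] = True  # mark end of token

def update_trie_naive(trie, all_tokens):
    # Current behaviour sketched: every known token is re-added on each update.
    for tok in all_tokens:
        trie.add(tok)

def update_trie_guarded(trie, all_tokens, known):
    # Proposed behaviour: only add tokens whose content is not already present.
    for tok in all_tokens:
        if tok not in known:
            trie.add(tok)
            known.add(tok)

vocab = [f"tok{i}" for i in range(1000)]
specials = ["<|start_of_turn|>", "<|end_of_turn|>"]

naive = Trie()
update_trie_naive(naive, vocab)
update_trie_naive(naive, vocab + specials)  # re-adds all 1000 known tokens

guarded, known = Trie(), set()
update_trie_guarded(guarded, vocab, known)
update_trie_guarded(guarded, vocab + specials, known)  # adds only the 2 new tokens

print(naive.add_calls, guarded.add_calls)  # 2002 1002
```

With the guard, the second update only traverses the two genuinely new special tokens instead of the whole vocabulary.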
### Expected behavior
I'd expect to see fewer `add` calls when adding such a small number of special tokens | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39586/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39586/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39585 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39585/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39585/comments | https://api.github.com/repos/huggingface/transformers/issues/39585/events | https://github.com/huggingface/transformers/pull/39585 | 3,253,206,234 | PR_kwDOCUB6oc6gGZmB | 39,585 | [`Ernie 4.5`] Ernie VL models | {
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/followers",
"following_url": "https://api.github.com/users/vasqu/following{/other_user}",
"gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vasqu/subscriptions",
"organizations_url": "https://api.github.com/users/vasqu/orgs",
"repos_url": "https://api.github.com/users/vasqu/repos",
"events_url": "https://api.github.com/users/vasqu/events{/privacy}",
"received_events_url": "https://api.github.com/users/vasqu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-07-22T15:45:07 | 2025-10-29T17:41:22 | null | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39585",
"html_url": "https://github.com/huggingface/transformers/pull/39585",
"diff_url": "https://github.com/huggingface/transformers/pull/39585.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39585.patch",
"merged_at": null
} | Continuation of #39228 for the VL models
Current inference script for testing (torch 2.6):
```python
import requests
from PIL import Image
from transformers import AutoModelForImageTextToText, AutoProcessor
use_fast = False
model_path = "/raid/anton/code/forks/transformers/src/transformers/models/ernie4_5_vl/AntonV/ErnieVL"
processor_kwargs = {} if not use_fast else {"use_fast": True}
processor = AutoProcessor.from_pretrained(model_path, **processor_kwargs)
model = AutoModelForImageTextToText.from_pretrained(
model_path,
device_map="auto",
dtype="auto",
)
messages = [
{
"role": "user",
"content": [
{"type": "text", "text": "Only use English during your responses and describe the following image."},
{"type": "image"},
]
},
]
text = processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
image = Image.open(requests.get("https://paddlenlp.bj.bcebos.com/datasets/paddlemix/demo_images/example1.jpg", stream=True).raw)
inputs = processor(text=[text], images=[image], return_tensors="pt").to(model.device)
generated_ids = model.generate(
**inputs,
max_new_tokens=64,
do_sample=False,
)
print(processor.decode(generated_ids[0][len(inputs['input_ids'][0]):]))
```
Output:
`The image features a person sitting on a hilltop, gazing out at a vast mountain range. The person is wrapped in a colorful, striped blanket, and their head is covered with a red headscarf. The foreground includes vibrant pink flowers, adding a pop of color to the scene. The background show`
Left TODOs:
- [ ] Integration tests
- [ ] Handle font (loading) in a proper way / torch only version (?)
- [ ] Make it optional feature with warnings on compile
- [ ] Converted models on the hub | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39585/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 2,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39585/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39584 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39584/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39584/comments | https://api.github.com/repos/huggingface/transformers/issues/39584/events | https://github.com/huggingface/transformers/pull/39584 | 3,253,144,043 | PR_kwDOCUB6oc6gGLzA | 39,584 | Generic task-specific base classes | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T15:27:20 | 2025-07-23T08:49:50 | 2025-07-23T08:49:47 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39584",
"html_url": "https://github.com/huggingface/transformers/pull/39584",
"diff_url": "https://github.com/huggingface/transformers/pull/39584.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39584.patch",
"merged_at": "2025-07-23T08:49:47"
} | # What does this PR do?
As per the title. The easiest and most general approach is to use abstract classes where everything is defined, similar to a mixin. That way we don't need to redefine the same thing every time, but it still appears explicitly in the modeling code, and we have no conflicts with the current classes in the public imports.
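A rough sketch of the mixin-style pattern this describes (the class names and the dummy pooling/head below are purely illustrative, not the actual implementation):

```python
from abc import ABC

class GenericForSequenceClassification(ABC):
    """Abstract base carrying the shared task wiring; concrete models only
    declare which backbone class to instantiate."""
    base_model_class = None  # set by each concrete model

    def __init__(self, config):
        self.config = config
        self.model = self.base_model_class(config)  # shared wiring, defined once

    def forward(self, hidden):
        pooled = sum(hidden) / len(hidden)           # dummy pooling
        return [pooled] * self.config["num_labels"]  # dummy classification head

class DummyBackbone:
    def __init__(self, config):
        self.config = config

class DummyForSequenceClassification(GenericForSequenceClassification):
    # The task class still appears explicitly in the modeling file,
    # but inherits all of its behaviour from the generic base.
    base_model_class = DummyBackbone

model = DummyForSequenceClassification({"num_labels": 2})
print(model.forward([1.0, 3.0]))  # [2.0, 2.0]
```

Each modeling file then only carries the one-line subclass, while the shared logic lives in a single place.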
On the road for the great unbloating!
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39584/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39584/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39583 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39583/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39583/comments | https://api.github.com/repos/huggingface/transformers/issues/39583/events | https://github.com/huggingface/transformers/pull/39583 | 3,253,084,981 | PR_kwDOCUB6oc6gF-qB | 39,583 | Update recent processors for vLLM backend | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T15:10:24 | 2025-07-24T08:29:28 | 2025-07-24T08:29:28 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39583",
"html_url": "https://github.com/huggingface/transformers/pull/39583",
"diff_url": "https://github.com/huggingface/transformers/pull/39583.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39583.patch",
"merged_at": "2025-07-24T08:29:28"
} | # What does this PR do?
As per the title: the models needed tiny changes here and there | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39583/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39583/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39582 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39582/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39582/comments | https://api.github.com/repos/huggingface/transformers/issues/39582/events | https://github.com/huggingface/transformers/pull/39582 | 3,252,804,019 | PR_kwDOCUB6oc6gFCBm | 39,582 | [attention] fix test for packed padfree masking | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T13:55:14 | 2025-07-25T07:44:52 | 2025-07-25T07:44:52 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39582",
"html_url": "https://github.com/huggingface/transformers/pull/39582",
"diff_url": "https://github.com/huggingface/transformers/pull/39582.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39582.patch",
"merged_at": "2025-07-25T07:44:52"
} | # What does this PR do?
As per the title: I added these tests as slow a few days ago so that I could fix them later for all models.
| {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39582/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39582/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39581 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39581/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39581/comments | https://api.github.com/repos/huggingface/transformers/issues/39581/events | https://github.com/huggingface/transformers/pull/39581 | 3,252,617,951 | PR_kwDOCUB6oc6gEY1Q | 39,581 | fix moe routing_weights | {
"login": "llbdyiu66",
"id": 125861386,
"node_id": "U_kgDOB4B-Cg",
"avatar_url": "https://avatars.githubusercontent.com/u/125861386?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/llbdyiu66",
"html_url": "https://github.com/llbdyiu66",
"followers_url": "https://api.github.com/users/llbdyiu66/followers",
"following_url": "https://api.github.com/users/llbdyiu66/following{/other_user}",
"gists_url": "https://api.github.com/users/llbdyiu66/gists{/gist_id}",
"starred_url": "https://api.github.com/users/llbdyiu66/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/llbdyiu66/subscriptions",
"organizations_url": "https://api.github.com/users/llbdyiu66/orgs",
"repos_url": "https://api.github.com/users/llbdyiu66/repos",
"events_url": "https://api.github.com/users/llbdyiu66/events{/privacy}",
"received_events_url": "https://api.github.com/users/llbdyiu66/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T13:05:28 | 2025-07-23T11:20:57 | 2025-07-23T11:20:23 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39581",
"html_url": "https://github.com/huggingface/transformers/pull/39581",
"diff_url": "https://github.com/huggingface/transformers/pull/39581.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39581.patch",
"merged_at": "2025-07-23T11:20:23"
About `Ernie4_5_MoEStatics`: `e_score_correction_bias` should only affect `selected_experts`, not `routing_weights`. | {
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/followers",
"following_url": "https://api.github.com/users/vasqu/following{/other_user}",
"gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vasqu/subscriptions",
"organizations_url": "https://api.github.com/users/vasqu/orgs",
"repos_url": "https://api.github.com/users/vasqu/repos",
"events_url": "https://api.github.com/users/vasqu/events{/privacy}",
"received_events_url": "https://api.github.com/users/vasqu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39581/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39581/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39580 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39580/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39580/comments | https://api.github.com/repos/huggingface/transformers/issues/39580/events | https://github.com/huggingface/transformers/pull/39580 | 3,252,365,243 | PR_kwDOCUB6oc6gDhjs | 39,580 | Torchdec RuntimeError catch | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T11:53:06 | 2025-07-22T16:35:04 | 2025-07-22T16:35:03 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39580",
"html_url": "https://github.com/huggingface/transformers/pull/39580",
"diff_url": "https://github.com/huggingface/transformers/pull/39580.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39580.patch",
"merged_at": "2025-07-22T16:35:03"
} | # What does this PR do?
This PR fixes an issue with torchcodec in print_env.py. We forgot to handle the case where torchcodec raises a `RuntimeError`. Currently, some of our training [CI](https://github.com/huggingface/transformers/actions/runs/16433389284/job/46438805785) runs are not working because of that. | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39580/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39580/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39579 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39579/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39579/comments | https://api.github.com/repos/huggingface/transformers/issues/39579/events | https://github.com/huggingface/transformers/pull/39579 | 3,252,330,032 | PR_kwDOCUB6oc6gDZ3B | 39,579 | General weight initialization scheme | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T11:41:41 | 2025-07-22T14:04:22 | 2025-07-22T14:04:20 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39579",
"html_url": "https://github.com/huggingface/transformers/pull/39579",
"diff_url": "https://github.com/huggingface/transformers/pull/39579.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39579.patch",
"merged_at": "2025-07-22T14:04:20"
} | # What does this PR do?
As per the title. Models with a standard init scheme should not always have to redefine it. This cleans up quite a lot of code in the modelings. | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39579/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39579/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39578 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39578/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39578/comments | https://api.github.com/repos/huggingface/transformers/issues/39578/events | https://github.com/huggingface/transformers/pull/39578 | 3,252,206,168 | PR_kwDOCUB6oc6gC-dx | 39,578 | 🌐 [i18n-KO] Translated `tvp.md` to Korean | {
"login": "Kim-Ju-won",
"id": 81630351,
"node_id": "MDQ6VXNlcjgxNjMwMzUx",
"avatar_url": "https://avatars.githubusercontent.com/u/81630351?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Kim-Ju-won",
"html_url": "https://github.com/Kim-Ju-won",
"followers_url": "https://api.github.com/users/Kim-Ju-won/followers",
"following_url": "https://api.github.com/users/Kim-Ju-won/following{/other_user}",
"gists_url": "https://api.github.com/users/Kim-Ju-won/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Kim-Ju-won/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Kim-Ju-won/subscriptions",
"organizations_url": "https://api.github.com/users/Kim-Ju-won/orgs",
"repos_url": "https://api.github.com/users/Kim-Ju-won/repos",
"events_url": "https://api.github.com/users/Kim-Ju-won/events{/privacy}",
"received_events_url": "https://api.github.com/users/Kim-Ju-won/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T11:00:02 | 2025-07-29T15:04:00 | 2025-07-29T15:04:00 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39578",
"html_url": "https://github.com/huggingface/transformers/pull/39578",
"diff_url": "https://github.com/huggingface/transformers/pull/39578.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39578.patch",
"merged_at": "2025-07-29T15:04:00"
} | # What does this PR do?
Translated the `tvp.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
@4N3MONE, @yijun-lee, @jungnerd, @harheem
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
@stevhliu | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39578/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/39578/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39577 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39577/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39577/comments | https://api.github.com/repos/huggingface/transformers/issues/39577/events | https://github.com/huggingface/transformers/pull/39577 | 3,252,199,257 | PR_kwDOCUB6oc6gC87a | 39,577 | 🌐 [i18n-KO] Translated `pipelines.md` to Korean | {
"login": "xhaktm00",
"id": 153787023,
"node_id": "U_kgDOCSqajw",
"avatar_url": "https://avatars.githubusercontent.com/u/153787023?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xhaktm00",
"html_url": "https://github.com/xhaktm00",
"followers_url": "https://api.github.com/users/xhaktm00/followers",
"following_url": "https://api.github.com/users/xhaktm00/following{/other_user}",
"gists_url": "https://api.github.com/users/xhaktm00/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xhaktm00/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xhaktm00/subscriptions",
"organizations_url": "https://api.github.com/users/xhaktm00/orgs",
"repos_url": "https://api.github.com/users/xhaktm00/repos",
"events_url": "https://api.github.com/users/xhaktm00/events{/privacy}",
"received_events_url": "https://api.github.com/users/xhaktm00/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T10:57:46 | 2025-08-13T17:26:18 | 2025-08-13T17:26:18 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39577",
"html_url": "https://github.com/huggingface/transformers/pull/39577",
"diff_url": "https://github.com/huggingface/transformers/pull/39577.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39577.patch",
"merged_at": "2025-08-13T17:26:17"
} | # What does this PR do?
Translated the `pipelines.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
May you please review this PR?
@4N3MONE, @yijun-lee, @jungnerd, @harheem
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review? (Final)
@stevhliu May you please review this PR? | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39577/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/39577/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39576 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39576/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39576/comments | https://api.github.com/repos/huggingface/transformers/issues/39576/events | https://github.com/huggingface/transformers/pull/39576 | 3,252,078,771 | PR_kwDOCUB6oc6gCiUp | 39,576 | Fix important models CI | {
"login": "molbap",
"id": 39954772,
"node_id": "MDQ6VXNlcjM5OTU0Nzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/molbap",
"html_url": "https://github.com/molbap",
"followers_url": "https://api.github.com/users/molbap/followers",
"following_url": "https://api.github.com/users/molbap/following{/other_user}",
"gists_url": "https://api.github.com/users/molbap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/molbap/subscriptions",
"organizations_url": "https://api.github.com/users/molbap/orgs",
"repos_url": "https://api.github.com/users/molbap/repos",
"events_url": "https://api.github.com/users/molbap/events{/privacy}",
"received_events_url": "https://api.github.com/users/molbap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T10:19:46 | 2025-07-23T14:24:30 | 2025-07-23T14:24:29 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39576",
"html_url": "https://github.com/huggingface/transformers/pull/39576",
"diff_url": "https://github.com/huggingface/transformers/pull/39576.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39576.patch",
"merged_at": "2025-07-23T14:24:29"
} | # What does this PR do?
Minor. Fixes a few tests currently failing in the important-models test action, on llama/gemma/mistral/mixtral. | {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39576/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39576/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39575 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39575/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39575/comments | https://api.github.com/repos/huggingface/transformers/issues/39575/events | https://github.com/huggingface/transformers/pull/39575 | 3,252,018,510 | PR_kwDOCUB6oc6gCUs1 | 39,575 | 🌐 [i18n-KO] Translated `vitpose.md` to Korean | {
"login": "jihyun-0611",
"id": 78160653,
"node_id": "MDQ6VXNlcjc4MTYwNjUz",
"avatar_url": "https://avatars.githubusercontent.com/u/78160653?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jihyun-0611",
"html_url": "https://github.com/jihyun-0611",
"followers_url": "https://api.github.com/users/jihyun-0611/followers",
"following_url": "https://api.github.com/users/jihyun-0611/following{/other_user}",
"gists_url": "https://api.github.com/users/jihyun-0611/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jihyun-0611/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jihyun-0611/subscriptions",
"organizations_url": "https://api.github.com/users/jihyun-0611/orgs",
"repos_url": "https://api.github.com/users/jihyun-0611/repos",
"events_url": "https://api.github.com/users/jihyun-0611/events{/privacy}",
"received_events_url": "https://api.github.com/users/jihyun-0611/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-07-22T10:05:09 | 2025-07-31T12:13:32 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39575",
"html_url": "https://github.com/huggingface/transformers/pull/39575",
"diff_url": "https://github.com/huggingface/transformers/pull/39575.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39575.patch",
"merged_at": null
} | <!-- Please title the PR "🌐 [i18n-KO] Translated `<your_file>.md` to Korean" -->
# What does this PR do?
Translated the `vitpose.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Please reveal the comment below requesting a review from the KREW team members only after all the checks above are complete! -->
May you please review this PR?
@4N3MONE, @yijun-lee, @jungnerd, @harheem
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. Please reveal the comment below after the KREW team members have finished their review! -->
@stevhliu May you please review this PR? | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39575/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/39575/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39574 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39574/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39574/comments | https://api.github.com/repos/huggingface/transformers/issues/39574/events | https://github.com/huggingface/transformers/pull/39574 | 3,251,763,236 | PR_kwDOCUB6oc6gBcLj | 39,574 | feat(autoformer): Improve ValueError for insufficient sequence length | {
"login": "rev2607",
"id": 117919399,
"node_id": "U_kgDOBwdOpw",
"avatar_url": "https://avatars.githubusercontent.com/u/117919399?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rev2607",
"html_url": "https://github.com/rev2607",
"followers_url": "https://api.github.com/users/rev2607/followers",
"following_url": "https://api.github.com/users/rev2607/following{/other_user}",
"gists_url": "https://api.github.com/users/rev2607/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rev2607/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rev2607/subscriptions",
"organizations_url": "https://api.github.com/users/rev2607/orgs",
"repos_url": "https://api.github.com/users/rev2607/repos",
"events_url": "https://api.github.com/users/rev2607/events{/privacy}",
"received_events_url": "https://api.github.com/users/rev2607/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T08:54:38 | 2025-09-05T04:00:17 | 2025-08-12T12:19:49 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39574",
"html_url": "https://github.com/huggingface/transformers/pull/39574",
"diff_url": "https://github.com/huggingface/transformers/pull/39574.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39574.patch",
"merged_at": null
} | ### What does this PR do?
This PR addresses an issue where the `AutoformerModel` raises a confusing `ValueError` when the input `past_values` sequence is too short to create the necessary lag features. The previous error message was not informative, making it difficult for users to debug their input shapes.
This change replaces the old validation check with a new one that:
1. Calculates the minimum required sequence length based on the model's configuration (`context_length`, `prediction_length`, and `lags_sequence`).
2. Raises a clear, descriptive `ValueError` that explicitly tells the user the required input length versus the length they provided, making the issue easy to fix.
This significantly improves the user experience for anyone using the Autoformer model.
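The improved check can be sketched roughly as follows. This is an illustrative approximation, not the code from the PR: the attribute names (`context_length`, `lags_sequence`) mirror the Autoformer config, and the exact formula in the PR may also account for `prediction_length`.

```python
# Hypothetical sketch of the validation described above (not the actual PR code).
def check_past_values_length(past_values_len, context_length, lags_sequence):
    # The model needs enough history to build every lagged feature:
    # the largest lag looks max(lags_sequence) steps further into the past.
    min_required = context_length + max(lags_sequence)
    if past_values_len < min_required:
        raise ValueError(
            f"`past_values` is too short: the model requires at least "
            f"{min_required} time steps (context_length={context_length} + "
            f"max lag={max(lags_sequence)}), but got {past_values_len}."
        )

# A clear, actionable error instead of an opaque shape mismatch:
try:
    check_past_values_length(past_values_len=20, context_length=24, lags_sequence=[1, 2, 3, 7])
except ValueError as e:
    print(e)
```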
---
Fixes #39460
### Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
### Who can review?
@kashif (since you commented on the original issue) | {
"login": "rev2607",
"id": 117919399,
"node_id": "U_kgDOBwdOpw",
"avatar_url": "https://avatars.githubusercontent.com/u/117919399?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rev2607",
"html_url": "https://github.com/rev2607",
"followers_url": "https://api.github.com/users/rev2607/followers",
"following_url": "https://api.github.com/users/rev2607/following{/other_user}",
"gists_url": "https://api.github.com/users/rev2607/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rev2607/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rev2607/subscriptions",
"organizations_url": "https://api.github.com/users/rev2607/orgs",
"repos_url": "https://api.github.com/users/rev2607/repos",
"events_url": "https://api.github.com/users/rev2607/events{/privacy}",
"received_events_url": "https://api.github.com/users/rev2607/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39574/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39574/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39573 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39573/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39573/comments | https://api.github.com/repos/huggingface/transformers/issues/39573/events | https://github.com/huggingface/transformers/pull/39573 | 3,251,684,938 | PR_kwDOCUB6oc6gBK-Y | 39,573 | xpu optimization for generation case | {
"login": "sywangyi",
"id": 36058628,
"node_id": "MDQ6VXNlcjM2MDU4NjI4",
"avatar_url": "https://avatars.githubusercontent.com/u/36058628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sywangyi",
"html_url": "https://github.com/sywangyi",
"followers_url": "https://api.github.com/users/sywangyi/followers",
"following_url": "https://api.github.com/users/sywangyi/following{/other_user}",
"gists_url": "https://api.github.com/users/sywangyi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sywangyi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sywangyi/subscriptions",
"organizations_url": "https://api.github.com/users/sywangyi/orgs",
"repos_url": "https://api.github.com/users/sywangyi/repos",
"events_url": "https://api.github.com/users/sywangyi/events{/privacy}",
"received_events_url": "https://api.github.com/users/sywangyi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T08:32:43 | 2025-07-28T09:34:58 | 2025-07-28T09:34:58 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39573",
"html_url": "https://github.com/huggingface/transformers/pull/39573",
"diff_url": "https://github.com/huggingface/transformers/pull/39573.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39573.patch",
"merged_at": "2025-07-28T09:34:58"
} | null | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39573/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39573/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39572 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39572/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39572/comments | https://api.github.com/repos/huggingface/transformers/issues/39572/events | https://github.com/huggingface/transformers/pull/39572 | 3,251,372,906 | PR_kwDOCUB6oc6gAGBj | 39,572 | fix(voxtral): correct typo in apply_transcription_request | {
"login": "rev2607",
"id": 117919399,
"node_id": "U_kgDOBwdOpw",
"avatar_url": "https://avatars.githubusercontent.com/u/117919399?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rev2607",
"html_url": "https://github.com/rev2607",
"followers_url": "https://api.github.com/users/rev2607/followers",
"following_url": "https://api.github.com/users/rev2607/following{/other_user}",
"gists_url": "https://api.github.com/users/rev2607/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rev2607/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rev2607/subscriptions",
"organizations_url": "https://api.github.com/users/rev2607/orgs",
"repos_url": "https://api.github.com/users/rev2607/repos",
"events_url": "https://api.github.com/users/rev2607/events{/privacy}",
"received_events_url": "https://api.github.com/users/rev2607/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 6470596964,
"node_id": "LA_kwDOCUB6oc8AAAABga15ZA",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Audio",
"name": "Audio",
"color": "760453",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-07-22T06:56:37 | 2025-07-25T12:09:44 | 2025-07-25T12:09:44 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39572",
"html_url": "https://github.com/huggingface/transformers/pull/39572",
"diff_url": "https://github.com/huggingface/transformers/pull/39572.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39572.patch",
"merged_at": "2025-07-25T12:09:44"
} | # What does this PR do?
Fixes a typo in the method name `apply_transcrition_request` → `apply_transcription_request` inside `src/transformers/models/voxtral/processing_voxtral.py`.
This PR addresses [issue #39530](https://github.com/huggingface/transformers/issues/39530). The method was misspelled and is now corrected for clarity and consistency. This change may also require updates in model cards if this method is referenced there.
Fixes #39530
## Before submitting
- [x] This PR fixes a typo or improves the docs.
- [x] I have read the [contributor guidelines](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request).
- [x] This was discussed/approved via [issue #39530](https://github.com/huggingface/transformers/issues/39530).
- [ ] No new documentation or tests were required for this minor change.
## Who can review?
Speech models: @eustlb
| {
"login": "eustlb",
"id": 94853470,
"node_id": "U_kgDOBadZXg",
"avatar_url": "https://avatars.githubusercontent.com/u/94853470?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eustlb",
"html_url": "https://github.com/eustlb",
"followers_url": "https://api.github.com/users/eustlb/followers",
"following_url": "https://api.github.com/users/eustlb/following{/other_user}",
"gists_url": "https://api.github.com/users/eustlb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eustlb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eustlb/subscriptions",
"organizations_url": "https://api.github.com/users/eustlb/orgs",
"repos_url": "https://api.github.com/users/eustlb/repos",
"events_url": "https://api.github.com/users/eustlb/events{/privacy}",
"received_events_url": "https://api.github.com/users/eustlb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39572/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 2,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39572/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39571 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39571/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39571/comments | https://api.github.com/repos/huggingface/transformers/issues/39571/events | https://github.com/huggingface/transformers/pull/39571 | 3,251,077,162 | PR_kwDOCUB6oc6f_FnF | 39,571 | 🌐 [i18n-KO] Translated `auto_docstring.md` to Korean | {
"login": "chelsseeey",
"id": 152389483,
"node_id": "U_kgDOCRVHaw",
"avatar_url": "https://avatars.githubusercontent.com/u/152389483?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chelsseeey",
"html_url": "https://github.com/chelsseeey",
"followers_url": "https://api.github.com/users/chelsseeey/followers",
"following_url": "https://api.github.com/users/chelsseeey/following{/other_user}",
"gists_url": "https://api.github.com/users/chelsseeey/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chelsseeey/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chelsseeey/subscriptions",
"organizations_url": "https://api.github.com/users/chelsseeey/orgs",
"repos_url": "https://api.github.com/users/chelsseeey/repos",
"events_url": "https://api.github.com/users/chelsseeey/events{/privacy}",
"received_events_url": "https://api.github.com/users/chelsseeey/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T05:05:05 | 2025-08-12T10:10:17 | 2025-08-12T10:10:16 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39571",
"html_url": "https://github.com/huggingface/transformers/pull/39571",
"diff_url": "https://github.com/huggingface/transformers/pull/39571.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39571.patch",
"merged_at": null
} | # What does this PR do?
Translated the `auto_docstring.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
May you please review this PR?
@4N3MONE, @yijun-lee, @jungnerd , @harheem
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. KREW 팀원들의 리뷰가 끝난 후에 아래 주석을 노출해주세요! -->
<!-- @stevhliu May you please review this PR? -->
| {
"login": "chelsseeey",
"id": 152389483,
"node_id": "U_kgDOCRVHaw",
"avatar_url": "https://avatars.githubusercontent.com/u/152389483?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chelsseeey",
"html_url": "https://github.com/chelsseeey",
"followers_url": "https://api.github.com/users/chelsseeey/followers",
"following_url": "https://api.github.com/users/chelsseeey/following{/other_user}",
"gists_url": "https://api.github.com/users/chelsseeey/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chelsseeey/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chelsseeey/subscriptions",
"organizations_url": "https://api.github.com/users/chelsseeey/orgs",
"repos_url": "https://api.github.com/users/chelsseeey/repos",
"events_url": "https://api.github.com/users/chelsseeey/events{/privacy}",
"received_events_url": "https://api.github.com/users/chelsseeey/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39571/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/39571/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39570 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39570/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39570/comments | https://api.github.com/repos/huggingface/transformers/issues/39570/events | https://github.com/huggingface/transformers/pull/39570 | 3,251,077,000 | PR_kwDOCUB6oc6f_Fk5 | 39,570 | Fix Qwen2-VL image/video processor legacy size field handling | {
"login": "Isotr0py",
"id": 41363108,
"node_id": "MDQ6VXNlcjQxMzYzMTA4",
"avatar_url": "https://avatars.githubusercontent.com/u/41363108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Isotr0py",
"html_url": "https://github.com/Isotr0py",
"followers_url": "https://api.github.com/users/Isotr0py/followers",
"following_url": "https://api.github.com/users/Isotr0py/following{/other_user}",
"gists_url": "https://api.github.com/users/Isotr0py/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Isotr0py/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Isotr0py/subscriptions",
"organizations_url": "https://api.github.com/users/Isotr0py/orgs",
"repos_url": "https://api.github.com/users/Isotr0py/repos",
"events_url": "https://api.github.com/users/Isotr0py/events{/privacy}",
"received_events_url": "https://api.github.com/users/Isotr0py/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T05:05:00 | 2025-07-25T02:56:20 | 2025-07-24T09:30:23 | COLLABORATOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39570",
"html_url": "https://github.com/huggingface/transformers/pull/39570",
"diff_url": "https://github.com/huggingface/transformers/pull/39570.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39570.patch",
"merged_at": null
} | # What does this PR do?
- Fixes https://github.com/vllm-project/vllm/issues/15614#issuecomment-3097032491
Some Qwen2-VL based models, such as `ByteDance-Seed/UI-TARS-2B-SFT`, have a legacy `size` field in `preprocessor.json` like:
```json
"size": {
    "max_pixels": 2116800,
    "min_pixels": 3136
}
```
which raises `ValueError: size must contain 'shortest_edge' and 'longest_edge' keys.` on v4.52 and later.
This PR adds backwards compatibility for this kind of `size` field.
**Code to reproduce**
```python
from transformers import AutoProcessor
processor = AutoProcessor.from_pretrained("ByteDance-Seed/UI-TARS-2B-SFT")
print(processor)
processor = AutoProcessor.from_pretrained("ByteDance-Seed/UI-TARS-2B-SFT", use_fast=True)
print(processor)
```
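For reference, the compatibility shim can be thought of along these lines (a minimal sketch, not the actual patch; the helper name and the direct mapping of the legacy pixel budgets onto the new keys are my assumptions):

```python
def normalize_legacy_size(size: dict) -> dict:
    # Hypothetical helper illustrating the backwards-compat idea: expose the
    # legacy {min_pixels, max_pixels} budgets under the {shortest_edge,
    # longest_edge} keys that the post-v4.52 validation expects.
    if "shortest_edge" in size and "longest_edge" in size:
        return size  # already in the new format, nothing to do
    if "min_pixels" in size and "max_pixels" in size:
        return {
            "shortest_edge": size["min_pixels"],
            "longest_edge": size["max_pixels"],
        }
    raise ValueError("size must contain 'shortest_edge' and 'longest_edge' keys.")


print(normalize_legacy_size({"max_pixels": 2116800, "min_pixels": 3136}))
```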
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Isotr0py",
"id": 41363108,
"node_id": "MDQ6VXNlcjQxMzYzMTA4",
"avatar_url": "https://avatars.githubusercontent.com/u/41363108?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Isotr0py",
"html_url": "https://github.com/Isotr0py",
"followers_url": "https://api.github.com/users/Isotr0py/followers",
"following_url": "https://api.github.com/users/Isotr0py/following{/other_user}",
"gists_url": "https://api.github.com/users/Isotr0py/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Isotr0py/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Isotr0py/subscriptions",
"organizations_url": "https://api.github.com/users/Isotr0py/orgs",
"repos_url": "https://api.github.com/users/Isotr0py/repos",
"events_url": "https://api.github.com/users/Isotr0py/events{/privacy}",
"received_events_url": "https://api.github.com/users/Isotr0py/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39570/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39570/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39569 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39569/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39569/comments | https://api.github.com/repos/huggingface/transformers/issues/39569/events | https://github.com/huggingface/transformers/pull/39569 | 3,251,048,958 | PR_kwDOCUB6oc6f-_XK | 39,569 | [i18n-KO] Translated `<auto_docstring>.md` to Korean | {
"login": "chelsseeey",
"id": 152389483,
"node_id": "U_kgDOCRVHaw",
"avatar_url": "https://avatars.githubusercontent.com/u/152389483?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chelsseeey",
"html_url": "https://github.com/chelsseeey",
"followers_url": "https://api.github.com/users/chelsseeey/followers",
"following_url": "https://api.github.com/users/chelsseeey/following{/other_user}",
"gists_url": "https://api.github.com/users/chelsseeey/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chelsseeey/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chelsseeey/subscriptions",
"organizations_url": "https://api.github.com/users/chelsseeey/orgs",
"repos_url": "https://api.github.com/users/chelsseeey/repos",
"events_url": "https://api.github.com/users/chelsseeey/events{/privacy}",
"received_events_url": "https://api.github.com/users/chelsseeey/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T04:52:56 | 2025-07-22T04:57:20 | 2025-07-22T04:57:19 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39569",
"html_url": "https://github.com/huggingface/transformers/pull/39569",
"diff_url": "https://github.com/huggingface/transformers/pull/39569.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39569.patch",
"merged_at": null
} | # What does this PR do?
Translated the `<auto_docstring>.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
May you please review this PR?
@4N3MONE, @yijun-lee, @jungnerd , @harheem
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. KREW 팀원들의 리뷰가 끝난 후에 아래 주석을 노출해주세요! -->
<!-- @stevhliu May you please review this PR? --> | {
"login": "chelsseeey",
"id": 152389483,
"node_id": "U_kgDOCRVHaw",
"avatar_url": "https://avatars.githubusercontent.com/u/152389483?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chelsseeey",
"html_url": "https://github.com/chelsseeey",
"followers_url": "https://api.github.com/users/chelsseeey/followers",
"following_url": "https://api.github.com/users/chelsseeey/following{/other_user}",
"gists_url": "https://api.github.com/users/chelsseeey/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chelsseeey/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chelsseeey/subscriptions",
"organizations_url": "https://api.github.com/users/chelsseeey/orgs",
"repos_url": "https://api.github.com/users/chelsseeey/repos",
"events_url": "https://api.github.com/users/chelsseeey/events{/privacy}",
"received_events_url": "https://api.github.com/users/chelsseeey/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39569/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39569/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39568 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39568/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39568/comments | https://api.github.com/repos/huggingface/transformers/issues/39568/events | https://github.com/huggingface/transformers/pull/39568 | 3,251,041,797 | PR_kwDOCUB6oc6f-9yi | 39,568 | Feature/standardize opt model card | {
"login": "JoestarGagan",
"id": 151487971,
"node_id": "U_kgDOCQeF4w",
"avatar_url": "https://avatars.githubusercontent.com/u/151487971?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JoestarGagan",
"html_url": "https://github.com/JoestarGagan",
"followers_url": "https://api.github.com/users/JoestarGagan/followers",
"following_url": "https://api.github.com/users/JoestarGagan/following{/other_user}",
"gists_url": "https://api.github.com/users/JoestarGagan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JoestarGagan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JoestarGagan/subscriptions",
"organizations_url": "https://api.github.com/users/JoestarGagan/orgs",
"repos_url": "https://api.github.com/users/JoestarGagan/repos",
"events_url": "https://api.github.com/users/JoestarGagan/events{/privacy}",
"received_events_url": "https://api.github.com/users/JoestarGagan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T04:49:05 | 2025-07-23T17:57:48 | 2025-07-23T17:57:48 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39568",
"html_url": "https://github.com/huggingface/transformers/pull/39568",
"diff_url": "https://github.com/huggingface/transformers/pull/39568.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39568.patch",
"merged_at": "2025-07-23T17:57:48"
} | # What does this PR do?
This PR standardizes and enhances the documentation for the OPT model card (`docs/source/en/model_doc/opt.md`). The goal is to align it with the updated model card template (as discussed in the model cards issue #36979).
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
* **No, this was a self-initiated effort to standardize the OPT model card based on the general contribution guidelines for documentation, as it is a straightforward improvement.**
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
* **N/A - This is a documentation-only change to a markdown file.**
## Key changes included:
* Aligned the model card with the current Hugging Face template.
* Integrated new details regarding the quantization capabilities.
* Added specific, verified notes to clarify model behavior.
* Successfully ran `make style` and `make docs` locally to ensure formatting and documentation build integrity.
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@stevhliu (as this is a documentation update)
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39568/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39568/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39567 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39567/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39567/comments | https://api.github.com/repos/huggingface/transformers/issues/39567/events | https://github.com/huggingface/transformers/issues/39567 | 3,250,573,382 | I_kwDOCUB6oc7Bv9BG | 39,567 | Clarification on Recent Changes to Loss and Gradient Accumulation | {
"login": "jiosephlee",
"id": 43046526,
"node_id": "MDQ6VXNlcjQzMDQ2NTI2",
"avatar_url": "https://avatars.githubusercontent.com/u/43046526?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jiosephlee",
"html_url": "https://github.com/jiosephlee",
"followers_url": "https://api.github.com/users/jiosephlee/followers",
"following_url": "https://api.github.com/users/jiosephlee/following{/other_user}",
"gists_url": "https://api.github.com/users/jiosephlee/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jiosephlee/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jiosephlee/subscriptions",
"organizations_url": "https://api.github.com/users/jiosephlee/orgs",
"repos_url": "https://api.github.com/users/jiosephlee/repos",
"events_url": "https://api.github.com/users/jiosephlee/events{/privacy}",
"received_events_url": "https://api.github.com/users/jiosephlee/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-22T01:06:09 | 2025-09-02T12:35:38 | 2025-07-22T12:00:49 | NONE | null | null | null | null | Hi!
I've been loosely following the recent conversations on bugs/issues such as https://github.com/huggingface/transformers/pull/34198 and https://github.com/huggingface/transformers/pull/34191. As a lay user, it's not entirely clear to me what the issue is.
To home in on specific questions: as someone who wants to use Trainer with a custom loss function, I'm concerned that there are numerous factors I need to account for, which leads to the following questions:
1. When providing a custom `compute_loss_func`, is the expectation to divide the loss by `num_items_in_batch` (batch size × gradient accumulation steps)? To confirm my understanding: gradient accumulation works by breaking each step over the effective batch size into smaller micro-steps, so the outputs and labels passed in already account for gradient accumulation, and the loss just needs to be divided by `num_items_in_batch`.
2. I'm seeing open issues, such as https://github.com/huggingface/transformers/issues/38837, which appear to be specific to the last step, when the remaining batches don't divide evenly into the gradient accumulation steps. Is this still an issue?
3. Do any of these dynamics change in a multi-GPU setup?
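To make question 1 concrete, here is the arithmetic I'm assuming the Trainer expects (a toy sketch, not real Trainer code; `scale_loss` is just my name for the division):

```python
def scale_loss(summed_token_loss: float, num_items_in_batch: int) -> float:
    # My understanding of the contract: the custom loss returns the SUM of
    # per-item losses divided by the total item count across ALL accumulation
    # steps, rather than a per-micro-batch mean.
    return summed_token_loss / num_items_in_batch


# Two micro-batches (summed losses 6.0 and 10.0, covering 8 items total)
# should accumulate to the same value as one full batch of 8 items:
losses = [6.0, 10.0]
num_items = 8
accumulated = sum(scale_loss(l, num_items) for l in losses)
full_batch = (6.0 + 10.0) / num_items
assert accumulated == full_batch
```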
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39567/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39567/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39566 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39566/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39566/comments | https://api.github.com/repos/huggingface/transformers/issues/39566/events | https://github.com/huggingface/transformers/pull/39566 | 3,250,171,289 | PR_kwDOCUB6oc6f79ol | 39,566 | Ignore decoder_inputs_embeds when decoder_input_ids are present (fixes #39542) | {
"login": "JoseAlvarezMedina",
"id": 62663562,
"node_id": "MDQ6VXNlcjYyNjYzNTYy",
"avatar_url": "https://avatars.githubusercontent.com/u/62663562?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JoseAlvarezMedina",
"html_url": "https://github.com/JoseAlvarezMedina",
"followers_url": "https://api.github.com/users/JoseAlvarezMedina/followers",
"following_url": "https://api.github.com/users/JoseAlvarezMedina/following{/other_user}",
"gists_url": "https://api.github.com/users/JoseAlvarezMedina/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JoseAlvarezMedina/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JoseAlvarezMedina/subscriptions",
"organizations_url": "https://api.github.com/users/JoseAlvarezMedina/orgs",
"repos_url": "https://api.github.com/users/JoseAlvarezMedina/repos",
"events_url": "https://api.github.com/users/JoseAlvarezMedina/events{/privacy}",
"received_events_url": "https://api.github.com/users/JoseAlvarezMedina/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-21T22:09:47 | 2025-07-22T11:54:55 | 2025-07-22T11:54:55 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39566",
"html_url": "https://github.com/huggingface/transformers/pull/39566",
"diff_url": "https://github.com/huggingface/transformers/pull/39566.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39566.patch",
"merged_at": null
} | Fixes #39542
### What does this PR do?
When using a custom seq2seq model together with PEFT/LoRA, both `decoder_input_ids` and `decoder_inputs_embeds` can end up being passed to the underlying decoder. This triggers the internal validation:
ValueError: You cannot specify both decoder_input_ids and decoder_inputs_embeds at the same time
This PR adds a small defensive override in `Seq2SeqTrainer.compute_loss` that drops `decoder_inputs_embeds` when `decoder_input_ids` are also present. This keeps backward compatibility and mirrors the user expectation that `decoder_input_ids` should take precedence.
### Implementation details
```python
if "decoder_input_ids" in inputs and "decoder_inputs_embeds" in inputs:
    inputs.pop("decoder_inputs_embeds")
```
A new test, `tests/trainer_seq2seq_test.py`, builds a small Marian model, simulates the conflicting inputs, and verifies that training progresses without raising the exception.
### Motivation
This pattern appears in real-world usage when composing an encoder from one model and a Marian (or similar) decoder while applying PEFT/LoRA. Making the Trainer resilient avoids forcing each user to patch their own subclass.
### Tests
`pytest tests/trainer_seq2seq_test.py -q` passes locally.
### Additional notes
Happy to adjust the location of the guard (e.g. move it to the base Trainer) if reviewers prefer.
### Who can review?
Tagging trainer maintainers for visibility: @zach-huggingface @SunMarc
| {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39566/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39566/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39565 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39565/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39565/comments | https://api.github.com/repos/huggingface/transformers/issues/39565/events | https://github.com/huggingface/transformers/issues/39565 | 3,250,134,716 | I_kwDOCUB6oc7BuR68 | 39,565 | Model forward execution in full eager mode? | {
"login": "22quinn",
"id": 33176974,
"node_id": "MDQ6VXNlcjMzMTc2OTc0",
"avatar_url": "https://avatars.githubusercontent.com/u/33176974?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/22quinn",
"html_url": "https://github.com/22quinn",
"followers_url": "https://api.github.com/users/22quinn/followers",
"following_url": "https://api.github.com/users/22quinn/following{/other_user}",
"gists_url": "https://api.github.com/users/22quinn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/22quinn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/22quinn/subscriptions",
"organizations_url": "https://api.github.com/users/22quinn/orgs",
"repos_url": "https://api.github.com/users/22quinn/repos",
"events_url": "https://api.github.com/users/22quinn/events{/privacy}",
"received_events_url": "https://api.github.com/users/22quinn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-21T21:49:05 | 2025-08-21T08:34:59 | 2025-08-21T08:34:58 | NONE | null | null | null | null | I know there is a flag `attn_implementation` which can trigger a specialized attention kernel implementation. Besides this, does everything run in native PyTorch eager mode? Does `transformers` have any other custom ops or kernels?
```python
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.1-8B",
    device_map="auto",
    torch_dtype=torch.bfloat16,
    attn_implementation=None,
)
model.forward(input_tokens)
```
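To make the "numerical baseline" idea concrete, a minimal sketch (purely illustrative, not from the original question — the helper name and tolerance are assumptions) of comparing per-token logits from two backends element-wise:

```python
# Hypothetical sketch of a numerical-baseline check: compare logits
# produced by two backends element-wise within a tolerance.
def max_abs_diff(reference, candidate):
    # Both inputs are flat lists of floats of equal length.
    assert len(reference) == len(candidate)
    return max(abs(r - c) for r, c in zip(reference, candidate))

# Illustrative values only; in practice these would be logits from
# transformers eager mode and from the backend under test.
ref = [0.2473, -0.5012, 1.5765]
out = [0.2474, -0.5011, 1.5764]
print(max_abs_diff(ref, out) <= 1e-3)  # True
```

Any tolerance would of course depend on the dtype in use (bf16 runs tolerate far larger deviations than fp32).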
I'm asking this to see if `transformers` can be used as a numerical baseline to verify other inference backends. | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39565/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39565/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39564 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39564/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39564/comments | https://api.github.com/repos/huggingface/transformers/issues/39564/events | https://github.com/huggingface/transformers/pull/39564 | 3,249,960,065 | PR_kwDOCUB6oc6f7PRo | 39,564 | Fix auto_docstring crashing when dependencies are missing | {
"login": "yonigozlan",
"id": 74535834,
"node_id": "MDQ6VXNlcjc0NTM1ODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/74535834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yonigozlan",
"html_url": "https://github.com/yonigozlan",
"followers_url": "https://api.github.com/users/yonigozlan/followers",
"following_url": "https://api.github.com/users/yonigozlan/following{/other_user}",
"gists_url": "https://api.github.com/users/yonigozlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yonigozlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yonigozlan/subscriptions",
"organizations_url": "https://api.github.com/users/yonigozlan/orgs",
"repos_url": "https://api.github.com/users/yonigozlan/repos",
"events_url": "https://api.github.com/users/yonigozlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/yonigozlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-21T20:34:35 | 2025-07-25T17:19:24 | 2025-07-25T17:19:24 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39564",
"html_url": "https://github.com/huggingface/transformers/pull/39564",
"diff_url": "https://github.com/huggingface/transformers/pull/39564.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39564.patch",
"merged_at": "2025-07-25T17:19:24"
} | As the title says: simply use a try/except for now, until I come up with a better solution. | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39564/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39564/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39563 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39563/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39563/comments | https://api.github.com/repos/huggingface/transformers/issues/39563/events | https://github.com/huggingface/transformers/pull/39563 | 3,249,768,781 | PR_kwDOCUB6oc6f6ku- | 39,563 | 🌐 [i18n-KO] Translated `vision-encoder-decoder.md` to Korean | {
"login": "jihyun-0611",
"id": 78160653,
"node_id": "MDQ6VXNlcjc4MTYwNjUz",
"avatar_url": "https://avatars.githubusercontent.com/u/78160653?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jihyun-0611",
"html_url": "https://github.com/jihyun-0611",
"followers_url": "https://api.github.com/users/jihyun-0611/followers",
"following_url": "https://api.github.com/users/jihyun-0611/following{/other_user}",
"gists_url": "https://api.github.com/users/jihyun-0611/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jihyun-0611/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jihyun-0611/subscriptions",
"organizations_url": "https://api.github.com/users/jihyun-0611/orgs",
"repos_url": "https://api.github.com/users/jihyun-0611/repos",
"events_url": "https://api.github.com/users/jihyun-0611/events{/privacy}",
"received_events_url": "https://api.github.com/users/jihyun-0611/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-07-21T19:24:23 | 2025-08-08T19:07:54 | null | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39563",
"html_url": "https://github.com/huggingface/transformers/pull/39563",
"diff_url": "https://github.com/huggingface/transformers/pull/39563.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39563.patch",
"merged_at": null
} | <!-- Please title the PR "🌐 [i18n-KO] Translated `<your_file>.md` to Korean" -->
# What does this PR do?
Translated the `vision-encoder-decoder.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Only reveal the review-request comment below to the KREW team members after all the checks above are complete! -->
May you please review this PR?
@4N3MONE, @yijun-lee, @jungnerd , @harheem
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. Reveal the comment below after the KREW team members' review is complete! -->
@stevhliu May you please review this PR? | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39563/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39563/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39562 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39562/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39562/comments | https://api.github.com/repos/huggingface/transformers/issues/39562/events | https://github.com/huggingface/transformers/pull/39562 | 3,249,597,757 | PR_kwDOCUB6oc6f5_So | 39,562 | Fixes needed for n-d parallelism and TP | {
"login": "winglian",
"id": 381258,
"node_id": "MDQ6VXNlcjM4MTI1OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/winglian",
"html_url": "https://github.com/winglian",
"followers_url": "https://api.github.com/users/winglian/followers",
"following_url": "https://api.github.com/users/winglian/following{/other_user}",
"gists_url": "https://api.github.com/users/winglian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/winglian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/winglian/subscriptions",
"organizations_url": "https://api.github.com/users/winglian/orgs",
"repos_url": "https://api.github.com/users/winglian/repos",
"events_url": "https://api.github.com/users/winglian/events{/privacy}",
"received_events_url": "https://api.github.com/users/winglian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-21T18:15:16 | 2025-07-22T10:25:29 | 2025-07-22T10:25:00 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39562",
"html_url": "https://github.com/huggingface/transformers/pull/39562",
"diff_url": "https://github.com/huggingface/transformers/pull/39562.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39562.patch",
"merged_at": "2025-07-22T10:25:00"
} | # What does this PR do?
related: https://github.com/huggingface/transformers/pull/38949
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
@S1ro1 @SunMarc
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39562/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39562/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39561 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39561/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39561/comments | https://api.github.com/repos/huggingface/transformers/issues/39561/events | https://github.com/huggingface/transformers/pull/39561 | 3,249,577,351 | PR_kwDOCUB6oc6f561D | 39,561 | [`CI`] Fix post merge ernie 4.5 | {
"login": "vasqu",
"id": 73884904,
"node_id": "MDQ6VXNlcjczODg0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/73884904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vasqu",
"html_url": "https://github.com/vasqu",
"followers_url": "https://api.github.com/users/vasqu/followers",
"following_url": "https://api.github.com/users/vasqu/following{/other_user}",
"gists_url": "https://api.github.com/users/vasqu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vasqu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vasqu/subscriptions",
"organizations_url": "https://api.github.com/users/vasqu/orgs",
"repos_url": "https://api.github.com/users/vasqu/repos",
"events_url": "https://api.github.com/users/vasqu/events{/privacy}",
"received_events_url": "https://api.github.com/users/vasqu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-21T18:06:58 | 2025-07-21T18:56:26 | 2025-07-21T18:56:24 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39561",
"html_url": "https://github.com/huggingface/transformers/pull/39561",
"diff_url": "https://github.com/huggingface/transformers/pull/39561.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39561.patch",
"merged_at": "2025-07-21T18:56:24"
} | #39339 was merged right before the Ernie 4.5 models, and I didn't check against main again :/ This fixes the repo consistency based on that change.
cc @molbap @ArthurZucker | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39561/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 1,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39561/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39560 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39560/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39560/comments | https://api.github.com/repos/huggingface/transformers/issues/39560/events | https://github.com/huggingface/transformers/pull/39560 | 3,249,339,351 | PR_kwDOCUB6oc6f5GRf | 39,560 | fix load_model_end = true work when save_steps < eval_steps | {
"login": "ved1beta",
"id": 146507396,
"node_id": "U_kgDOCLuGhA",
"avatar_url": "https://avatars.githubusercontent.com/u/146507396?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ved1beta",
"html_url": "https://github.com/ved1beta",
"followers_url": "https://api.github.com/users/ved1beta/followers",
"following_url": "https://api.github.com/users/ved1beta/following{/other_user}",
"gists_url": "https://api.github.com/users/ved1beta/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ved1beta/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ved1beta/subscriptions",
"organizations_url": "https://api.github.com/users/ved1beta/orgs",
"repos_url": "https://api.github.com/users/ved1beta/repos",
"events_url": "https://api.github.com/users/ved1beta/events{/privacy}",
"received_events_url": "https://api.github.com/users/ved1beta/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-07-21T16:42:48 | 2025-07-28T03:12:34 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39560",
"html_url": "https://github.com/huggingface/transformers/pull/39560",
"diff_url": "https://github.com/huggingface/transformers/pull/39560.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39560.patch",
"merged_at": null
} |
# What does this PR do?
Fixes model checkpoint tracking and rotation in the `Trainer` when `save_steps` and `eval_steps` are misaligned, addressing issue #39476.
## Before submitting
- [ ] Did you write any new necessary tests?
@SunMarc
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39560/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39560/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39559 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39559/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39559/comments | https://api.github.com/repos/huggingface/transformers/issues/39559/events | https://github.com/huggingface/transformers/pull/39559 | 3,249,319,402 | PR_kwDOCUB6oc6f5Byq | 39,559 | 🌐 [i18n-KO] Translated `main_classes/deepspeed.md` to Korean | {
"login": "ChoHyoungSeo",
"id": 47881681,
"node_id": "MDQ6VXNlcjQ3ODgxNjgx",
"avatar_url": "https://avatars.githubusercontent.com/u/47881681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ChoHyoungSeo",
"html_url": "https://github.com/ChoHyoungSeo",
"followers_url": "https://api.github.com/users/ChoHyoungSeo/followers",
"following_url": "https://api.github.com/users/ChoHyoungSeo/following{/other_user}",
"gists_url": "https://api.github.com/users/ChoHyoungSeo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ChoHyoungSeo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ChoHyoungSeo/subscriptions",
"organizations_url": "https://api.github.com/users/ChoHyoungSeo/orgs",
"repos_url": "https://api.github.com/users/ChoHyoungSeo/repos",
"events_url": "https://api.github.com/users/ChoHyoungSeo/events{/privacy}",
"received_events_url": "https://api.github.com/users/ChoHyoungSeo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-07-21T16:37:18 | 2025-08-10T19:11:47 | null | NONE | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39559",
"html_url": "https://github.com/huggingface/transformers/pull/39559",
"diff_url": "https://github.com/huggingface/transformers/pull/39559.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39559.patch",
"merged_at": null
} | # What does this PR do?
Translated the `main_classes/deepspeed.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
May you please review this PR?
<!-- @4N3MONE, @yijun-lee, @jungnerd , @harheem -->
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39559/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39559/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39558 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39558/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39558/comments | https://api.github.com/repos/huggingface/transformers/issues/39558/events | https://github.com/huggingface/transformers/issues/39558 | 3,248,780,402 | I_kwDOCUB6oc7BpHRy | 39,558 | Backwards incompatible change in returned hidden states | {
"login": "BenjaminBossan",
"id": 6229650,
"node_id": "MDQ6VXNlcjYyMjk2NTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/6229650?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BenjaminBossan",
"html_url": "https://github.com/BenjaminBossan",
"followers_url": "https://api.github.com/users/BenjaminBossan/followers",
"following_url": "https://api.github.com/users/BenjaminBossan/following{/other_user}",
"gists_url": "https://api.github.com/users/BenjaminBossan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BenjaminBossan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BenjaminBossan/subscriptions",
"organizations_url": "https://api.github.com/users/BenjaminBossan/orgs",
"repos_url": "https://api.github.com/users/BenjaminBossan/repos",
"events_url": "https://api.github.com/users/BenjaminBossan/events{/privacy}",
"received_events_url": "https://api.github.com/users/BenjaminBossan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-07-21T13:57:27 | 2025-07-25T16:41:22 | 2025-07-25T16:41:22 | MEMBER | null | null | null | null | ### System Info
Independent of system, requires latest transformers from main.
### Who can help?
As discussed internally @ArthurZucker @ydshieh @LysandreJik
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
The model outputs returned by passing `output_hidden_states=True` have changed recently. In particular, the order of the hidden states is affected by that change. This is a potentially severe issue, because on the surface everything looks fine (same number of outputs, same shapes) but the indices are off by 1. As a user, it would be very hard to figure out what went wrong here, since there are so many possible reasons why the output could have changed, and because there was no deprecation period.
To reproduce the issue, run the following script, once with the current main of transformers and once with the last release, v4.53.2:
```python
import torch
import transformers
from transformers import AutoModelForCausalLM
print(f"{transformers.__version__=}")
model_id = "facebook/opt-125m" # OPT works
model_id = "meta-llama/Llama-3.2-1B" # Llama is broken
model = AutoModelForCausalLM.from_pretrained(model_id)
inputs = torch.arange(10).view(1, -1)
with torch.inference_mode():
output = model(inputs, output_hidden_states=True).hidden_states
print(f"Number of returned hidden states: {len(output)}")
for i, hs in enumerate(output):
print(f"hidden state {i:>2} values (slice): {hs[0, 0, :3]}")
```
The results are for main:
```
transformers.__version__='4.54.0.dev0'
Number of returned hidden states: 17
hidden state 0 values (slice): tensor([-0.0010, 0.0339, 0.0018])
hidden state 1 values (slice): tensor([ 0.2473, -0.5012, 1.5765])
hidden state 2 values (slice): tensor([ 0.2444, -0.5238, 1.5824])
hidden state 3 values (slice): tensor([ 0.2463, -0.5339, 1.5775])
hidden state 4 values (slice): tensor([ 0.2797, -0.5208, 1.6037])
hidden state 5 values (slice): tensor([ 0.2424, -0.4786, 1.6114])
hidden state 6 values (slice): tensor([ 0.2291, -0.4361, 1.5910])
hidden state 7 values (slice): tensor([ 0.2474, -0.3626, 1.5842])
hidden state 8 values (slice): tensor([ 0.2487, -0.3277, 1.4880])
hidden state 9 values (slice): tensor([ 0.1552, -0.2746, 1.4607])
hidden state 10 values (slice): tensor([ 0.1235, -0.2442, 1.3794])
hidden state 11 values (slice): tensor([ 0.1107, -0.2346, 1.3300])
hidden state 12 values (slice): tensor([ 0.0789, -0.2590, 1.3076])
hidden state 13 values (slice): tensor([ 0.3292, -0.3711, 1.1052])
hidden state 14 values (slice): tensor([ 0.4628, -0.6839, 1.1164])
hidden state 15 values (slice): tensor([ 0.4699, -0.7124, 4.8454])
hidden state 16 values (slice): tensor([ 0.2919, -0.4089, 1.8382])
```
v4.53.2
```
transformers.__version__='4.53.2'
Number of returned hidden states: 17
hidden state 0 values (slice): tensor([0.0045, 0.0166, 0.0210])
hidden state 1 values (slice): tensor([-0.0010, 0.0339, 0.0018])
hidden state 2 values (slice): tensor([ 0.2473, -0.5012, 1.5765])
hidden state 3 values (slice): tensor([ 0.2444, -0.5238, 1.5824])
hidden state 4 values (slice): tensor([ 0.2463, -0.5339, 1.5775])
hidden state 5 values (slice): tensor([ 0.2797, -0.5208, 1.6037])
hidden state 6 values (slice): tensor([ 0.2424, -0.4786, 1.6114])
hidden state 7 values (slice): tensor([ 0.2291, -0.4361, 1.5910])
hidden state 8 values (slice): tensor([ 0.2474, -0.3626, 1.5842])
hidden state 9 values (slice): tensor([ 0.2487, -0.3277, 1.4880])
hidden state 10 values (slice): tensor([ 0.1552, -0.2746, 1.4607])
hidden state 11 values (slice): tensor([ 0.1235, -0.2442, 1.3794])
hidden state 12 values (slice): tensor([ 0.1107, -0.2346, 1.3300])
hidden state 13 values (slice): tensor([ 0.0789, -0.2590, 1.3076])
hidden state 14 values (slice): tensor([ 0.3292, -0.3711, 1.1052])
hidden state 15 values (slice): tensor([ 0.4628, -0.6839, 1.1164])
hidden state 16 values (slice): tensor([ 0.2919, -0.4089, 1.8382])
```
As can be seen, the results are _mostly_ off by one, i.e. index `i` on main corresponds to index `i+1` on v4.53.2. However, the last hidden state is identical in both, the value at index 0 of v4.53.2 does not appear in main, and the second-to-last value of main does not appear in v4.53.2.
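The shift can be checked mechanically against the two outputs, using only the first value of each printed slice (plain Python, no transformers needed; the value lists below are transcribed from the outputs above):

```python
# First element of each hidden-state slice, transcribed from the outputs above.
main_vals = [-0.0010, 0.2473, 0.2444, 0.2463, 0.2797, 0.2424, 0.2291, 0.2474,
             0.2487, 0.1552, 0.1235, 0.1107, 0.0789, 0.3292, 0.4628, 0.4699,
             0.2919]
old_vals = [0.0045, -0.0010, 0.2473, 0.2444, 0.2463, 0.2797, 0.2424, 0.2291,
            0.2474, 0.2487, 0.1552, 0.1235, 0.1107, 0.0789, 0.3292, 0.4628,
            0.2919]

# States 0..14 on main equal states 1..15 on v4.53.2 (shifted by one), ...
n_shifted = sum(main_vals[i] == old_vals[i + 1] for i in range(15))
print(f"off-by-one matches: {n_shifted}/15")                      # 15/15
# ... the final state agrees in both versions, ...
print(f"last state identical: {main_vals[-1] == old_vals[-1]}")   # True
# ... and one state per version has no counterpart in the other.
print(old_vals[0] in main_vals, main_vals[15] in old_vals)        # False False
```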
The exact PR causing this change is #39120, which can be verified by checking out its commit (`ca7e1a3756c022bf31429c452b2f313f043f32de`) and comparing the results to the previous commit (`e6a8063ef1af16df964b644b07e1d17e96555d23`).
When replacing Llama with OPT, the discrepancy goes away: the hidden states remain the same across transformers versions. This is probably because the embedding is part of the decoder block in OPT, but outside the decoder block in Llama.
### Expected behavior
The returned hidden states should not change between transformers versions. | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39558/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39558/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39557 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39557/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39557/comments | https://api.github.com/repos/huggingface/transformers/issues/39557/events | https://github.com/huggingface/transformers/pull/39557 | 3,248,752,593 | PR_kwDOCUB6oc6f3FFh | 39,557 | 🌐 [i18n-KO] Translated `imageprocessor.md` to Korean | {
"login": "HyunZ118",
"id": 156191095,
"node_id": "U_kgDOCU9Jdw",
"avatar_url": "https://avatars.githubusercontent.com/u/156191095?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HyunZ118",
"html_url": "https://github.com/HyunZ118",
"followers_url": "https://api.github.com/users/HyunZ118/followers",
"following_url": "https://api.github.com/users/HyunZ118/following{/other_user}",
"gists_url": "https://api.github.com/users/HyunZ118/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HyunZ118/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HyunZ118/subscriptions",
"organizations_url": "https://api.github.com/users/HyunZ118/orgs",
"repos_url": "https://api.github.com/users/HyunZ118/repos",
"events_url": "https://api.github.com/users/HyunZ118/events{/privacy}",
"received_events_url": "https://api.github.com/users/HyunZ118/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-21T13:49:36 | 2025-09-15T17:07:16 | 2025-09-15T17:07:16 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39557",
"html_url": "https://github.com/huggingface/transformers/pull/39557",
"diff_url": "https://github.com/huggingface/transformers/pull/39557.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39557.patch",
"merged_at": "2025-09-15T17:07:16"
} | <!-- Please title the PR "🌐 [i18n-KO] Translated `<your_file>.md` to Korean" -->
# What does this PR do?
Translated the `imageprocessor.md` file of the documentation to Korean.
I will rebase and update the `_toctree.yml` file once the update PR is merged.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [ ] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Only reveal the comment below, requesting a review from the KREW team members, after all the checks above are complete! -->
May you please review this PR?
@4N3MONE, @yijun-lee, @jungnerd , @harheem
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. Reveal the comment below only after the KREW team members have finished their reviews! -->
<!-- @stevhliu May you please review this PR? --> | {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39557/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39557/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39556 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39556/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39556/comments | https://api.github.com/repos/huggingface/transformers/issues/39556/events | https://github.com/huggingface/transformers/pull/39556 | 3,248,596,928 | PR_kwDOCUB6oc6f2iz0 | 39,556 | bad_words_ids no longer slow on mps | {
"login": "DWarez",
"id": 10366381,
"node_id": "MDQ6VXNlcjEwMzY2Mzgx",
"avatar_url": "https://avatars.githubusercontent.com/u/10366381?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DWarez",
"html_url": "https://github.com/DWarez",
"followers_url": "https://api.github.com/users/DWarez/followers",
"following_url": "https://api.github.com/users/DWarez/following{/other_user}",
"gists_url": "https://api.github.com/users/DWarez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DWarez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DWarez/subscriptions",
"organizations_url": "https://api.github.com/users/DWarez/orgs",
"repos_url": "https://api.github.com/users/DWarez/repos",
"events_url": "https://api.github.com/users/DWarez/events{/privacy}",
"received_events_url": "https://api.github.com/users/DWarez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-21T13:05:23 | 2025-07-25T17:45:45 | 2025-07-25T17:45:42 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39556",
"html_url": "https://github.com/huggingface/transformers/pull/39556",
"diff_url": "https://github.com/huggingface/transformers/pull/39556.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39556.patch",
"merged_at": "2025-07-25T17:45:42"
} | # What does this PR do?
Using `bad_words_ids` on mps slows down text generation considerably; this PR tries to address that.
Fixes https://github.com/huggingface/transformers/issues/39512
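For context, one common cause of slow generation on mps (an assumption about the root cause here, not necessarily what this PR changes) is per-element host/device synchronization, e.g. calling `.item()` in a loop inside a logits processor instead of doing one bulk transfer. A counting stub makes the difference concrete:

```python
class StubTensor:
    """Stand-in for a device tensor that counts host<->device syncs."""
    syncs = 0

    def __init__(self, data):
        self.data = data

    def item_at(self, i):
        # stands in for tensor[i].item(): one sync per call
        StubTensor.syncs += 1
        return self.data[i]

    def to_list(self):
        # stands in for tensor.tolist(): a single bulk sync
        StubTensor.syncs += 1
        return list(self.data)


seq = StubTensor(list(range(100)))

StubTensor.syncs = 0
per_element = [seq.item_at(i) for i in range(100)]   # the slow pattern
print(f"per-element reads: {StubTensor.syncs} syncs")  # 100

StubTensor.syncs = 0
bulk = seq.to_list()                                 # the batched pattern
print(f"one bulk read: {StubTensor.syncs} sync")       # 1

assert per_element == bulk  # same result, two orders of magnitude fewer syncs
```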
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
@ArthurZucker
| {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39556/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39556/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39555 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39555/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39555/comments | https://api.github.com/repos/huggingface/transformers/issues/39555/events | https://github.com/huggingface/transformers/pull/39555 | 3,248,574,082 | PR_kwDOCUB6oc6f2d3u | 39,555 | [WIP] try to relax the tie_weights method | {
"login": "molbap",
"id": 39954772,
"node_id": "MDQ6VXNlcjM5OTU0Nzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/molbap",
"html_url": "https://github.com/molbap",
"followers_url": "https://api.github.com/users/molbap/followers",
"following_url": "https://api.github.com/users/molbap/following{/other_user}",
"gists_url": "https://api.github.com/users/molbap/gists{/gist_id}",
"starred_url": "https://api.github.com/users/molbap/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/molbap/subscriptions",
"organizations_url": "https://api.github.com/users/molbap/orgs",
"repos_url": "https://api.github.com/users/molbap/repos",
"events_url": "https://api.github.com/users/molbap/events{/privacy}",
"received_events_url": "https://api.github.com/users/molbap/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-07-21T12:58:15 | 2025-07-21T13:11:12 | null | CONTRIBUTOR | null | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39555",
"html_url": "https://github.com/huggingface/transformers/pull/39555",
"diff_url": "https://github.com/huggingface/transformers/pull/39555.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39555.patch",
"merged_at": null
} | # What does this PR do?
As per the title: trials to rework parts of the inits and of what is done at weight tying. Might be superseded by #39339
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39555/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39555/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39554 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39554/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39554/comments | https://api.github.com/repos/huggingface/transformers/issues/39554/events | https://github.com/huggingface/transformers/issues/39554 | 3,248,397,239 | I_kwDOCUB6oc7Bnpu3 | 39,554 | Why `is_causal` is not used in `flash_attention_forward` ? | {
"login": "lucaswychan",
"id": 109060491,
"node_id": "U_kgDOBoAhiw",
"avatar_url": "https://avatars.githubusercontent.com/u/109060491?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lucaswychan",
"html_url": "https://github.com/lucaswychan",
"followers_url": "https://api.github.com/users/lucaswychan/followers",
"following_url": "https://api.github.com/users/lucaswychan/following{/other_user}",
"gists_url": "https://api.github.com/users/lucaswychan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lucaswychan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lucaswychan/subscriptions",
"organizations_url": "https://api.github.com/users/lucaswychan/orgs",
"repos_url": "https://api.github.com/users/lucaswychan/repos",
"events_url": "https://api.github.com/users/lucaswychan/events{/privacy}",
"received_events_url": "https://api.github.com/users/lucaswychan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 6202871275,
"node_id": "LA_kwDOCUB6oc8AAAABcbhN6w",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Flash%20Attention",
"name": "Flash Attention",
"color": "201FF8",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-07-21T12:08:00 | 2025-08-18T12:50:34 | 2025-08-18T12:50:34 | CONTRIBUTOR | null | null | null | null | I want to perform bidirectional attention in the Qwen3 model to train an embedding model, so I passed `is_causal=False` to the model's `forward` (I manually added an `is_causal` argument to all `forward` methods, such as those of `Qwen3Model` and `Qwen3Attention` in `modeling_qwen3.py`):
```python
class Qwen3Attention(nn.Module):
"""Multi-headed attention from 'Attention Is All You Need' paper"""
...
def forward(
self,
hidden_states: torch.Tensor,
position_embeddings: tuple[torch.Tensor, torch.Tensor],
attention_mask: Optional[torch.Tensor],
past_key_value: Optional[Cache] = None,
cache_position: Optional[torch.LongTensor] = None,
is_causal: Optional[bool] = True, # I add is_causal here
**kwargs: Unpack[FlashAttentionKwargs],
) -> tuple[torch.Tensor, Optional[torch.Tensor], Optional[tuple[torch.Tensor]]]:
...
attn_output, attn_weights = attention_interface(
self,
query_states,
key_states,
value_states,
attention_mask,
dropout=0.0 if not self.training else self.attention_dropout,
scaling=self.scaling,
sliding_window=self.sliding_window, # diff with Llama
is_causal=is_causal, # and is_causal from the argument is passed to the attention_interface (e.g. `flash_attention_2`, `sdpa_attention_forward`)
**kwargs,
)
```
With this change I can successfully control the causality of the attention in `sdpa_attention_forward`. However, I realized it has no effect on the attention in `flash_attention_forward`. After diving into the implementation, I found the reason in `flash_attention_forward`, located at `transformers/integrations/flash_attention.py`:
```python
def flash_attention_forward(
module: torch.nn.Module,
query: torch.Tensor,
key: torch.Tensor,
value: torch.Tensor,
attention_mask: Optional[torch.Tensor],
dropout: float = 0.0,
scaling: Optional[float] = None,
sliding_window: Optional[int] = None,
softcap: Optional[float] = None,
**kwargs,
) -> tuple[torch.Tensor, None]:
...
# FA2 always relies on the value set in the module, so remove it if present in kwargs to avoid passing it twice
kwargs.pop("is_causal", None)
attn_output = _flash_attention_forward(
query,
key,
value,
attention_mask,
query_length=seq_len,
is_causal=module.is_causal, # here module is `Qwen3Attention`
dropout=dropout,
softmax_scale=scaling,
sliding_window=sliding_window,
softcap=softcap,
use_top_left_mask=_use_top_left_mask,
target_dtype=target_dtype,
attn_implementation=module.config._attn_implementation,
**kwargs,
)
```
As you can see, the `is_causal` argument is popped, and the `is_causal` of `Qwen3Attention` is used as the argument. Note that `Qwen3Attention.is_causal` is never changed, and its default value is `True`, so the `is_causal` argument passed into `_flash_attention_forward` will always be `True` regardless of any change.
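The discard-and-override behavior boils down to a few lines (a simplified stand-in for illustration, not the real `Qwen3Attention` or `flash_attention_forward`):

```python
class FakeAttention:
    """Simplified stand-in for Qwen3Attention."""
    is_causal = True  # never updated by the forward pass

def flash_forward(module, **kwargs):
    """Simplified stand-in for flash_attention_forward."""
    kwargs.pop("is_causal", None)  # the caller's value is discarded here
    return module.is_causal        # the module attribute always wins

attn = FakeAttention()
print(flash_forward(attn, is_causal=False))  # True: the override had no effect

# The workaround: mutate the module attribute before dispatching.
attn.is_causal = False
print(flash_forward(attn, is_causal=False))  # False
```

This is why setting the module attribute works while passing the keyword argument does not.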
After adding a line that updates `Qwen3Attention.is_causal` (i.e. `self.is_causal = is_causal`) before the arguments are passed into `attention_interface`, I can change the causality of `flash_attention_forward`. So I would like to know whether this is a feature or a bug. Thank you!! | {
"login": "lucaswychan",
"id": 109060491,
"node_id": "U_kgDOBoAhiw",
"avatar_url": "https://avatars.githubusercontent.com/u/109060491?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lucaswychan",
"html_url": "https://github.com/lucaswychan",
"followers_url": "https://api.github.com/users/lucaswychan/followers",
"following_url": "https://api.github.com/users/lucaswychan/following{/other_user}",
"gists_url": "https://api.github.com/users/lucaswychan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lucaswychan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lucaswychan/subscriptions",
"organizations_url": "https://api.github.com/users/lucaswychan/orgs",
"repos_url": "https://api.github.com/users/lucaswychan/repos",
"events_url": "https://api.github.com/users/lucaswychan/events{/privacy}",
"received_events_url": "https://api.github.com/users/lucaswychan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39554/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39554/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39553 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39553/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39553/comments | https://api.github.com/repos/huggingface/transformers/issues/39553/events | https://github.com/huggingface/transformers/pull/39553 | 3,248,376,588 | PR_kwDOCUB6oc6f1yGL | 39,553 | Fix Qwen Omni integration test | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-21T12:01:57 | 2025-07-21T12:14:37 | 2025-07-21T12:11:47 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39553",
"html_url": "https://github.com/huggingface/transformers/pull/39553",
"diff_url": "https://github.com/huggingface/transformers/pull/39553.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39553.patch",
"merged_at": "2025-07-21T12:11:47"
} | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this with a description of the change and which issue is fixed (if applicable). Please also include relevant motivation and context. List any dependencies (if any) that are required for this change.
Once you're done, someone will review your PR shortly (see the section "Who can review?" below to tag some potential reviewers). They may suggest changes to make the code even better. If no one reviewed your PR after a week has passed, don't hesitate to post a new comment @-mentioning the same persons---sometimes notifications get lost.
-->
<!-- Remove if not applicable -->
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39553/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39553/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39552 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39552/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39552/comments | https://api.github.com/repos/huggingface/transformers/issues/39552/events | https://github.com/huggingface/transformers/pull/39552 | 3,248,324,544 | PR_kwDOCUB6oc6f1mm1 | 39,552 | 🌐 [i18n-KO] Translated `perf_train_gpu_one.md` to Korean | {
"login": "D15M4S",
"id": 122260287,
"node_id": "U_kgDOB0mLPw",
"avatar_url": "https://avatars.githubusercontent.com/u/122260287?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/D15M4S",
"html_url": "https://github.com/D15M4S",
"followers_url": "https://api.github.com/users/D15M4S/followers",
"following_url": "https://api.github.com/users/D15M4S/following{/other_user}",
"gists_url": "https://api.github.com/users/D15M4S/gists{/gist_id}",
"starred_url": "https://api.github.com/users/D15M4S/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/D15M4S/subscriptions",
"organizations_url": "https://api.github.com/users/D15M4S/orgs",
"repos_url": "https://api.github.com/users/D15M4S/repos",
"events_url": "https://api.github.com/users/D15M4S/events{/privacy}",
"received_events_url": "https://api.github.com/users/D15M4S/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-21T11:43:49 | 2025-07-29T15:08:57 | 2025-07-29T15:08:57 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39552",
"html_url": "https://github.com/huggingface/transformers/pull/39552",
"diff_url": "https://github.com/huggingface/transformers/pull/39552.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39552.patch",
"merged_at": "2025-07-29T15:08:57"
} | <!-- Please title the PR "🌐 [i18n-KO] Translated `<your_file>.md` to Korean" -->
# What does this PR do?
Translated the `perf_train_gpu_one.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
<!-- 1. Please reveal the comment below requesting a review from the KREW team members only after all the checks above are complete! -->
May you please review this PR?
@4N3MONE, @yijun-lee, @jungnerd , @harheem
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. Please reveal the comment below after the KREW team members' review is complete! -->
@stevhliu May you please review this PR?
| {
"login": "stevhliu",
"id": 59462357,
"node_id": "MDQ6VXNlcjU5NDYyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/59462357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stevhliu",
"html_url": "https://github.com/stevhliu",
"followers_url": "https://api.github.com/users/stevhliu/followers",
"following_url": "https://api.github.com/users/stevhliu/following{/other_user}",
"gists_url": "https://api.github.com/users/stevhliu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stevhliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stevhliu/subscriptions",
"organizations_url": "https://api.github.com/users/stevhliu/orgs",
"repos_url": "https://api.github.com/users/stevhliu/repos",
"events_url": "https://api.github.com/users/stevhliu/events{/privacy}",
"received_events_url": "https://api.github.com/users/stevhliu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39552/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39552/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39550 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39550/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39550/comments | https://api.github.com/repos/huggingface/transformers/issues/39550/events | https://github.com/huggingface/transformers/pull/39550 | 3,248,157,603 | PR_kwDOCUB6oc6f1BFk | 39,550 | [docs] Create page on inference servers with transformers backend | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-21T10:56:42 | 2025-07-22T13:31:11 | 2025-07-22T13:31:10 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39550",
"html_url": "https://github.com/huggingface/transformers/pull/39550",
"diff_url": "https://github.com/huggingface/transformers/pull/39550.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39550.patch",
"merged_at": "2025-07-22T13:31:10"
} | # What does this PR do?
As per the title, I added basic info about the existing inference engines, so feel free to add more examples, tips, etc. This PR creates a space where we can host docs on all third-party servers, and we can submit PRs in vLLM/SGLang/TGI pointing to this page. | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39550/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39550/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39549 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39549/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39549/comments | https://api.github.com/repos/huggingface/transformers/issues/39549/events | https://github.com/huggingface/transformers/issues/39549 | 3,247,999,183 | I_kwDOCUB6oc7BmIjP | 39,549 | Is there plan to integrate ColQwen2.5 into Transformers? | {
"login": "rebel-thkim",
"id": 157466331,
"node_id": "U_kgDOCWK-2w",
"avatar_url": "https://avatars.githubusercontent.com/u/157466331?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rebel-thkim",
"html_url": "https://github.com/rebel-thkim",
"followers_url": "https://api.github.com/users/rebel-thkim/followers",
"following_url": "https://api.github.com/users/rebel-thkim/following{/other_user}",
"gists_url": "https://api.github.com/users/rebel-thkim/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rebel-thkim/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rebel-thkim/subscriptions",
"organizations_url": "https://api.github.com/users/rebel-thkim/orgs",
"repos_url": "https://api.github.com/users/rebel-thkim/repos",
"events_url": "https://api.github.com/users/rebel-thkim/events{/privacy}",
"received_events_url": "https://api.github.com/users/rebel-thkim/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] | open | false | null | [] | null | [] | 2025-07-21T10:08:47 | 2025-07-21T10:08:47 | null | NONE | null | null | null | null | ### Model description
Is ColQwen2ForRetrieval integrated into the transformers library, and are there plans to add [ColQwen2.5](https://github.com/illuin-tech/colpali/blob/main/colpali_engine/models/qwen2_5/colqwen2_5/modeling_colqwen2_5.py) in the future?
### Open source status
- [x] The model implementation is available
- [x] The model weights are available
### Provide useful links for the implementation
https://github.com/illuin-tech/colpali/blob/main/colpali_engine/models/qwen2_5/colqwen2_5/modeling_colqwen2_5.py
https://github.com/huggingface/transformers/pull/38391 | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39549/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 2
} | https://api.github.com/repos/huggingface/transformers/issues/39549/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |
https://api.github.com/repos/huggingface/transformers/issues/39548 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39548/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39548/comments | https://api.github.com/repos/huggingface/transformers/issues/39548/events | https://github.com/huggingface/transformers/pull/39548 | 3,247,815,690 | PR_kwDOCUB6oc6fz3Zo | 39,548 | Fix the check in flex test | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-21T09:12:33 | 2025-07-21T11:58:30 | 2025-07-21T11:29:44 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39548",
"html_url": "https://github.com/huggingface/transformers/pull/39548",
"diff_url": "https://github.com/huggingface/transformers/pull/39548.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39548.patch",
"merged_at": "2025-07-21T11:29:44"
} | # What does this PR do?
As per the title. Otherwise it passes for some models where it should not, and it complains later. | {
"login": "Cyrilvallez",
"id": 71554963,
"node_id": "MDQ6VXNlcjcxNTU0OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/71554963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Cyrilvallez",
"html_url": "https://github.com/Cyrilvallez",
"followers_url": "https://api.github.com/users/Cyrilvallez/followers",
"following_url": "https://api.github.com/users/Cyrilvallez/following{/other_user}",
"gists_url": "https://api.github.com/users/Cyrilvallez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Cyrilvallez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Cyrilvallez/subscriptions",
"organizations_url": "https://api.github.com/users/Cyrilvallez/orgs",
"repos_url": "https://api.github.com/users/Cyrilvallez/repos",
"events_url": "https://api.github.com/users/Cyrilvallez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Cyrilvallez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39548/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39548/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39547 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39547/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39547/comments | https://api.github.com/repos/huggingface/transformers/issues/39547/events | https://github.com/huggingface/transformers/pull/39547 | 3,247,804,756 | PR_kwDOCUB6oc6fz0_M | 39,547 | [docs] update attention implementation and cache docs | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-21T09:09:22 | 2025-07-22T13:06:43 | 2025-07-22T13:06:43 | MEMBER | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39547",
"html_url": "https://github.com/huggingface/transformers/pull/39547",
"diff_url": "https://github.com/huggingface/transformers/pull/39547.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39547.patch",
"merged_at": "2025-07-22T13:06:43"
} | # What does this PR do?
As per the title; we will delete the `Legacy cache format` section in 2-3 releases, once all legacy support is removed.
This PR:
- Adds docs on multimodal attention implementation setting
- Cross-references the attn implementation docs in other pages. I didn't know that docs page existed, and searching for FA2/SDPA usually doesn't return it. Let's make it more discoverable
- Adds a section on what cache position is; as reported by many users, the concept of `cache position` is still confusing. We can add more examples later; I remember @gante had a PR on fixing generate when cache position is provided by users :) | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/users/zucchini-nlp/followers",
"following_url": "https://api.github.com/users/zucchini-nlp/following{/other_user}",
"gists_url": "https://api.github.com/users/zucchini-nlp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zucchini-nlp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zucchini-nlp/subscriptions",
"organizations_url": "https://api.github.com/users/zucchini-nlp/orgs",
"repos_url": "https://api.github.com/users/zucchini-nlp/repos",
"events_url": "https://api.github.com/users/zucchini-nlp/events{/privacy}",
"received_events_url": "https://api.github.com/users/zucchini-nlp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39547/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39547/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39546 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39546/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39546/comments | https://api.github.com/repos/huggingface/transformers/issues/39546/events | https://github.com/huggingface/transformers/pull/39546 | 3,247,078,166 | PR_kwDOCUB6oc6fxVEI | 39,546 | Fix Docstring of BarkProcessor | {
"login": "st81",
"id": 58893365,
"node_id": "MDQ6VXNlcjU4ODkzMzY1",
"avatar_url": "https://avatars.githubusercontent.com/u/58893365?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/st81",
"html_url": "https://github.com/st81",
"followers_url": "https://api.github.com/users/st81/followers",
"following_url": "https://api.github.com/users/st81/following{/other_user}",
"gists_url": "https://api.github.com/users/st81/gists{/gist_id}",
"starred_url": "https://api.github.com/users/st81/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/st81/subscriptions",
"organizations_url": "https://api.github.com/users/st81/orgs",
"repos_url": "https://api.github.com/users/st81/repos",
"events_url": "https://api.github.com/users/st81/events{/privacy}",
"received_events_url": "https://api.github.com/users/st81/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-21T03:43:22 | 2025-07-21T12:57:04 | 2025-07-21T12:56:44 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39546",
"html_url": "https://github.com/huggingface/transformers/pull/39546",
"diff_url": "https://github.com/huggingface/transformers/pull/39546.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39546.patch",
"merged_at": "2025-07-21T12:56:44"
} | # What does this PR do?
This PR corrects the return type hint in the docstring of `BarkProcessor.__call__`. The current docstring incorrectly stated that the method returns a `Tuple(BatchEncoding, BatchFeature)`, but it actually returns a `BatchEncoding` object.
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
- speech models: @eustlb
- Documentation: @stevhliu | {
"login": "Rocketknight1",
"id": 12866554,
"node_id": "MDQ6VXNlcjEyODY2NTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/12866554?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rocketknight1",
"html_url": "https://github.com/Rocketknight1",
"followers_url": "https://api.github.com/users/Rocketknight1/followers",
"following_url": "https://api.github.com/users/Rocketknight1/following{/other_user}",
"gists_url": "https://api.github.com/users/Rocketknight1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rocketknight1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rocketknight1/subscriptions",
"organizations_url": "https://api.github.com/users/Rocketknight1/orgs",
"repos_url": "https://api.github.com/users/Rocketknight1/repos",
"events_url": "https://api.github.com/users/Rocketknight1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rocketknight1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39546/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39546/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39545 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39545/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39545/comments | https://api.github.com/repos/huggingface/transformers/issues/39545/events | https://github.com/huggingface/transformers/issues/39545 | 3,247,007,816 | I_kwDOCUB6oc7BiWhI | 39,545 | Is the new Intel–Weizmann speculative decoding algorithm integrated into Transformers? | {
"login": "NEWbie0709",
"id": 81673708,
"node_id": "MDQ6VXNlcjgxNjczNzA4",
"avatar_url": "https://avatars.githubusercontent.com/u/81673708?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NEWbie0709",
"html_url": "https://github.com/NEWbie0709",
"followers_url": "https://api.github.com/users/NEWbie0709/followers",
"following_url": "https://api.github.com/users/NEWbie0709/following{/other_user}",
"gists_url": "https://api.github.com/users/NEWbie0709/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NEWbie0709/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NEWbie0709/subscriptions",
"organizations_url": "https://api.github.com/users/NEWbie0709/orgs",
"repos_url": "https://api.github.com/users/NEWbie0709/repos",
"events_url": "https://api.github.com/users/NEWbie0709/events{/privacy}",
"received_events_url": "https://api.github.com/users/NEWbie0709/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-21T02:47:48 | 2025-07-22T12:15:54 | 2025-07-21T12:47:08 | NONE | null | null | null | null | Hi,
I recently read about a new speculative decoding algorithm developed by Intel Labs and the Weizmann Institute, which reportedly improves inference speed by up to 2.8×, even when using draft and target models with different vocabularies or architectures.
References:
- [Intel Newsroom](https://newsroom.intel.com/artificial-intelligence/intel-weizmann-institute-speed-ai-with-speculative-decoding-advance?utm_source=chatgpt.com)
- [CTech Article](https://www.calcalistech.com/ctechnews/article/h1z7pydlex)
Several sources (including Intel press releases and third-party writeups) claim that this algorithm has already been integrated into the Hugging Face Transformers library.
However, I haven’t found any reference to this new version in the official Transformers documentation
My Questions:
1. Has this Intel–Weizmann speculative decoding algorithm actually been integrated into transformers?
2. If so, where can I find documentation or usage examples for how to enable it?
Thanks in advance for your help! This looks like a powerful advancement, and I'd love to test it. | {
"login": "NEWbie0709",
"id": 81673708,
"node_id": "MDQ6VXNlcjgxNjczNzA4",
"avatar_url": "https://avatars.githubusercontent.com/u/81673708?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NEWbie0709",
"html_url": "https://github.com/NEWbie0709",
"followers_url": "https://api.github.com/users/NEWbie0709/followers",
"following_url": "https://api.github.com/users/NEWbie0709/following{/other_user}",
"gists_url": "https://api.github.com/users/NEWbie0709/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NEWbie0709/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NEWbie0709/subscriptions",
"organizations_url": "https://api.github.com/users/NEWbie0709/orgs",
"repos_url": "https://api.github.com/users/NEWbie0709/repos",
"events_url": "https://api.github.com/users/NEWbie0709/events{/privacy}",
"received_events_url": "https://api.github.com/users/NEWbie0709/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39545/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39545/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39544 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39544/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39544/comments | https://api.github.com/repos/huggingface/transformers/issues/39544/events | https://github.com/huggingface/transformers/pull/39544 | 3,246,976,508 | PR_kwDOCUB6oc6fw_N- | 39,544 | 🌐 [i18n-KO] Translated feature_extractors.md to Korean | {
"login": "ssum21",
"id": 116950962,
"node_id": "U_kgDOBviHsg",
"avatar_url": "https://avatars.githubusercontent.com/u/116950962?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ssum21",
"html_url": "https://github.com/ssum21",
"followers_url": "https://api.github.com/users/ssum21/followers",
"following_url": "https://api.github.com/users/ssum21/following{/other_user}",
"gists_url": "https://api.github.com/users/ssum21/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ssum21/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ssum21/subscriptions",
"organizations_url": "https://api.github.com/users/ssum21/orgs",
"repos_url": "https://api.github.com/users/ssum21/repos",
"events_url": "https://api.github.com/users/ssum21/events{/privacy}",
"received_events_url": "https://api.github.com/users/ssum21/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | null | [] | null | [] | 2025-07-21T02:21:51 | 2025-10-16T18:23:24 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39544",
"html_url": "https://github.com/huggingface/transformers/pull/39544",
"diff_url": "https://github.com/huggingface/transformers/pull/39544.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39544.patch",
"merged_at": null
} | # What does this PR do?
Translated the `feature_extractors.md` file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before reviewing
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Check (맞춤법 검사)
- [x] Review or Add new terms to glossary (용어 확인 및 추가)
- [x] Check Inline TOC (e.g. `[[lowercased-header]]`)
- [x] Check live-preview for gotchas (live-preview로 정상작동 확인)
## Who can review? (Initial)
May you please review this PR?
@4N3MONE @jungnerd @harheem @yijun-lee
## Before submitting
- [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#start-contributing-pull-requests),
Pull Request section?
- [x] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [x] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [x] Did you write any new necessary tests?
## Who can review? (Final)
<!-- 2. Please reveal the comment below after the KREW team members' review is complete! -->
@stevhliu May you please review this PR? | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39544/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39544/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39543 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39543/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39543/comments | https://api.github.com/repos/huggingface/transformers/issues/39543/events | https://github.com/huggingface/transformers/pull/39543 | 3,246,883,343 | PR_kwDOCUB6oc6fwrDk | 39,543 | added smollama base model - 1B parameter | {
"login": "dustinwloring1988",
"id": 21135165,
"node_id": "MDQ6VXNlcjIxMTM1MTY1",
"avatar_url": "https://avatars.githubusercontent.com/u/21135165?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dustinwloring1988",
"html_url": "https://github.com/dustinwloring1988",
"followers_url": "https://api.github.com/users/dustinwloring1988/followers",
"following_url": "https://api.github.com/users/dustinwloring1988/following{/other_user}",
"gists_url": "https://api.github.com/users/dustinwloring1988/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dustinwloring1988/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dustinwloring1988/subscriptions",
"organizations_url": "https://api.github.com/users/dustinwloring1988/orgs",
"repos_url": "https://api.github.com/users/dustinwloring1988/repos",
"events_url": "https://api.github.com/users/dustinwloring1988/events{/privacy}",
"received_events_url": "https://api.github.com/users/dustinwloring1988/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] | closed | false | null | [] | null | [] | 2025-07-21T01:08:30 | 2025-07-29T14:34:32 | 2025-07-29T14:34:32 | NONE | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39543",
"html_url": "https://github.com/huggingface/transformers/pull/39543",
"diff_url": "https://github.com/huggingface/transformers/pull/39543.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39543.patch",
"merged_at": null
} | New base model added @ArthurZucker
smollama
- 1B parameter model
- Modified Llama architecture to use NoPE
- Tokenizer from SmolLM3
| {
"login": "dustinwloring1988",
"id": 21135165,
"node_id": "MDQ6VXNlcjIxMTM1MTY1",
"avatar_url": "https://avatars.githubusercontent.com/u/21135165?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dustinwloring1988",
"html_url": "https://github.com/dustinwloring1988",
"followers_url": "https://api.github.com/users/dustinwloring1988/followers",
"following_url": "https://api.github.com/users/dustinwloring1988/following{/other_user}",
"gists_url": "https://api.github.com/users/dustinwloring1988/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dustinwloring1988/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dustinwloring1988/subscriptions",
"organizations_url": "https://api.github.com/users/dustinwloring1988/orgs",
"repos_url": "https://api.github.com/users/dustinwloring1988/repos",
"events_url": "https://api.github.com/users/dustinwloring1988/events{/privacy}",
"received_events_url": "https://api.github.com/users/dustinwloring1988/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39543/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39543/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39542 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39542/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39542/comments | https://api.github.com/repos/huggingface/transformers/issues/39542/events | https://github.com/huggingface/transformers/issues/39542 | 3,246,881,452 | I_kwDOCUB6oc7Bh3qs | 39,542 | ValueError: You cannot specify both decoder_input_ids and decoder_inputs_embeds at the same time | {
"login": "xjackzenvey",
"id": 124496973,
"node_id": "U_kgDOB2usTQ",
"avatar_url": "https://avatars.githubusercontent.com/u/124496973?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xjackzenvey",
"html_url": "https://github.com/xjackzenvey",
"followers_url": "https://api.github.com/users/xjackzenvey/followers",
"following_url": "https://api.github.com/users/xjackzenvey/following{/other_user}",
"gists_url": "https://api.github.com/users/xjackzenvey/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xjackzenvey/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xjackzenvey/subscriptions",
"organizations_url": "https://api.github.com/users/xjackzenvey/orgs",
"repos_url": "https://api.github.com/users/xjackzenvey/repos",
"events_url": "https://api.github.com/users/xjackzenvey/events{/privacy}",
"received_events_url": "https://api.github.com/users/xjackzenvey/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 1834081910,
"node_id": "MDU6TGFiZWwxODM0MDgxOTEw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Usage",
"name": "Usage",
"color": "e28436",
"default": false,
"description": "General questions about the library"
},
{
"id": 1990918270,
"node_id": "MDU6TGFiZWwxOTkwOTE4Mjcw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20First%20Issue",
"name": "Good First Issue",
"color": "bbf794",
"default": false,
"description": ""
},
{
"id": 2155169140,
"node_id": "MDU6TGFiZWwyMTU1MTY5MTQw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/trainer",
"name": "trainer",
"color": "2ef289",
"default": false,
"description": ""
},
{
"id": 3817266200,
"node_id": "MDU6TGFiZWwzODE3MjY2MjAw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": null
}
] | closed | false | null | [] | null | [] | 2025-07-21T01:06:27 | 2025-08-22T05:53:51 | 2025-08-21T13:13:16 | NONE | null | null | null | null | ### System Info
- `transformers` version: 4.53.2
- Platform: **Ubuntu 22.04** Linux 5.15.0-139-generic
- **Python 3.10.18** + ipykernel 6.29.5
- Pytorch 2.7.1+cu118
### Who can help?
@ArthurZucker
@SunMarc
### Information
- [ ] The official example scripts
- [x] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [x] My own task or dataset (give details below)
### Reproduction
I want to build a new MT model with a **BERT-based encoder** and a **decoder from opus-mt-en-zh** (loaded as `MarianMTModel`), but when I execute `Trainer.train()`, it reports `ValueError: You cannot specify both decoder_input_ids and decoder_inputs_embeds at the same time`. Below is the code for my model and trainer.
Thanks for helping!
```Python
# ManchuBERT Encoder + Opus-MT-zh Decoder
import torch
from torch import nn
from transformers.modeling_outputs import Seq2SeqLMOutput
def get_extended_attention_mask(attention_mask, input_shape, device, dtype=torch.float32):
"""
attention_mask: [B, seq_len]
return: [B, 1, 1, seq_len]
"""
mask = attention_mask[:, None, None, :] # [B, 1, 1, seq_len]
mask = mask.to(dtype=dtype)
mask = (1.0 - mask) * -10000.0
return mask
class ManchuZhMT(nn.Module):
def __init__(self, bert, marian):
super().__init__()
self.decoder_embeddings = marian.model.decoder.embed_tokens
self.embeddings = bert.embeddings
self.encoder = bert.encoder
self.decoder = marian.model.decoder
self.lm_head = marian.lm_head
self.final_logits_bias = marian.final_logits_bias
self.config = marian.config
def forward(self,
input_ids=None,
attention_mask=None,
decoder_input_ids=None,
decoder_attention_mask=None,
labels=None,
**kwargs):
hidden_states = self.embeddings(input_ids=input_ids)
attention_mask = attention_mask.to(dtype=torch.float32)
extended_mask = get_extended_attention_mask(attention_mask, input_ids.shape, input_ids.device)
enc_out = self.encoder(hidden_states=hidden_states,
attention_mask=extended_mask,
return_dict=True)
dec_out = self.decoder(
input_ids=decoder_input_ids,
attention_mask=decoder_attention_mask,
encoder_hidden_states=enc_out.last_hidden_state,
encoder_attention_mask=extended_mask,
return_dict=True)
logits = self.lm_head(dec_out.last_hidden_state) + self.final_logits_bias
loss = None
if labels is not None:
loss_fct = nn.CrossEntropyLoss(ignore_index=-100)
loss = loss_fct(logits.view(-1, logits.size(-1)), labels.view(-1))
return Seq2SeqLMOutput(loss=loss, logits=logits)
def prepare_inputs_for_generation(self, *args, **kwargs):
return self.decoder.prepare_inputs_for_generation(*args, **kwargs)
def _prepare_encoder_decoder_kwargs_for_generation(self, *args, **kwargs):
return self.decoder._prepare_encoder_decoder_kwargs_for_generation(*args, **kwargs)
model = ManchuZhMT(manchu_model, chn_model)
print(model)
# freeze Decoder + LM Head
for p in model.decoder.parameters():
p.requires_grad = False
for p in model.lm_head.parameters():
p.requires_grad = False
```
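As a quick sanity check of the additive-mask trick in `get_extended_attention_mask`, here is a dependency-free sketch of the same transform (plain Python, no torch; the helper name is illustrative): positions with mask value 1 contribute a bias of 0.0 to the attention scores, while positions with 0 contribute a large negative bias that zeroes them out after softmax.

```python
def extend_attention_mask(attention_mask):
    """attention_mask: [batch][seq_len] of 0/1 -> additive bias values.

    Mirrors the (1.0 - mask) * -10000.0 step in the torch helper above:
    allowed positions contribute ~0, blocked positions a large negative bias.
    """
    return [[(1.0 - float(m)) * -10000.0 for m in row] for row in attention_mask]

bias = extend_attention_mask([[1, 1, 0]])
```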
```Python
# Add LoRA for Encoder
from peft import LoraConfig, get_peft_model, TaskType
num_layers = len(model.encoder.layer)
target_modules = []
for i in range(num_layers):
target_modules.extend([
f"encoder.layer.{i}.attention.self.query",
f"encoder.layer.{i}.attention.self.key",
f"encoder.layer.{i}.attention.self.value",
f"encoder.layer.{i}.attention.output.dense",
f"encoder.layer.{i}.intermediate.dense",
f"encoder.layer.{i}.output.dense",
])
lora_config = LoraConfig(
task_type=TaskType.SEQ_2_SEQ_LM,
target_modules=target_modules,
r=16,
lora_alpha=32,
lora_dropout=0.05,
bias="none",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```
```Python
# Start Train!
from transformers import Seq2SeqTrainer, Seq2SeqTrainingArguments
args = Seq2SeqTrainingArguments(
output_dir="./lora_with_bert",
per_device_train_batch_size=batch_size,
per_device_eval_batch_size=batch_size,
num_train_epochs=10,
learning_rate=3e-4,
fp16=True,
save_strategy="epoch",
predict_with_generate=True,
logging_steps=100,
report_to="none",
)
trainer = Seq2SeqTrainer(
model=model,
args=args,
train_dataset=tokenized_ds["train"],
eval_dataset=tokenized_ds["val"],
tokenizer=manchu_tok,
)
trainer.train()
trainer.save_model("./lora_with_bert/final")
```
### Expected behavior
I expected the script to train normally, just as when using `opus-mt-en-zh` alone, and to produce the LoRA checkpoint. | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.com/users/ArthurZucker/followers",
"following_url": "https://api.github.com/users/ArthurZucker/following{/other_user}",
"gists_url": "https://api.github.com/users/ArthurZucker/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArthurZucker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArthurZucker/subscriptions",
"organizations_url": "https://api.github.com/users/ArthurZucker/orgs",
"repos_url": "https://api.github.com/users/ArthurZucker/repos",
"events_url": "https://api.github.com/users/ArthurZucker/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArthurZucker/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39542/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39542/timeline | null | completed | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | true |
https://api.github.com/repos/huggingface/transformers/issues/39541 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39541/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39541/comments | https://api.github.com/repos/huggingface/transformers/issues/39541/events | https://github.com/huggingface/transformers/pull/39541 | 3,246,733,089 | PR_kwDOCUB6oc6fwMP8 | 39,541 | Add Muon optimizer implementation and integration | {
"login": "kadirnar",
"id": 36204372,
"node_id": "MDQ6VXNlcjM2MjA0Mzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/36204372?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kadirnar",
"html_url": "https://github.com/kadirnar",
"followers_url": "https://api.github.com/users/kadirnar/followers",
"following_url": "https://api.github.com/users/kadirnar/following{/other_user}",
"gists_url": "https://api.github.com/users/kadirnar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kadirnar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kadirnar/subscriptions",
"organizations_url": "https://api.github.com/users/kadirnar/orgs",
"repos_url": "https://api.github.com/users/kadirnar/repos",
"events_url": "https://api.github.com/users/kadirnar/events{/privacy}",
"received_events_url": "https://api.github.com/users/kadirnar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | {
"login": "3outeille",
"id": 47445085,
"node_id": "MDQ6VXNlcjQ3NDQ1MDg1",
"avatar_url": "https://avatars.githubusercontent.com/u/47445085?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/3outeille",
"html_url": "https://github.com/3outeille",
"followers_url": "https://api.github.com/users/3outeille/followers",
"following_url": "https://api.github.com/users/3outeille/following{/other_user}",
"gists_url": "https://api.github.com/users/3outeille/gists{/gist_id}",
"starred_url": "https://api.github.com/users/3outeille/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/3outeille/subscriptions",
"organizations_url": "https://api.github.com/users/3outeille/orgs",
"repos_url": "https://api.github.com/users/3outeille/repos",
"events_url": "https://api.github.com/users/3outeille/events{/privacy}",
"received_events_url": "https://api.github.com/users/3outeille/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"login": "3outeille",
"id": 47445085,
"node_id": "MDQ6VXNlcjQ3NDQ1MDg1",
"avatar_url": "https://avatars.githubusercontent.com/u/47445085?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/3outeille",
"html_url": "https://github.com/3outeille",
"followers_url": "https://api.github.com/users/3outeille/followers",
"following_url": "https://api.github.com/users/3outeille/following{/other_user}",
"gists_url": "https://api.github.com/users/3outeille/gists{/gist_id}",
"starred_url": "https://api.github.com/users/3outeille/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/3outeille/subscriptions",
"organizations_url": "https://api.github.com/users/3outeille/orgs",
"repos_url": "https://api.github.com/users/3outeille/repos",
"events_url": "https://api.github.com/users/3outeille/events{/privacy}",
"received_events_url": "https://api.github.com/users/3outeille/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | [] | 2025-07-20T22:08:39 | 2025-08-31T12:45:49 | null | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39541",
"html_url": "https://github.com/huggingface/transformers/pull/39541",
"diff_url": "https://github.com/huggingface/transformers/pull/39541.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39541.patch",
"merged_at": null
} | https://github.com/huggingface/transformers/issues/39537
cc @muellerzr @SunMarc
| null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39541/reactions",
"total_count": 9,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 9,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39541/timeline | null | null | null | null | true | false |
https://api.github.com/repos/huggingface/transformers/issues/39540 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39540/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39540/comments | https://api.github.com/repos/huggingface/transformers/issues/39540/events | https://github.com/huggingface/transformers/pull/39540 | 3,246,653,074 | PR_kwDOCUB6oc6fv8xs | 39,540 | [Voxtral] Fix typo | {
"login": "NielsRogge",
"id": 48327001,
"node_id": "MDQ6VXNlcjQ4MzI3MDAx",
"avatar_url": "https://avatars.githubusercontent.com/u/48327001?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NielsRogge",
"html_url": "https://github.com/NielsRogge",
"followers_url": "https://api.github.com/users/NielsRogge/followers",
"following_url": "https://api.github.com/users/NielsRogge/following{/other_user}",
"gists_url": "https://api.github.com/users/NielsRogge/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NielsRogge/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NielsRogge/subscriptions",
"organizations_url": "https://api.github.com/users/NielsRogge/orgs",
"repos_url": "https://api.github.com/users/NielsRogge/repos",
"events_url": "https://api.github.com/users/NielsRogge/events{/privacy}",
"received_events_url": "https://api.github.com/users/NielsRogge/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-20T20:18:13 | 2025-07-30T08:16:30 | 2025-07-30T08:16:30 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39540",
"html_url": "https://github.com/huggingface/transformers/pull/39540",
"diff_url": "https://github.com/huggingface/transformers/pull/39540.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39540.patch",
"merged_at": null
} | # What does this PR do?
This PR fixes a small typo in a method name.
cc @eustlb | {
"login": "NielsRogge",
"id": 48327001,
"node_id": "MDQ6VXNlcjQ4MzI3MDAx",
"avatar_url": "https://avatars.githubusercontent.com/u/48327001?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NielsRogge",
"html_url": "https://github.com/NielsRogge",
"followers_url": "https://api.github.com/users/NielsRogge/followers",
"following_url": "https://api.github.com/users/NielsRogge/following{/other_user}",
"gists_url": "https://api.github.com/users/NielsRogge/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NielsRogge/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NielsRogge/subscriptions",
"organizations_url": "https://api.github.com/users/NielsRogge/orgs",
"repos_url": "https://api.github.com/users/NielsRogge/repos",
"events_url": "https://api.github.com/users/NielsRogge/events{/privacy}",
"received_events_url": "https://api.github.com/users/NielsRogge/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39540/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39540/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39539 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39539/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39539/comments | https://api.github.com/repos/huggingface/transformers/issues/39539/events | https://github.com/huggingface/transformers/pull/39539 | 3,246,622,946 | PR_kwDOCUB6oc6fv2vY | 39,539 | Add AMD test expectations to DETR model | {
"login": "ahadnagy",
"id": 21314428,
"node_id": "MDQ6VXNlcjIxMzE0NDI4",
"avatar_url": "https://avatars.githubusercontent.com/u/21314428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ahadnagy",
"html_url": "https://github.com/ahadnagy",
"followers_url": "https://api.github.com/users/ahadnagy/followers",
"following_url": "https://api.github.com/users/ahadnagy/following{/other_user}",
"gists_url": "https://api.github.com/users/ahadnagy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ahadnagy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ahadnagy/subscriptions",
"organizations_url": "https://api.github.com/users/ahadnagy/orgs",
"repos_url": "https://api.github.com/users/ahadnagy/repos",
"events_url": "https://api.github.com/users/ahadnagy/events{/privacy}",
"received_events_url": "https://api.github.com/users/ahadnagy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-20T19:37:00 | 2025-07-22T12:07:11 | 2025-07-22T12:07:10 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39539",
"html_url": "https://github.com/huggingface/transformers/pull/39539",
"diff_url": "https://github.com/huggingface/transformers/pull/39539.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39539.patch",
"merged_at": "2025-07-22T12:07:10"
} | # What does this PR do?
This PR adds the correct test expectations for AMD hardware to the DETR model.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "ahadnagy",
"id": 21314428,
"node_id": "MDQ6VXNlcjIxMzE0NDI4",
"avatar_url": "https://avatars.githubusercontent.com/u/21314428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ahadnagy",
"html_url": "https://github.com/ahadnagy",
"followers_url": "https://api.github.com/users/ahadnagy/followers",
"following_url": "https://api.github.com/users/ahadnagy/following{/other_user}",
"gists_url": "https://api.github.com/users/ahadnagy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ahadnagy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ahadnagy/subscriptions",
"organizations_url": "https://api.github.com/users/ahadnagy/orgs",
"repos_url": "https://api.github.com/users/ahadnagy/repos",
"events_url": "https://api.github.com/users/ahadnagy/events{/privacy}",
"received_events_url": "https://api.github.com/users/ahadnagy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39539/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39539/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39538 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39538/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39538/comments | https://api.github.com/repos/huggingface/transformers/issues/39538/events | https://github.com/huggingface/transformers/pull/39538 | 3,246,615,202 | PR_kwDOCUB6oc6fv1Mr | 39,538 | fix ndim check of device_mesh for TP | {
"login": "winglian",
"id": 381258,
"node_id": "MDQ6VXNlcjM4MTI1OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/381258?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/winglian",
"html_url": "https://github.com/winglian",
"followers_url": "https://api.github.com/users/winglian/followers",
"following_url": "https://api.github.com/users/winglian/following{/other_user}",
"gists_url": "https://api.github.com/users/winglian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/winglian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/winglian/subscriptions",
"organizations_url": "https://api.github.com/users/winglian/orgs",
"repos_url": "https://api.github.com/users/winglian/repos",
"events_url": "https://api.github.com/users/winglian/events{/privacy}",
"received_events_url": "https://api.github.com/users/winglian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | [] | 2025-07-20T19:27:29 | 2025-07-21T17:35:44 | 2025-07-21T13:09:34 | CONTRIBUTOR | null | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/39538",
"html_url": "https://github.com/huggingface/transformers/pull/39538",
"diff_url": "https://github.com/huggingface/transformers/pull/39538.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/39538.patch",
"merged_at": "2025-07-21T13:09:34"
} | # What does this PR do?
This check is inverted. When I passed in a TP device mesh with a single dimension, I hit this assertion. The TODO implies that multi-dim meshes aren't supported yet, and the error message likewise indicates that the check should be the inverse of the current one.
@SunMarc @S1ro1
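To make the inversion concrete, here is a minimal sketch of the intended guard (`FakeMesh` and `validate_tp_mesh` are illustrative stand-ins, not the actual transformers code): a 1-D mesh should pass, a multi-dim mesh should be rejected.

```python
# Hypothetical minimal repro of the inverted assertion described above.
class FakeMesh:
    def __init__(self, ndim):
        self.ndim = ndim

def validate_tp_mesh(device_mesh):
    # Buggy version rejected the supported 1-D case:
    #     if device_mesh.ndim == 1: raise ValueError(...)
    # Fixed version (matching the error message) rejects multi-dim meshes:
    if device_mesh.ndim != 1:
        raise ValueError("Only 1-D device meshes are supported for TP for now")

validate_tp_mesh(FakeMesh(1))  # supported case, no error after the fix
```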
Fixes # (issue)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#create-a-pull-request),
Pull Request section?
- [ ] Was this discussed/approved via a Github issue or the [forum](https://discuss.huggingface.co/)? Please add a link
to it if that's the case.
- [ ] Did you make sure to update the documentation with your changes? Here are the
[documentation guidelines](https://github.com/huggingface/transformers/tree/main/docs), and
[here are tips on formatting docstrings](https://github.com/huggingface/transformers/tree/main/docs#writing-source-documentation).
- [ ] Did you write any new necessary tests?
## Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
<!-- Your PR will be replied to more quickly if you can figure out the right person to tag with @
If you know how to use git blame, that is the easiest way, otherwise, here is a rough guide of **who to tag**.
Please tag fewer than 3 people.
Models:
- text models: @ArthurZucker
- vision models: @amyeroberts, @qubvel
- speech models: @eustlb
- graph models: @clefourrier
Library:
- flax: @gante and @Rocketknight1
- generate: @zucchini-nlp (visual-language models) or @gante (all others)
- pipelines: @Rocketknight1
- tensorflow: @gante and @Rocketknight1
- tokenizers: @ArthurZucker
- trainer: @zach-huggingface, @SunMarc and @qgallouedec
- chat templates: @Rocketknight1
Integrations:
- deepspeed: HF Trainer/Accelerate: @SunMarc @zach-huggingface
- ray/raytune: @richardliaw, @amogkam
- Big Model Inference: @SunMarc
- quantization (bitsandbytes, autogpt): @SunMarc @MekkCyber
Documentation: @stevhliu
HF projects:
- accelerate: [different repo](https://github.com/huggingface/accelerate)
- datasets: [different repo](https://github.com/huggingface/datasets)
- diffusers: [different repo](https://github.com/huggingface/diffusers)
- rust tokenizers: [different repo](https://github.com/huggingface/tokenizers)
Maintained examples (not research project or legacy):
- Flax: @Rocketknight1
- PyTorch: See Models above and tag the person corresponding to the modality of the example.
- TensorFlow: @Rocketknight1
-->
| {
"login": "SunMarc",
"id": 57196510,
"node_id": "MDQ6VXNlcjU3MTk2NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/57196510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SunMarc",
"html_url": "https://github.com/SunMarc",
"followers_url": "https://api.github.com/users/SunMarc/followers",
"following_url": "https://api.github.com/users/SunMarc/following{/other_user}",
"gists_url": "https://api.github.com/users/SunMarc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SunMarc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SunMarc/subscriptions",
"organizations_url": "https://api.github.com/users/SunMarc/orgs",
"repos_url": "https://api.github.com/users/SunMarc/repos",
"events_url": "https://api.github.com/users/SunMarc/events{/privacy}",
"received_events_url": "https://api.github.com/users/SunMarc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39538/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39538/timeline | null | null | null | null | true | true |
https://api.github.com/repos/huggingface/transformers/issues/39537 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/39537/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/39537/comments | https://api.github.com/repos/huggingface/transformers/issues/39537/events | https://github.com/huggingface/transformers/issues/39537 | 3,246,453,182 | I_kwDOCUB6oc7BgPG- | 39,537 | Add muon and flash-muon optimizer | {
"login": "kadirnar",
"id": 36204372,
"node_id": "MDQ6VXNlcjM2MjA0Mzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/36204372?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kadirnar",
"html_url": "https://github.com/kadirnar",
"followers_url": "https://api.github.com/users/kadirnar/followers",
"following_url": "https://api.github.com/users/kadirnar/following{/other_user}",
"gists_url": "https://api.github.com/users/kadirnar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kadirnar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kadirnar/subscriptions",
"organizations_url": "https://api.github.com/users/kadirnar/orgs",
"repos_url": "https://api.github.com/users/kadirnar/repos",
"events_url": "https://api.github.com/users/kadirnar/events{/privacy}",
"received_events_url": "https://api.github.com/users/kadirnar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | null | [] | 2025-07-20T15:27:30 | 2025-07-20T15:27:30 | null | CONTRIBUTOR | null | null | null | null | ### Feature request
Muon: https://github.com/KellerJordan/Muon
Flash-Muon: https://github.com/nil0x9/flash-muon
Paper: https://arxiv.org/pdf/2502.16982
### Motivation
Muon is an effective optimizer that can further accelerate LLM training. The Muon team has recently demonstrated the importance of the Muon optimizer in the LLM models they released.
<img width="2536" height="1016" alt="Image" src="https://github.com/user-attachments/assets/93ccc436-f66b-4970-be55-074e5a36fa3d" />
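For context, the core of Muon is a momentum step whose 2-D update matrix is approximately orthogonalized with a quintic Newton–Schulz iteration before being applied. A rough NumPy sketch of that idea follows (the coefficients come from the reference repo; `lr`, `momentum`, and the helper names are illustrative, not the proposed transformers API):

```python
import numpy as np

def newton_schulz_orthogonalize(g, steps=5):
    """Approximately orthogonalize a 2-D matrix with the quintic
    Newton-Schulz iteration used by the reference Muon implementation."""
    a, b, c = 3.4445, -4.7750, 2.0315  # tuned coefficients from the Muon repo
    x = g / (np.linalg.norm(g) + 1e-7)  # normalize so the spectral norm is <= 1
    transposed = x.shape[0] > x.shape[1]
    if transposed:
        x = x.T
    for _ in range(steps):
        A = x @ x.T
        B = b * A + c * (A @ A)
        x = a * x + B @ x
    return x.T if transposed else x

def muon_step(param, grad, buf, lr=0.02, momentum=0.95):
    """One illustrative Muon update for a single 2-D weight matrix."""
    buf = momentum * buf + grad                                   # momentum accumulation
    update = newton_schulz_orthogonalize(grad + momentum * buf)   # Nesterov-style lookahead
    return param - lr * update, buf
```

The point of the orthogonalization is that every singular direction of the update gets roughly unit scale, rather than being dominated by the largest gradient directions.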
### Your contribution
I would like to add this optimizer to the transformers library. | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/39537/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/39537/timeline | null | null | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | {
"blocked_by": 0,
"total_blocked_by": 0,
"blocking": 0,
"total_blocking": 0
} | false | false |