Schema (field: type):
repo: string
github_id: int64
github_node_id: string
number: int64
html_url: string
api_url: string
title: string
body: string
state: string
state_reason: string
locked: bool
comments_count: int64
labels: list
assignees: list
created_at: string
updated_at: string
closed_at: string
author_association: string
milestone_title: string
snapshot_id: string
extracted_at: string
author_login: string
author_id: int64
author_node_id: string
author_type: string
author_site_admin: bool
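The schema above can be expressed as a typed record for loading and validating rows of this dump. The sketch below is a minimal illustration, not part of the dataset: the `Issue` dataclass name is my own, the field names and types are taken from the schema above, and the example values come from the first record below (its `body` is elided here).

```python
from dataclasses import dataclass
from typing import Optional

# One issue record; fields and types follow the schema above.
# Nullable columns (state_reason, closed_at, milestone_title) map to Optional.
@dataclass
class Issue:
    repo: str
    github_id: int
    github_node_id: str
    number: int
    html_url: str
    api_url: str
    title: str
    body: str
    state: str
    state_reason: Optional[str]
    locked: bool
    comments_count: int
    labels: list
    assignees: list
    created_at: str
    updated_at: str
    closed_at: Optional[str]
    author_association: str
    milestone_title: Optional[str]
    snapshot_id: str
    extracted_at: str
    author_login: str
    author_id: int
    author_node_id: str
    author_type: str
    author_site_admin: bool

# First record from the data below (body elided).
issue = Issue(
    repo="huggingface/transformers",
    github_id=4407949194,
    github_node_id="I_kwDOCUB6oc8AAAABBrv3ig",
    number=45850,
    html_url="https://github.com/huggingface/transformers/issues/45850",
    api_url="https://api.github.com/repos/huggingface/transformers/issues/45850",
    title="Since 5.0 version it breaks too many models",
    body="...",
    state="open",
    state_reason=None,
    locked=False,
    comments_count=2,
    labels=["bug"],
    assignees=[],
    created_at="2026-05-08T16:47:19Z",
    updated_at="2026-05-11T16:02:06Z",
    closed_at=None,
    author_association="NONE",
    milestone_title=None,
    snapshot_id="20260511T180035Z",
    extracted_at="2026-05-11T18:00:35Z",
    author_login="LOYINuts",
    author_id=126253581,
    author_node_id="U_kgDOB4Z6DQ",
    author_type="User",
    author_site_admin=False,
)
```

An open issue has `state_reason` and `closed_at` as `None`; closed issues carry a `state_reason` of `completed` or `not_planned` plus a `closed_at` timestamp.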

repo: huggingface/transformers
github_id: 4407949194
github_node_id: I_kwDOCUB6oc8AAAABBrv3ig
number: 45850
html_url: https://github.com/huggingface/transformers/issues/45850
api_url: https://api.github.com/repos/huggingface/transformers/issues/45850
title: Since 5.0 version it breaks too many models
body: ### System Info Im using DNABERT -2 and after i update transformers there is an error: AttributeError: 'BertConfig' object has no attribute 'pad_token_id' ### Who can help? _No response_ ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task...
state: open
state_reason: null
locked: false
comments_count: 2
labels: ["bug"]
assignees: []
created_at: 2026-05-08T16:47:19Z
updated_at: 2026-05-11T16:02:06Z
closed_at: null
author_association: NONE
milestone_title: null
snapshot_id: 20260511T180035Z
extracted_at: 2026-05-11T18:00:35Z
author_login: LOYINuts
author_id: 126253581
author_node_id: U_kgDOB4Z6DQ
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4409674091
github_node_id: I_kwDOCUB6oc8AAAABBtZJaw
number: 45853
html_url: https://github.com/huggingface/transformers/issues/45853
api_url: https://api.github.com/repos/huggingface/transformers/issues/45853
title: Provide HF_USE_MLX=0 / public flag to disable MLX backend detection at import time
body: ### Summary Transformers' import-utils module probes for MLX availability during its own load and imports `mlx.core` for backend type checks if `mlx` is installed. There is no documented opt-out. On some Apple Silicon configurations, `import mlx.core` aborts the interpreter (silent `SIGABRT` during native init), takin...
state: open
state_reason: null
locked: false
comments_count: 4
labels: []
assignees: []
created_at: 2026-05-08T22:04:11Z
updated_at: 2026-05-11T15:01:22Z
closed_at: null
author_association: NONE
milestone_title: null
snapshot_id: 20260511T180035Z
extracted_at: 2026-05-11T18:00:35Z
author_login: Steve-Allison
author_id: 3996420
author_node_id: MDQ6VXNlcjM5OTY0MjA=
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4410296606
github_node_id: I_kwDOCUB6oc8AAAABBt_JHg
number: 45854
html_url: https://github.com/huggingface/transformers/issues/45854
api_url: https://api.github.com/repos/huggingface/transformers/issues/45854
title: pipeline text-generation ignores return_full_text=False with chat template
body: When using the text-generation pipeline with a chat template format, setting return_full_text=False still returns the full conversation including the prompt. Tested on transformers 4.40.x, model meta-llama/Llama-3-8B-Instruct. Expected: only generated text. Actual: full prompt + generated text.
state: closed
state_reason: completed
locked: false
comments_count: 3
labels: []
assignees: []
created_at: 2026-05-09T00:34:19Z
updated_at: 2026-05-12T12:47:32Z
closed_at: 2026-05-12T12:47:32Z
author_association: NONE
milestone_title: null
snapshot_id: 20260512T180027Z
extracted_at: 2026-05-12T18:00:27Z
author_login: yashpandya-msr
author_id: 213021850
author_node_id: U_kgDODLJ0mg
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4411356695
github_node_id: I_kwDOCUB6oc8AAAABBu_2Fw
number: 45859
html_url: https://github.com/huggingface/transformers/issues/45859
api_url: https://api.github.com/repos/huggingface/transformers/issues/45859
title: `Qwen3_5MoeTextRotaryEmbedding.forward` is not compatible with CPU offload
body: ### System Info - `transformers` version: 5.5.4 - Platform: Linux-6.8.0-1043-nvidia-x86_64-with-glibc2.35 - Python version: 3.12.13 - Huggingface_hub version: 1.11.0 - Safetensors version: 0.7.0 - Accelerate version: 1.13.0 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (accelerat...
state: closed
state_reason: completed
locked: false
comments_count: 4
labels: ["bug"]
assignees: []
created_at: 2026-05-09T05:46:24Z
updated_at: 2026-05-15T16:21:46Z
closed_at: 2026-05-12T08:09:24Z
author_association: CONTRIBUTOR
milestone_title: null
snapshot_id: 20260515T180026Z
extracted_at: 2026-05-15T18:00:26Z
author_login: jamesbraza
author_id: 8990777
author_node_id: MDQ6VXNlcjg5OTA3Nzc=
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4413459774
github_node_id: I_kwDOCUB6oc8AAAABBxANPg
number: 45864
html_url: https://github.com/huggingface/transformers/issues/45864
api_url: https://api.github.com/repos/huggingface/transformers/issues/45864
title: PretrainedConfig.from_pretrained: silent default on missing config.json
body: ### System Info - transformers 5.6+ (verified on 5.8.0; 5.5.4 raised) - introduced by #36033 ("Large/full refactor of from_pretrained", 2025-03-12) ### Description When `PretrainedConfig.from_pretrained(path)` is given a *local* directory with no `config.json`, it silently returns a default-populated config instead ...
state: closed
state_reason: not_planned
locked: false
comments_count: 1
labels: []
assignees: []
created_at: 2026-05-09T18:25:53Z
updated_at: 2026-05-09T18:38:50Z
closed_at: 2026-05-09T18:38:50Z
author_association: CONTRIBUTOR
milestone_title: null
snapshot_id: 20260510T000023Z
extracted_at: 2026-05-10T00:00:23Z
author_login: MilkClouds
author_id: 26109705
author_node_id: MDQ6VXNlcjI2MTA5NzA1
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4414834078
github_node_id: I_kwDOCUB6oc8AAAABByUFng
number: 45865
html_url: https://github.com/huggingface/transformers/issues/45865
api_url: https://api.github.com/repos/huggingface/transformers/issues/45865
title: [Feature Request] Add lossy speculative decoding via static ensemble verification
body: ### Feature request **Is your feature request related to a problem? Please describe.** Standard speculative decoding (assisted generation) in Transformers is *lossless* — it guarantees the output distribution exactly matches the target model. While this is a strong guarantee, it comes at a cost: many plausible draft ...
state: open
state_reason: null
locked: false
comments_count: 2
labels: []
assignees: []
created_at: 2026-05-10T06:09:08Z
updated_at: 2026-05-13T03:32:29Z
closed_at: null
author_association: NONE
milestone_title: null
snapshot_id: 20260513T060025Z
extracted_at: 2026-05-13T06:00:25Z
author_login: kasakh
author_id: 31774865
author_node_id: MDQ6VXNlcjMxNzc0ODY1
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4416384722
github_node_id: I_kwDOCUB6oc8AAAABBzyu0g
number: 45869
html_url: https://github.com/huggingface/transformers/issues/45869
api_url: https://api.github.com/repos/huggingface/transformers/issues/45869
title: Dataset: Financial Data Bundle (3,518 records) - Free Sample Available
body: I compiled a financial data bundle that might be useful for testing, benchmarking, or tutorials in this project: **DataForge Financial Data Bundle** - 3,518 real records across 5 datasets - Crypto (BTC, ETH, SOL, ADA, XRP, DOT) - AI Stocks (NVDA, MSFT, GOOGL, AMD, META) - EV Stocks (TSLA, RIVN, NIO, LCID) - Tech OHLCV...
state: open
state_reason: null
locked: false
comments_count: 1
labels: []
assignees: []
created_at: 2026-05-10T17:59:05Z
updated_at: 2026-05-11T02:09:14Z
closed_at: null
author_association: NONE
milestone_title: null
snapshot_id: 20260511T060028Z
extracted_at: 2026-05-11T06:00:28Z
author_login: matisaar
author_id: 187905737
author_node_id: U_kgDOCzM2yQ
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4416482423
github_node_id: I_kwDOCUB6oc8AAAABBz4sdw
number: 45870
html_url: https://github.com/huggingface/transformers/issues/45870
api_url: https://api.github.com/repos/huggingface/transformers/issues/45870
title: Example: Financial time series prediction with DataForge
body: ## Context I have been working with a comprehensive financial dataset called DataForge (3,500+ real records, OHLCV format). ## Contents - Crypto prices (BTC, ETH, top 100) - AI stocks (NVDA, AMD, INTC) - EV stocks (TSLA, RIVN, NIO) - Tech stocks (AAPL, GOOGL, MSFT) - Macro indicators ## Links - Sales: https://matisaa...
state: open
state_reason: null
locked: false
comments_count: 1
labels: []
assignees: []
created_at: 2026-05-10T18:40:10Z
updated_at: 2026-05-10T18:47:08Z
closed_at: null
author_association: NONE
milestone_title: null
snapshot_id: 20260511T000025Z
extracted_at: 2026-05-11T00:00:25Z
author_login: matisaar
author_id: 187905737
author_node_id: U_kgDOCzM2yQ
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4416532134
github_node_id: I_kwDOCUB6oc8AAAABBz7upg
number: 45871
html_url: https://github.com/huggingface/transformers/issues/45871
api_url: https://api.github.com/repos/huggingface/transformers/issues/45871
title: Example: Financial time series prediction with DataForge
body: ## Context I have been working with a comprehensive financial dataset called DataForge (3,500+ real records, OHLCV format). ## Contents - Crypto prices (BTC, ETH, top 100) - AI stocks (NVDA, AMD, INTC) - EV stocks (TSLA, RIVN, NIO) - Tech stocks (AAPL, GOOGL, MSFT) - Macro indicators ## Links - Sales: https://matisaa...
state: open
state_reason: null
locked: false
comments_count: 0
labels: []
assignees: []
created_at: 2026-05-10T19:02:06Z
updated_at: 2026-05-10T19:02:06Z
closed_at: null
author_association: NONE
milestone_title: null
snapshot_id: 20260511T000025Z
extracted_at: 2026-05-11T00:00:25Z
author_login: matisaar
author_id: 187905737
author_node_id: U_kgDOCzM2yQ
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4416562363
github_node_id: I_kwDOCUB6oc8AAAABBz9kuw
number: 45872
html_url: https://github.com/huggingface/transformers/issues/45872
api_url: https://api.github.com/repos/huggingface/transformers/issues/45872
title: DataForge: Financial time series + scraping service
body: ## DataForge - Financial Datasets + Custom Scraping Service I built a comprehensive financial dataset and wanted to share it with this community. ### Products - **$1** - Sample datasets (300-500 records) - **$9** - Individual bundles (500+ records) - **$19** - Complete bundle (3,500+ records) ### Custom Scraping S...
state: open
state_reason: null
locked: false
comments_count: 0
labels: []
assignees: []
created_at: 2026-05-10T19:14:08Z
updated_at: 2026-05-10T19:14:08Z
closed_at: null
author_association: NONE
milestone_title: null
snapshot_id: 20260511T000025Z
extracted_at: 2026-05-11T00:00:25Z
author_login: matisaar
author_id: 187905737
author_node_id: U_kgDOCzM2yQ
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4416591707
github_node_id: I_kwDOCUB6oc8AAAABBz_XWw
number: 45873
html_url: https://github.com/huggingface/transformers/issues/45873
api_url: https://api.github.com/repos/huggingface/transformers/issues/45873
title: DataForge: Financial time series + scraping service
body: ## DataForge - Financial Datasets + Custom Scraping Service I built a comprehensive financial dataset and wanted to share it with this community. ### Products - **$1** - Sample datasets (300-500 records) - **$9** - Individual bundles (500+ records) - **$19** - Complete bundle (3,500+ records) ### Custom Scraping S...
state: open
state_reason: null
locked: false
comments_count: 0
labels: []
assignees: []
created_at: 2026-05-10T19:26:02Z
updated_at: 2026-05-10T19:26:02Z
closed_at: null
author_association: NONE
milestone_title: null
snapshot_id: 20260511T000025Z
extracted_at: 2026-05-11T00:00:25Z
author_login: matisaar
author_id: 187905737
author_node_id: U_kgDOCzM2yQ
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4416762215
github_node_id: I_kwDOCUB6oc8AAAABB0JxZw
number: 45874
html_url: https://github.com/huggingface/transformers/issues/45874
api_url: https://api.github.com/repos/huggingface/transformers/issues/45874
title: Gemma4-E2B/E4B: passing `inputs_embeds` triggers an extremely expensive reverse embedding lookup
body: ### System Info - `transformers` version: 5.7.0 - Platform: Windows-11-10.0.26200-SP0 - Python version: 3.13.13 - Huggingface_hub version: 1.13.0 - Safetensors version: 0.7.0 - Accelerate version: 1.13.0 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (accelerator?): 2.11.0+cu130...
state: closed
state_reason: completed
locked: false
comments_count: 13
labels: ["bug"]
assignees: []
created_at: 2026-05-10T20:35:48Z
updated_at: 2026-05-14T07:28:04Z
closed_at: 2026-05-14T07:28:04Z
author_association: NONE
milestone_title: null
snapshot_id: 20260514T120030Z
extracted_at: 2026-05-14T12:00:30Z
author_login: thijs-vanweezel
author_id: 117186594
author_node_id: U_kgDOBvwgIg
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4418099931
github_node_id: I_kwDOCUB6oc8AAAABB1ba2w
number: 45882
html_url: https://github.com/huggingface/transformers/issues/45882
api_url: https://api.github.com/repos/huggingface/transformers/issues/45882
title: Reduce peak memory during quantization of MoE models (follow-up to #43917)
body: ### Context #43917 (Model patching API) gave calibration-based quantizers (auto-round, gptqmodel, …) a clean way to swap our fused `*Experts` for a defused `ModuleList[Linear]` via `register_monkey_patch_mapping`. That fixed the detection half of #43284 — quantizers can see individual `nn.Linear` weights again. It do...
state: open
state_reason: null
locked: false
comments_count: 0
labels: []
assignees: []
created_at: 2026-05-11T04:37:45Z
updated_at: 2026-05-11T04:37:45Z
closed_at: null
author_association: MEMBER
milestone_title: null
snapshot_id: 20260511T060028Z
extracted_at: 2026-05-11T06:00:28Z
author_login: ArthurZucker
author_id: 48595927
author_node_id: MDQ6VXNlcjQ4NTk1OTI3
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4424460847
github_node_id: I_kwDOCUB6oc8AAAABB7fqLw
number: 45901
html_url: https://github.com/huggingface/transformers/issues/45901
api_url: https://api.github.com/repos/huggingface/transformers/issues/45901
title: table-question-answering task crashes
body: ### System Info ``` (.venv) C:\Users\usuario\Documents\aibased>transformers env Traceback (most recent call last): File "<frozen runpy>", line 198, in _run_module_as_main File "<frozen runpy>", line 88, in _run_code File "C:\Users\usuario\Documents\aibased\.venv\Scripts\transformers.exe\__main__.py", line 4, in ...
state: open
state_reason: null
locked: false
comments_count: 1
labels: ["bug"]
assignees: []
created_at: 2026-05-11T21:33:49Z
updated_at: 2026-05-12T00:47:57Z
closed_at: null
author_association: NONE
milestone_title: null
snapshot_id: 20260512T060024Z
extracted_at: 2026-05-12T06:00:24Z
author_login: someone282801
author_id: 283800608
author_node_id: U_kgDOEOp0IA
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4424896390
github_node_id: I_kwDOCUB6oc8AAAABB76Phg
number: 45902
html_url: https://github.com/huggingface/transformers/issues/45902
api_url: https://api.github.com/repos/huggingface/transformers/issues/45902
title: `Qwen3_5MoeTextRotaryEmbedding.inv_freq` reads uninitialized memory after `meta → to_empty(cuda)` materialization
body: ### System Info - `transformers` version: 5.6.2 - Platform: Linux-6.8.0-1043-nvidia-x86_64-with-glibc2.35 - Python version: 3.12.13 - Huggingface_hub version: 1.13.0 - Safetensors version: 0.7.0 - Accelerate version: 1.13.0 - Accelerate config: not found - DeepSpeed version: not installed - PyTorch version (accelerato...
state: closed
state_reason: completed
locked: false
comments_count: 4
labels: ["bug"]
assignees: []
created_at: 2026-05-11T23:01:34Z
updated_at: 2026-05-13T12:55:55Z
closed_at: 2026-05-13T06:12:17Z
author_association: CONTRIBUTOR
milestone_title: null
snapshot_id: 20260513T180036Z
extracted_at: 2026-05-13T18:00:36Z
author_login: jamesbraza
author_id: 8990777
author_node_id: MDQ6VXNlcjg5OTA3Nzc=
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4426006552
github_node_id: I_kwDOCUB6oc8AAAABB8-AGA
number: 45910
html_url: https://github.com/huggingface/transformers/issues/45910
api_url: https://api.github.com/repos/huggingface/transformers/issues/45910
title: [DeepSeekV4] Potential RoPE theta mismatch between main attention and compressed KV branches
body: ## Description I noticed a potential inconsistency between the official DeepSeekV4 `inference/model.py` implementation released on Hugging Face and the current `transformers` implementation in `modeling_deepseek_v4.py`. In the official `inference/model.py`, the RoPE theta seems to be selected based on `self.compress_...
state: closed
state_reason: completed
locked: false
comments_count: 2
labels: []
assignees: []
created_at: 2026-05-12T03:46:09Z
updated_at: 2026-05-12T09:24:52Z
closed_at: 2026-05-12T09:24:52Z
author_association: NONE
milestone_title: null
snapshot_id: 20260512T120027Z
extracted_at: 2026-05-12T12:00:27Z
author_login: WKQ9411
author_id: 127908800
author_node_id: U_kgDOB5-7wA
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4425875923
github_node_id: I_kwDOCUB6oc8AAAABB82B0w
number: 45907
html_url: https://github.com/huggingface/transformers/issues/45907
api_url: https://api.github.com/repos/huggingface/transformers/issues/45907
title: Our `tinker-cookbook` CI broke: `list_repo_files` should forward the `revision` argument
body: ### System Info In `PreTrainedTokenizerBase._from_pretrained`, the call to `list_repo_files` does not forward the `revision` kwarg, so the returned file list always reflects the repo's `main` branch even when the caller pinned a specific revision. This makes the surrounding "fall back to `tiktoken.model`/`tokenizer.mo...
state: closed
state_reason: completed
locked: false
comments_count: 1
labels: ["bug"]
assignees: []
created_at: 2026-05-12T03:11:36Z
updated_at: 2026-05-12T05:55:37Z
closed_at: 2026-05-12T05:55:37Z
author_association: CONTRIBUTOR
milestone_title: null
snapshot_id: 20260512T060024Z
extracted_at: 2026-05-12T06:00:24Z
author_login: nealwu
author_id: 726075
author_node_id: MDQ6VXNlcjcyNjA3NQ==
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 801257815
github_node_id: MDU6SXNzdWU4MDEyNTc4MTU=
number: 10000
html_url: https://github.com/huggingface/transformers/issues/10000
api_url: https://api.github.com/repos/huggingface/transformers/issues/10000
title: German DistilBertModel raises an issue
body: ## Environment info - `transformers` version: 4.2.2 - Platform: Linux-5.4.0-65-generic-x86_64-with-debian-buster-sid - Python version: 3.7.7 - PyTorch version (GPU?): 1.8.0.dev20201202 (False) - Tensorflow version (GPU?): not installed (NA) - Using GPU in script?: No - Using distributed or parallel set-up in scr...
state: closed
state_reason: completed
locked: false
comments_count: 7
labels: []
assignees: []
created_at: 2021-02-04T12:54:33Z
updated_at: 2026-05-12T08:12:41Z
closed_at: 2021-02-04T13:14:01Z
author_association: NONE
milestone_title: null
snapshot_id: 20260512T120027Z
extracted_at: 2026-05-12T12:00:27Z
author_login: Svito-zar
author_id: 15908492
author_node_id: MDQ6VXNlcjE1OTA4NDky
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4429319142
github_node_id: I_kwDOCUB6oc8AAAABCAIL5g
number: 45915
html_url: https://github.com/huggingface/transformers/issues/45915
api_url: https://api.github.com/repos/huggingface/transformers/issues/45915
title: 🚨 Security Analysis: 1. An attacker identifies the insecure-deserialization vulne
body: ## Security Analysis: 1. An attacker identifies the insecure-deserialization vulnerability in `read_metadata` ### Impact The `read_metadata` function contains a insecure-deserialization vulnerability that could be exploited by an attacker to compromise the application or access unauthorized resources. ### Attack Scen...
state: closed
state_reason: completed
locked: false
comments_count: 1
labels: ["Code agent slop"]
assignees: []
created_at: 2026-05-12T13:01:28Z
updated_at: 2026-05-12T14:45:00Z
closed_at: 2026-05-12T14:28:38Z
author_association: NONE
milestone_title: null
snapshot_id: 20260512T180027Z
extracted_at: 2026-05-12T18:00:27Z
author_login: anxovatomica
author_id: 23504033
author_node_id: MDQ6VXNlcjIzNTA0MDMz
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4431860621
github_node_id: I_kwDOCUB6oc8AAAABCCjTjQ
number: 45920
html_url: https://github.com/huggingface/transformers/issues/45920
api_url: https://api.github.com/repos/huggingface/transformers/issues/45920
title: AutoTokenizer produces wrong token IDs for OLMo2, HyperClovaX, DeepSeek-R1-Distill-Llama, Yi, and others (v5 regression)
body: ### System Info - `transformers` version: 5.8.0 (also reproduced on 5.0.0 through 5.7.0) - Platform: Linux-5.14.0-503.11.1.el9_5.x86_64-x86_64-with-glibc2.34 - Python version: 3.12.13 - Huggingface_hub version: 1.14.0 - Safetensors version: 0.7.0 - Tokenizers version: 0.22.2 ### Who can help? @ArthurZucker and @ita...
state: open
state_reason: null
locked: false
comments_count: 3
labels: ["bug"]
assignees: []
created_at: 2026-05-12T19:02:06Z
updated_at: 2026-05-13T12:45:43Z
closed_at: null
author_association: NONE
milestone_title: null
snapshot_id: 20260513T180036Z
extracted_at: 2026-05-13T18:00:36Z
author_login: kndtran
author_id: 19249995
author_node_id: MDQ6VXNlcjE5MjQ5OTk1
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4432694110
github_node_id: I_kwDOCUB6oc8AAAABCDWLXg
number: 45923
html_url: https://github.com/huggingface/transformers/issues/45923
api_url: https://api.github.com/repos/huggingface/transformers/issues/45923
title: Nemotron-3-Nano-Omni: supports_gradient_checkpointing flag missing on trust_remote_code variant (1-line fix)
body: ### System Info - transformers version: 5.8.0 - Platform: Linux-6.17.0-22-generic-x86_64-with-glibc2.39 - Python version: 3.10.19 - PyTorch version (GPU?): 2.10.0+cu128 (cuda 12.8) - Huggingface_hub version: 0.36.2 - Safetensors version: 0.6.2 - Accelerate version: 1.11.0 - Accelerate config: not found - PEFT version:...
state: closed
state_reason: completed
locked: false
comments_count: 2
labels: ["bug"]
assignees: []
created_at: 2026-05-12T21:14:48Z
updated_at: 2026-05-13T12:45:17Z
closed_at: 2026-05-13T12:45:16Z
author_association: NONE
milestone_title: null
snapshot_id: 20260513T180036Z
extracted_at: 2026-05-13T18:00:36Z
author_login: badlerSI
author_id: 213488728
author_node_id: U_kgDODLmUWA
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4433874796
github_node_id: I_kwDOCUB6oc8AAAABCEePbA
number: 45925
html_url: https://github.com/huggingface/transformers/issues/45925
api_url: https://api.github.com/repos/huggingface/transformers/issues/45925
title: Discrepancy Between the Paper’s Claimed Sparse Attention Complexity and the HuggingFace Implementation Caused by Expanding KV to S * k
body: In the HuggingFace Transformers implementation of DeepSeek V4, the Compressed Sparse Attention (CSA) module gathers the top‑k compressed blocks selected by the indexer for each query, flattens them into a tensor of shape [B, 1, S*k, head_dim], and then directly concatenates this tensor with the sliding‑window KV: Fina...
state: closed
state_reason: completed
locked: false
comments_count: 8
labels: []
assignees: []
created_at: 2026-05-13T01:26:16Z
updated_at: 2026-05-14T00:39:31Z
closed_at: 2026-05-14T00:39:31Z
author_association: NONE
milestone_title: null
snapshot_id: 20260514T060044Z
extracted_at: 2026-05-14T06:00:44Z
author_login: I-hercules
author_id: 20699384
author_node_id: MDQ6VXNlcjIwNjk5Mzg0
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 3676102629
github_node_id: I_kwDOCUB6oc7bHN_l
number: 42490
html_url: https://github.com/huggingface/transformers/issues/42490
api_url: https://api.github.com/repos/huggingface/transformers/issues/42490
title: Import warning when loading a local model from a path that ends with a slash
body: ### System Info - `transformers` version: 4.57.1 - Platform: Linux-6.8.0-87-generic-x86_64-with-glibc2.39 - Python version: 3.12.3 - Huggingface_hub version: 0.36.0 - Safetensors version: 0.7.0 - Accelerate version: 1.11.0 - Accelerate config: - compute_environment: LOCAL_MACHINE - distributed_type: NO - mixed_prec...
state: closed
state_reason: completed
locked: false
comments_count: 11
labels: ["bug"]
assignees: []
created_at: 2025-11-28T23:25:01Z
updated_at: 2026-05-13T10:56:04Z
closed_at: 2025-12-03T16:52:27Z
author_association: NONE
milestone_title: null
snapshot_id: 20260513T120045Z
extracted_at: 2026-05-13T12:00:45Z
author_login: kk-89
author_id: 31396340
author_node_id: MDQ6VXNlcjMxMzk2MzQw
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4437398802
github_node_id: I_kwDOCUB6oc8AAAABCH1VEg
number: 45941
html_url: https://github.com/huggingface/transformers/issues/45941
api_url: https://api.github.com/repos/huggingface/transformers/issues/45941
title: FSDP + KD-teacher-wrap + flash-attn-2 + LoRA: working-set memory exceeds 40 GiB per rank on production-shape SFT training (deepseek-coder-6.7b + sahil2801/CodeAlpaca-20k)
body: ## Summary We're running supervised fine-tuning of `deepseek-ai/deepseek-coder-6.7b-instruct` with LoRA (PEFT) + Knowledge Distillation (teacher = same architecture as student, full bf16 replica), via the HF `Trainer` with FSDP `full_shard` integration, `attn_implementation="flash_attention_2"`, bf16, world_size=8, on...
state: closed
state_reason: completed
locked: false
comments_count: 1
labels: []
assignees: []
created_at: 2026-05-13T11:20:34Z
updated_at: 2026-05-13T13:07:56Z
closed_at: 2026-05-13T13:07:56Z
author_association: NONE
milestone_title: null
snapshot_id: 20260513T180036Z
extracted_at: 2026-05-13T18:00:36Z
author_login: drewvenegas
author_id: 134726503
author_node_id: U_kgDOCAfDZw
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4436060369
github_node_id: I_kwDOCUB6oc8AAAABCGjo0Q
number: 45938
html_url: https://github.com/huggingface/transformers/issues/45938
api_url: https://api.github.com/repos/huggingface/transformers/issues/45938
title: [DeepSeekV4] Compressor does not seem to account for padding tokens when forming compressed KV blocks
body: ## Description I noticed a potential padding-related issue in the current DeepSeekV4 implementation. The sliding-window attention path appears to correctly receive the user-provided `attention_mask` through `create_sliding_window_causal_mask`, so padded key/value tokens can be masked out in the normal attention path....
state: open
state_reason: null
locked: false
comments_count: 5
labels: []
assignees: []
created_at: 2026-05-13T08:06:41Z
updated_at: 2026-05-14T10:20:03Z
closed_at: null
author_association: NONE
milestone_title: null
snapshot_id: 20260514T120030Z
extracted_at: 2026-05-14T12:00:30Z
author_login: WKQ9411
author_id: 127908800
author_node_id: U_kgDOB5-7wA
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4440513009
github_node_id: I_kwDOCUB6oc8AAAABCKzZ8Q
number: 45950
html_url: https://github.com/huggingface/transformers/issues/45950
api_url: https://api.github.com/repos/huggingface/transformers/issues/45950
title: BAN
body: sorry for creating this issue but this is little unrelated Hey any maintainer here or mod here from discord , or if you know someone to connect .My account is banned from the discord server because my account was hacked and spammed ... is there anyone who can help me to unban . I am not able to find anyone
state: closed
state_reason: completed
locked: false
comments_count: 2
labels: []
assignees: []
created_at: 2026-05-13T19:09:56Z
updated_at: 2026-05-15T17:46:14Z
closed_at: 2026-05-15T17:46:14Z
author_association: NONE
milestone_title: null
snapshot_id: 20260515T180026Z
extracted_at: 2026-05-15T18:00:26Z
author_login: Priyabhunia
author_id: 116344684
author_node_id: U_kgDOBu9HbA
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 2126161751
github_node_id: I_kwDOCUB6oc5-uqdX
number: 28936
html_url: https://github.com/huggingface/transformers/issues/28936
api_url: https://api.github.com/repos/huggingface/transformers/issues/28936
title: [i18n-es] Translating docs to Spanish
body: Hi! Let's bring the documentation to all the Spanish-speaking community 🌐 Who would want to translate? Please follow the 🤗 [TRANSLATING guide](https://github.com/huggingface/transformers/blob/main/docs/TRANSLATING.md). Here is a list of the files ready for translation. Let us know in this issue if you'd like t...
state: open
state_reason: reopened
locked: false
comments_count: 11
labels: ["WIP"]
assignees: []
created_at: 2024-02-08T22:19:09Z
updated_at: 2026-05-14T20:07:26Z
closed_at: null
author_association: MEMBER
milestone_title: null
snapshot_id: 20260515T000050Z
extracted_at: 2026-05-15T00:00:50Z
author_login: stevhliu
author_id: 59462357
author_node_id: MDQ6VXNlcjU5NDYyMzU3
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4452020581
github_node_id: I_kwDOCUB6oc8AAAABCVxxZQ
number: 45987
html_url: https://github.com/huggingface/transformers/issues/45987
api_url: https://api.github.com/repos/huggingface/transformers/issues/45987
title: [Bug] StaticCache.get_seq_length() returns shape-(1,) Tensor despite -> int contract
body: ### System Info transformers : 5.7.0.dev0 (main, commit 84c2e2f487d6e792a8f1582f1cb1aa386b0d6133) Python : 3.13.7 Platform : Windows 11 AMD64 PyTorch : 2.11.0+cu126 ### Who can help? @ArthurZucker @zucchini-nlp ### Information - [ ] The official example scripts - [x] My own modified scripts ### Tas...
state: open
state_reason: null
locked: false
comments_count: 3
labels: ["bug"]
assignees: []
created_at: 2026-05-15T07:09:04Z
updated_at: 2026-05-15T15:28:44Z
closed_at: null
author_association: CONTRIBUTOR
milestone_title: null
snapshot_id: 20260515T180026Z
extracted_at: 2026-05-15T18:00:26Z
author_login: Abineshabee
author_id: 104718709
author_node_id: U_kgDOBj3hdQ
author_type: User
author_site_admin: false

repo: huggingface/transformers
github_id: 4453879765
github_node_id: I_kwDOCUB6oc8AAAABCXjP1Q
number: 45995
html_url: https://github.com/huggingface/transformers/issues/45995
api_url: https://api.github.com/repos/huggingface/transformers/issues/45995
title: docs: Add missing parameter documentation
body: ## Issue I noticed several functions are missing proper parameter documentation in the docstrings. ### Affected Areas - Missing type annotations on several public APIs - Undocumented return values in some utility functions - Outdated examples in README I can submit a PR to fix these if the maintainers are interested...
state: open
state_reason: null
locked: false
comments_count: 1
labels: []
assignees: []
created_at: 2026-05-15T12:26:21Z
updated_at: 2026-05-15T17:40:05Z
closed_at: null
author_association: NONE
milestone_title: null
snapshot_id: 20260515T180026Z
extracted_at: 2026-05-15T18:00:26Z
author_login: h0clam
author_id: 283303525
author_node_id: U_kgDOEOLeZQ
author_type: User
author_site_admin: false