Commit 7e77ae8 · verified · committed by 0xZohar · 1 parent: d3beab3

Fix: Pin peft<0.18.0 to resolve transformers.modeling_layers dependency


Root cause:
- PEFT 0.18.0 imports from transformers.modeling_layers (line 26 in lora/model.py)
- transformers.modeling_layers module only exists in v4.50.0+ (introduced April 2025)
- Our constraint transformers>=4.46.0,<4.52.0 still permits 4.46-4.49, where the module does not exist at all, and even in 4.50-4.51 the module was unstable
- PEFT 0.18.0 was designed for transformers v5 (not yet released), creating a compatibility gap; a minimal reproduction follows this list
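A minimal sketch of the failure mode, assuming only what is stated above (PEFT 0.18.0 imports transformers.modeling_layers at load time); the probe below is illustrative, not code from either library:

    # Probe for the submodule that PEFT 0.18.0 imports at load time.
    # On transformers < 4.50 it is absent, which is exactly what raised
    # ModuleNotFoundError: No module named 'transformers.modeling_layers'
    import importlib.util

    import transformers

    if importlib.util.find_spec("transformers.modeling_layers") is None:
        print(f"transformers {transformers.__version__}: modeling_layers missing; "
              "importing peft==0.18.0 would fail here")
    else:
        print(f"transformers {transformers.__version__}: modeling_layers present")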

Solution:
- Pin peft>=0.7.0,<0.18.0 (excludes 0.18.0)
- PEFT 0.17.0 confirmed compatible with transformers 4.46-4.52 (no modeling_layers dependency)
- Maintains all existing constraints (torch 2.2.2, transformers<4.52, safetensors loading); a post-install check is sketched after this list
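As a post-install sanity check, the resolved versions can be asserted against the intended ranges; a hedged sketch, assuming the packaging distribution is available in the environment (pip vendors it, but it is not guaranteed everywhere):

    # Hypothetical sanity check: confirm pip honored both pins.
    from importlib.metadata import version
    from packaging.version import Version  # assumption: 'packaging' is installed

    peft_v = Version(version("peft"))
    tf_v = Version(version("transformers"))

    assert Version("0.7.0") <= peft_v < Version("0.18.0"), f"unexpected peft {peft_v}"
    assert Version("4.46.0") <= tf_v < Version("4.52.0"), f"unexpected transformers {tf_v}"
    print(f"peft {peft_v} / transformers {tf_v}: pins satisfied")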

Official documentation:
- PEFT 0.17.0 setup.py: No transformers version constraints (broad compatibility)
- PEFT 0.18.0 release notes: Designed for transformers v5
- transformers.modeling_layers: First appeared in v4.50.0, stable in v4.53.0+

Error fixed:
ModuleNotFoundError: No module named 'transformers.modeling_layers'

Changes:
- requirements.txt line 21: peft>=0.7.0 → peft>=0.7.0,<0.18.0 (plus an explanatory comment added above the pin)

Files changed (1)
  1. requirements.txt +2 -1
requirements.txt CHANGED
@@ -17,7 +17,8 @@ tokenizers>=0.15.0
 trimesh>=4.0.0
 
 # Deep Learning Utilities
-peft>=0.7.0
+# Pin peft<0.18.0 to avoid transformers.modeling_layers dependency (requires transformers>=4.50)
+peft>=0.7.0,<0.18.0
 safetensors>=0.4.0
 tqdm>=4.66.0
 warp-lang>=1.0.0