setup issues

#9
by edwixx - opened

I am running into transformers import issues even though I'm using the same version pinned in the requirements.

Error:

```
(workspace) root@dc90a6d6f2c5:/workspace/FlashLabs-Chroma# uv run python example.py
Traceback (most recent call last):
  File "/workspace/FlashLabs-Chroma/example.py", line 2, in <module>
    from chroma.modeling_chroma import ChromaForConditionalGeneration
  File "/workspace/FlashLabs-Chroma/chroma/modeling_chroma.py", line 22, in <module>
    from .configuration_chroma import ChromaConfig, ChromaDecoderConfig, ChromaBackboneConfig
  File "/workspace/FlashLabs-Chroma/chroma/configuration_chroma.py", line 21, in <module>
    from transformers.modeling_rope_utils import RopeParameters, RotaryEmbeddingConfigMixin
ImportError: cannot import name 'RopeParameters' from 'transformers.modeling_rope_utils' (/workspace/.venv/lib/python3.11/site-packages/transformers/modeling_rope_utils.py)
```

My packages:

```
(workspace) root@dc90a6d6f2c5:/workspace/FlashLabs-Chroma# uv pip list
Using Python 3.11.10 environment at: /workspace/.venv
Package Version

certifi 2026.1.4
charset-normalizer 3.4.4
cuda-bindings 12.9.4
cuda-pathfinder 1.3.3
filelock 3.20.3
fsspec 2026.1.0
hf-xet 1.2.0
huggingface-hub 0.36.0
idna 3.11
jinja2 3.1.6
markupsafe 3.0.3
mpmath 1.3.0
networkx 3.6.1
numpy 2.4.1
nvidia-cublas-cu12 12.8.4.1
nvidia-cuda-cupti-cu12 12.8.90
nvidia-cuda-nvrtc-cu12 12.8.93
nvidia-cuda-runtime-cu12 12.8.90
nvidia-cudnn-cu12 9.10.2.21
nvidia-cufft-cu12 11.3.3.83
nvidia-cufile-cu12 1.13.1.3
nvidia-curand-cu12 10.3.9.90
nvidia-cusolver-cu12 11.7.3.90
nvidia-cusparse-cu12 12.5.8.93
nvidia-cusparselt-cu12 0.7.1
nvidia-nccl-cu12 2.27.5
nvidia-nvjitlink-cu12 12.8.93
nvidia-nvshmem-cu12 3.4.5
nvidia-nvtx-cu12 12.8.90
packaging 26.0
pillow 12.1.0
pyyaml 6.0.3
regex 2026.1.15
requests 2.32.5
safetensors 0.7.0
sympy 1.14.0
tokenizers 0.21.4
torch 2.10.0
torchaudio 2.10.0
torchvision 0.25.0
tqdm 4.67.1
transformers 4.52.3
triton 3.6.0
typing-extensions 4.15.0
urllib3 2.6.3
(workspace) root@dc90a6d6f2c5:/workspace/FlashLabs-Chroma#
```

I have tried all the latest versions of transformers, and this error persists.

You can try transformers==5.0.0rc1; we tested it and it works.
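For anyone hitting the same error, here is a minimal stdlib-only sketch to check whether the installed transformers actually exposes `RopeParameters` (the symbol `chroma/configuration_chroma.py` imports). The `5.0.0rc1` pin in the hint is taken from the reply above; whether that exact version fixes your setup is an assumption:

```python
# Minimal sketch: report whether the installed transformers provides
# RopeParameters, the name whose absence causes the ImportError above.
# Pure stdlib; safe to run even when transformers is not installed.
import importlib.metadata
import importlib.util


def rope_parameters_available() -> bool:
    """True only if transformers.modeling_rope_utils defines RopeParameters."""
    if importlib.util.find_spec("transformers") is None:
        return False  # transformers is not installed at all
    try:
        from transformers.modeling_rope_utils import RopeParameters  # noqa: F401
    except ImportError:
        return False  # installed, but this version lacks RopeParameters
    return True


if __name__ == "__main__":
    if rope_parameters_available():
        print("OK, transformers", importlib.metadata.version("transformers"))
    else:
        # Suggested fix per the reply above (hedged: version pin assumed).
        print("transformers is missing RopeParameters; "
              "try: uv pip install 'transformers==5.0.0rc1'")
```

If this prints the "missing RopeParameters" message, reinstalling with the suggested pin and re-running `uv run python example.py` should resolve the traceback.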
