Inference not working
I'm trying to use your model with the source code provided in the model card.
I'm running it in a Jupyter notebook in JupyterLab on an A6000 Ada GPU, with torch 2.6 and transformers 5.0.0.
```
File ~/.cache/huggingface/modules/transformers_modules/FlashLabs/Chroma_hyphen_4B/864b4aea0c1359f91af62f1367df64657dc5e90f/generation_chroma.py:118, in ChromaGenerationMixin.prepare_generation_config(self, generation_config, use_model_defaults, **kwargs)
    115 kwargs = {k: v for k, v in kwargs.items() if not k.startswith("decoder")}
    117 # initialize the generation config
--> 118 generation_config, model_kwargs = super()._prepare_generation_config(
    119     generation_config, use_model_defaults, **kwargs
    120 )
    121 self.decoder.generation_config.update(**depth_decoder_kwargs)
    123 # ensure the depth decoder generation config is valid

TypeError: GenerationMixin._prepare_generation_config() takes 2 positional arguments but 3 were given
```
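For context, this kind of TypeError occurs when a subclass forwards an argument positionally to a parent method that only accepts it as a keyword. A minimal, self-contained sketch (hypothetical stand-in classes, not the real transformers code) that reproduces the same error shape:

```python
# Hypothetical stand-ins for GenerationMixin / ChromaGenerationMixin,
# illustrating a positional-vs-keyword signature mismatch.

class Base:
    # Assumed newer-style signature: use_model_defaults is keyword-only.
    def _prepare_generation_config(self, generation_config, *, use_model_defaults=None, **kwargs):
        return generation_config, kwargs

class Derived(Base):
    def prepare_generation_config(self, generation_config, use_model_defaults, **kwargs):
        # Forwarding use_model_defaults positionally raises:
        # "takes 2 positional arguments but 3 were given"
        return super()._prepare_generation_config(generation_config, use_model_defaults, **kwargs)

d = Derived()
try:
    d.prepare_generation_config({"max_new_tokens": 8}, True)
except TypeError as e:
    print(e)

# Passing the argument by keyword instead works against the new signature:
cfg, extra = Base()._prepare_generation_config({"max_new_tokens": 8}, use_model_defaults=True)
print(cfg)
```

This is only an illustration of the failure mode; whether the shipped `generation_chroma.py` needs exactly this keyword fix depends on the transformers version installed.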
I've tested with torch 2.8 on a Pro 6000 GPU as well. Same result.
Hi! Thanks for reporting this issue.
I tested with a similar environment and it works fine on my end:
torch: 2.7.1
transformers: 5.0.0rc1
accelerate: 1.7.0
Could you try upgrading your packages to match these versions?
Please let me know if upgrading helps, or share your full environment info so we can investigate further!
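To make sharing environment info easier, here is a small standard-library-only sketch that reports installed package versions (the package names passed in are just the ones relevant to this thread):

```python
import importlib.metadata as md

def report_versions(packages):
    """Return a {package: version} mapping, tolerating missing packages."""
    versions = {}
    for name in packages:
        try:
            versions[name] = md.version(name)
        except md.PackageNotFoundError:
            versions[name] = "not installed"
    return versions

# Packages relevant to this thread:
print(report_versions(["torch", "transformers", "accelerate"]))
```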
I followed your recommendation, but it didn't change anything.
What did help was cloning your repository and replacing the Python files under
~/.cache/huggingface/modules/transformers_modules/FlashLabs/Chroma_hyphen_4B/864b4aea0c1359f91af62f1367df64657dc5e90f
with the ones from the chroma folder of your repository.
For a smooth experience, please make sure the code that gets installed automatically matches the repository.
To be precise, I ran

```
pip install "transformers==5.0.0rc1" "torch==2.7.1" "accelerate==1.7.0"  # same effect with transformers==5.0.0
pip install "av>=14.0.0" "librosa>=0.11.0" "audioread>=3.0.0" "soundfile>=0.13.0"
```

which pulled in the buggy generation_chroma.py; the identically named script in your GitHub repository works.