Can you please show an example.py to run this model?

#4
by gabbo1995 - opened

Can you please provide in the model card an example of how to run this model with diffusers/nunchaku?

Can we use the standard Flux Schnell pipeline? https://github.com/nunchaku-tech/nunchaku/blob/main/examples/v1/flux.1-schnell.py

like this?

```python
import torch
from diffusers import FluxPipeline

from nunchaku import NunchakuFluxTransformer2DModelV2
from nunchaku.utils import get_precision

precision = get_precision()  # auto-detects whether your GPU needs 'int4' or 'fp4'
transformer = NunchakuFluxTransformer2DModelV2.from_pretrained(
    f"nunchaku-tech/nunchaku-flux.1-schnell/svdq-{precision}_r32-flux.1-schnell.safetensors"
)
pipeline = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell", transformer=transformer, torch_dtype=torch.bfloat16
).to("cuda")
image = pipeline(
    "A cat holding a sign that says hello world",
    width=1024,
    height=1024,
    num_inference_steps=4,
    guidance_scale=0,
).images[0]
image.save(f"flux.1-schnell-{precision}.png")
```

QuantStack org
edited Nov 24

Hello, I don't use diffusers to run the model, just ComfyUI. But the code you provided looks correct, as this model can be used in the same way as the Flux model.
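Since the model loads through the same pipeline, the only tweak some users may need is memory offloading on smaller GPUs. A minimal sketch of that variant, assuming the same nunchaku-tech checkpoint as in the code above (replace the transformer path with this repo's .safetensors file if you are using this quant; this is untested here, not an official recipe):

```python
import torch
from diffusers import FluxPipeline

from nunchaku import NunchakuFluxTransformer2DModelV2
from nunchaku.utils import get_precision

precision = get_precision()  # 'int4' or 'fp4' depending on your GPU
transformer = NunchakuFluxTransformer2DModelV2.from_pretrained(
    f"nunchaku-tech/nunchaku-flux.1-schnell/svdq-{precision}_r32-flux.1-schnell.safetensors"
)
pipeline = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell", transformer=transformer, torch_dtype=torch.bfloat16
)
# Instead of pipeline.to("cuda"): stream weights between CPU and GPU
# as needed, trading speed for a much lower VRAM footprint.
pipeline.enable_sequential_cpu_offload()
```

`enable_sequential_cpu_offload()` is a standard diffusers method; after calling it you run the pipeline exactly as in the code above, without moving it to `"cuda"` yourself.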

Thank you very much!! I will try it; I was convinced it was somehow different from the standard nunchaku pipelines.
