Diffusion Single File
comfyui

Black image output in FP16 compute dtype.

#7
by shiboishi - opened

Is it possible to run inference with fp16 compute dtype on ComfyUI? I'm getting a black image. Otherwise, generating on my old pre-3000-series RTX GPU is very slow because everything gets upcast to fp32.

This flag worked for me: "--fp16-unet"

Weird, still black output.
"RuntimeWarning: invalid value encountered in cast
img = Image.fromarray(np.clip(i, 0, 255).astype(np.uint8))"
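That warning is the tell: the decoded image array contains NaNs, `np.clip` passes NaNs through unchanged, and the `uint8` cast turns them into garbage (hence the black frame). A minimal sketch of the failure with NumPy, using illustrative values rather than a real VAE output:

```python
import numpy as np

# A clean decoded image, and one poisoned with NaN -- the typical
# result of an fp16 overflow somewhere upstream in the model.
clean = np.full((2, 2), 128.0, dtype=np.float32)
broken = clean.copy()
broken[0, 0] = np.nan

# np.clip does NOT remove NaN, so the uint8 cast is what emits
# "RuntimeWarning: invalid value encountered in cast".
assert np.isnan(np.clip(broken, 0, 255)).any()

# A cheap sanity check before handing the array to Image.fromarray:
def has_nan(img: np.ndarray) -> bool:
    return bool(np.isnan(img).any())

print(has_nan(broken), has_nan(clean))  # True False
```

If this check fires, the fix has to happen upstream (upcasting, a patched model, or bf16); clamping after the fact can't recover the pixels.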

Have you tried '--force-upcast-attention'?
Remove any other attention implementations like SageAttention, then try again.

Yeah, still black. FP16 doesn't work at all. If you're getting a proper image, I guess your GPU supports bf16. And it's kind of crazy that not using the GPU at all (--novram) is almost as slow as running on a GPU without bf16.
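The bf16-vs-fp16 distinction is the likely culprit: fp16's largest finite value is 65504, so any activation above that overflows to inf and subsequent arithmetic produces the NaNs that blank the image, while bf16 shares fp32's 8-bit exponent and keeps the same value finite. A small sketch with an illustrative activation value:

```python
import numpy as np

# fp16 tops out at 65504; bf16 and fp32 share an 8-bit exponent
# and reach ~3.4e38, so this value only overflows in fp16.
assert np.finfo(np.float16).max == 65504.0

big = np.float32(1e5)            # illustrative oversized activation
half = np.float16(big)           # overflows to inf in fp16
assert np.isinf(half)
assert np.isfinite(big)          # same value is fine in fp32 (and bf16)

# inf - inf is one way the NaNs that black out the frame are born:
print(half - half)  # nan
```

This is why pre-Ampere GPUs (no native bf16) hit the problem: the model's activations simply don't fit fp16's range without patching or upcasting.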

Someone made a patch about an hour ago that gets the model working in fp16 on ComfyUI; with it, the model runs about 2x slower than SDXL.

Any links?
EDIT: found it
https://civitai.com/models/2356447/anima-fp8
