Just a heads up

#11
by Apolonia - opened

For anyone wondering why it doesn't work for you.....
I took the BF16 base & 8-step to give it a whirl, and was flummoxed as to why it would not run on the default wf, just noise.
I use Z-Image daily, have done since it dropped, so I tried the models in one of my other wf's...nothing but noise! Very strange indeed πŸ€”
Switched out all the default loaders for Kijai's diffusion loader [which I use religiously on Z-Image]. STILL, nothing but noise.
I'm stumped.....One last try before I cast your models into the pit of Hades 🀭 I switch weight_dtype from 'Default' to 'BF16' on Kijai's diffusion loader....
HEY PRESTO, Anime πŸ₯‚πŸ₯³
[attached example images: Leukothea_1029, Leukothea_1030, Leukothea_1032]

Now, I do not know if this is purely related to my specific machine, but I have never needed to explicitly set a model weight dtype or compute type for anything before, including Z-Image; 'Default' usually works as standard. All I know is that this is the ONLY way it will run on my machine, which is absolutely fine, and I'm just very happy I solved the issue.
Now looking forward to giving it a nice test run to see what loveliness it can produce.
I thought I'd share my experience for anyone stumped with the same issue.....It's a quick fix 🫑
No idea why it needs BF16 weight_dtype explicitly set, but if it's a recurring issue, perhaps you could tweak the default wf to save the confusionπŸ€·β€β™€οΈ

All the best
Apolonia πŸ’‹

Thank you very much for sharing this β€” that is actually extremely helpful.

For some reason, your ComfyUI setup probably tried to run the model in FP16 instead of BF16, and that is most likely where the noise came from.

Unfortunately, I cannot say exactly why this happened on your machine β€” whether it is related to the workflow config, the loader default behavior, ComfyUI start parameters, or something else. I honestly cannot fully explain it right now.

But your finding is very useful: setting weight_dtype explicitly to BF16 in Kijai’s diffusion loader seems to solve it on your setup.

Thank you again for the detailed report and for not throwing the model into the pit of Hades just yet πŸ˜„

I will keep this in mind and may adjust or note it in the default workflow if more users run into the same issue.

All the best,
SeeSee

As a follow-up, I cannot get Base running at all, no matter what settings I try. I fed the terminal output to Claude.
This was his reply. Hope it's helpful 🫑

What's Happening
This isn't technically an error that crashes ComfyUI β€” it's a weight key mismatch between what the model checkpoint contains and what ComfyUI's loader expects.
The Core Problem: Naming Convention Mismatch
The checkpoint (SeeSee21\z-anime-base-bf16.safetensors) uses a different attention layer naming scheme than what ComfyUI's UNet implementation expects.
| What ComfyUI expects | What the checkpoint has |
| --- | --- |
| attention.qkv.weight | attention.to_q.weight + attention.to_k.weight + attention.to_v.weight |
| attention.out.weight | attention.to_out.0.weight |
| attention.q_norm.weight | attention.norm_q.weight |
| attention.k_norm.weight | attention.norm_k.weight |
ComfyUI expects attention weights fused into a single qkv tensor, but this checkpoint stores them as separate q, k, v tensors (a Diffusers/HuggingFace-style convention).
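A toy illustration of the two layouts (editor's sketch; the shapes and the dim-0 concatenation are assumptions, the real fused layout may differ):

```python
import torch

# Diffusers/HF style: three separate projection matrices
q = torch.randn(64, 64)
k = torch.randn(64, 64)
v = torch.randn(64, 64)

# Fused style: one tensor holding all three, stacked along the output dim
qkv = torch.cat([q, k, v], dim=0)  # shape (192, 64)

# Same information, different key/tensor layout
assert torch.equal(qkv[:64], q)
```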
The Other Oddities

- all_x_embedder.2-1.* and all_final_layer.2-1.* — these are non-standard prefixed keys ComfyUI doesn't recognize at all, suggesting the model has a custom multi-output architecture that ComfyUI has no handler for
- x_embedder and final_layer keys are entirely missing from ComfyUI's perspective for the same reason

Why It Won't Run
ComfyUI loads what keys it can match and silently skips the rest. Because every single attention layer fails to map, the model is essentially loading an empty shell β€” it'll either produce pure noise or error during inference.
How to Fix It
Option 1 β€” Convert the checkpoint using a script that renames/fuses the keys to match ComfyUI's expected format (fuse to_q/to_k/to_v β†’ qkv, rename to_out.0 β†’ out, etc.)
Option 2 β€” Use a different frontend like the original repo's inference script if this is a HuggingFace Diffusers-format model
Option 3 β€” Check for a ComfyUI-native version β€” the model author may release a properly converted checkpoint
The BF16 dtype itself is fine; the problem is purely the weight naming/structure.
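For anyone wanting to attempt Option 1, here is a rough sketch of what such a conversion could look like. This is NOT an official script: the key names come straight from the mismatch table above, the fused layout (q/k/v concatenated along dim 0) is an assumption, and it does not handle the odd all_x_embedder.2-1.* keys at all. Verify against the real checkpoint before trusting the output.

```python
import re
import torch
# For a real checkpoint you would load/save with safetensors, e.g.
#   from safetensors.torch import load_file, save_file

def fuse_attention_keys(state_dict):
    """Rename/fuse Diffusers-style attention keys into the fused layout
    described above (assumption: q, k, v concatenated along dim 0)."""
    converted, pending = {}, {}
    for key, tensor in state_dict.items():
        m = re.match(r"(.*attention)\.to_([qkv])\.(weight|bias)$", key)
        if m:
            prefix, which, kind = m.groups()
            # collect the separate q/k/v tensors so they can be fused below
            pending.setdefault((prefix, kind), {})[which] = tensor
            continue
        # straightforward renames from the table above
        key = key.replace(".to_out.0.", ".out.")
        key = key.replace(".norm_q.", ".q_norm.")
        key = key.replace(".norm_k.", ".k_norm.")
        converted[key] = tensor
    for (prefix, kind), parts in pending.items():
        converted[f"{prefix}.qkv.{kind}"] = torch.cat(
            [parts["q"], parts["k"], parts["v"]], dim=0)
    return converted
```

Usage would be something like `save_file(fuse_attention_keys(load_file(src)), dst)`, but again, treat this as a starting point, not a drop-in fix.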

Apolonia πŸ’‹
