The 2G model from the 2601 release throws an error when used
ModelPatchLoader

```
Error(s) in loading state_dict for ZImage_Control:
Missing key(s) in state_dict: "control_layers.3.attention.qkv.weight", "control_layers.3.attention.out.weight", "control_layers.3.attention.q_norm.weight", "control_layers.3.attention.k_norm.weight", "control_layers.3.feed_forward.w1.weight", "control_layers.3.feed_forward.w2.weight", "control_layers.3.feed_forward.w3.weight", "control_layers.3.attention_norm1.weight", "control_layers.3.ffn_norm1.weight", "control_layers.3.attention_norm2.weight", "control_layers.3.ffn_norm2.weight", "control_layers.3.adaLN_modulation.0.weight", "control_layers.3.adaLN_modulation.0.bias", "control_layers.3.after_proj.weight", "control_layers.3.after_proj.bias", "control_layers.4.attention.qkv.weight", "control_layers.4.attention.out.weight", "control_layers.4.attention.q_norm.weight", "control_layers.4.attention.k_norm.weight", "control_layers.4.feed_forward.w1.weight", "control_layers.4.feed_forward.w2.weight", "control_layers.4.feed_forward.w3.weight", "control_layers.4.attention_norm1.weight", "control_layers.4.ffn_norm1.weight", "control_layers.4.attention_norm2.weight", "control_layers.4.ffn_norm2.weight", "control_layers.4.adaLN_modulation.0.weight", "control_layers.4.adaLN_modulation.0.bias", "control_layers.4.after_proj.weight", "control_layers.4.after_proj.bias", "control_layers.5.attention.qkv.weight", "control_layers.5.attention.out.weight", "control_layers.5.attention.q_norm.weight", "control_layers.5.attention.k_norm.weight", "control_layers.5.feed_forward.w1.weight", "control_layers.5.feed_forward.w2.weight", "control_layers.5.feed_forward.w3.weight", "control_layers.5.attention_norm1.weight", "control_layers.5.ffn_norm1.weight", "control_layers.5.attention_norm2.weight", "control_layers.5.ffn_norm2.weight", "control_layers.5.adaLN_modulation.0.weight", "control_layers.5.adaLN_modulation.0.bias", "control_layers.5.after_proj.weight", "control_layers.5.after_proj.bias".
Unexpected key(s) in state_dict: "control_noise_refiner.0.after_proj.bias", "control_noise_refiner.0.after_proj.weight", "control_noise_refiner.0.before_proj.bias", "control_noise_refiner.0.before_proj.weight", "control_noise_refiner.1.after_proj.bias", "control_noise_refiner.1.after_proj.weight".
size mismatch for control_all_x_embedder.2-1.weight: copying a param with shape torch.Size([3840, 132]) from checkpoint, the shape in current model is torch.Size([3840, 64]).
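A pattern like this (missing `control_layers.3-5.*` keys, unexpected `control_noise_refiner.*` keys, plus an embedder shape mismatch) usually means the checkpoint was saved for a different model variant than the one the node is building. One way to confirm before filing a report is to diff the checkpoint's key set against the model's expected keys yourself. A minimal sketch of that diff, using abbreviated, hypothetical key sets in place of the real checkpoint (in practice you would read the keys from the file, e.g. via `safetensors.safe_open(...).keys()` or `torch.load(...).keys()`):

```python
# Hypothetical key sets standing in for the real model and checkpoint:
# the model expects control_layers.0..5, the checkpoint only carries 0..2
# plus extra control_noise_refiner keys.
expected = {f"control_layers.{i}.attention.qkv.weight" for i in range(6)}
checkpoint = {f"control_layers.{i}.attention.qkv.weight" for i in range(3)} | {
    "control_noise_refiner.0.after_proj.weight",
}

missing = sorted(expected - checkpoint)     # model wants these, file lacks them
unexpected = sorted(checkpoint - expected)  # file has these, model doesn't know them

print("missing:", missing)
print("unexpected:", unexpected)
```

If both lists are non-empty, as here, `strict=False` loading would not help: the checkpoint simply isn't the architecture the loader expects, and either the file or the loader code needs to change.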
I have the same error. I've seen people recommend updating ComfyUI and the custom nodes and switching to the nightly build, but that doesn't resolve the issue.
We might need ComfyUI to add support for this. I'll ask about it tomorrow when I have time.
Should Model 2601 be placed in Personalized_Model or model_patches?
