Error with ip-adapter_pulidv1.1_sdxl_fp16.safetensors

#1
by brucehung - opened

Can anyone help with an error like this?

PulidModelLoader
Error(s) in loading state_dict for IDEncoder:
Missing key(s) in state_dict: "body.0.weight", "body.0.bias", "body.1.weight", "body.1.bias", "body.3.weight", "body.3.bias", "body.4.weight", "body.4.bias", "body.6.weight", "body.6.bias", "mapping_patch_0.0.weight", "mapping_patch_0.0.bias", "mapping_patch_0.1.weight", "mapping_patch_0.1.bias", "mapping_patch_0.3.weight", "mapping_patch_0.3.bias", "mapping_patch_0.4.weight", "mapping_patch_0.4.bias", "mapping_patch_0.6.weight", "mapping_patch_0.6.bias", "mapping_patch_1.0.weight", "mapping_patch_1.0.bias", "mapping_patch_1.1.weight", "mapping_patch_1.1.bias", "mapping_patch_1.3.weight", "mapping_patch_1.3.bias", "mapping_patch_1.4.weight", "mapping_patch_1.4.bias", "mapping_patch_1.6.weight", "mapping_patch_1.6.bias", "mapping_patch_2.0.weight", "mapping_patch_2.0.bias", "mapping_patch_2.1.weight", "mapping_patch_2.1.bias", "mapping_patch_2.3.weight", "mapping_patch_2.3.bias", "mapping_patch_2.4.weight", "mapping_patch_2.4.bias", "mapping_patch_2.6.weight", "mapping_patch_2.6.bias", "mapping_patch_3.0.weight", "mapping_patch_3.0.bias", "mapping_patch_3.1.weight", "mapping_patch_3.1.bias", "mapping_patch_3.3.weight", "mapping_patch_3.3.bias", "mapping_patch_3.4.weight", "mapping_patch_3.4.bias", "mapping_patch_3.6.weight", "mapping_patch_3.6.bias", "mapping_patch_4.0.weight", "mapping_patch_4.0.bias", "mapping_patch_4.1.weight", "mapping_patch_4.1.bias", "mapping_patch_4.3.weight", "mapping_patch_4.3.bias", "mapping_patch_4.4.weight", "mapping_patch_4.4.bias", "mapping_patch_4.6.weight", "mapping_patch_4.6.bias".
Unexpected key(s) in state_dict: "id_embedding_mapping.0.bias", "id_embedding_mapping.0.weight", "id_embedding_mapping.1.bias", "id_embedding_mapping.1.weight", "id_embedding_mapping.3.bias", "id_embedding_mapping.3.weight", "id_embedding_mapping.4.bias", "id_embedding_mapping.4.weight", "id_embedding_mapping.6.bias", "id_embedding_mapping.6.weight", "latents", "layers.0.0.norm1.bias", "layers.0.0.norm1.weight", "layers.0.0.norm2.bias", "layers.0.0.norm2.weight", "layers.0.0.to_kv.weight", "layers.0.0.to_out.weight", "layers.0.0.to_q.weight", "layers.0.1.0.bias", "layers.0.1.0.weight", "layers.0.1.1.weight", "layers.0.1.3.weight", "layers.1.0.norm1.bias", "layers.1.0.norm1.weight", "layers.1.0.norm2.bias", "layers.1.0.norm2.weight", "layers.1.0.to_kv.weight", "layers.1.0.to_out.weight", "layers.1.0.to_q.weight", "layers.1.1.0.bias", "layers.1.1.0.weight", "layers.1.1.1.weight", "layers.1.1.3.weight", "layers.2.0.norm1.bias", "layers.2.0.norm1.weight", "layers.2.0.norm2.bias", "layers.2.0.norm2.weight", "layers.2.0.to_kv.weight", "layers.2.0.to_out.weight", "layers.2.0.to_q.weight", "layers.2.1.0.bias", "layers.2.1.0.weight", "layers.2.1.1.weight", "layers.2.1.3.weight", "layers.3.0.norm1.bias", "layers.3.0.norm1.weight", "layers.3.0.norm2.bias", "layers.3.0.norm2.weight", "layers.3.0.to_kv.weight", "layers.3.0.to_out.weight", "layers.3.0.to_q.weight", "layers.3.1.0.bias", "layers.3.1.0.weight", "layers.3.1.1.weight", "layers.3.1.3.weight", "layers.4.0.norm1.bias", "layers.4.0.norm1.weight", "layers.4.0.norm2.bias", "layers.4.0.norm2.weight", "layers.4.0.to_kv.weight", "layers.4.0.to_out.weight", "layers.4.0.to_q.weight", "layers.4.1.0.bias", "layers.4.1.0.weight", "layers.4.1.1.weight", "layers.4.1.3.weight", "layers.5.0.norm1.bias", "layers.5.0.norm1.weight", "layers.5.0.norm2.bias", "layers.5.0.norm2.weight", "layers.5.0.to_kv.weight", "layers.5.0.to_out.weight", "layers.5.0.to_q.weight", "layers.5.1.0.bias", "layers.5.1.0.weight", "layers.5.1.1.weight", 
"layers.5.1.3.weight", "layers.6.0.norm1.bias", "layers.6.0.norm1.weight", "layers.6.0.norm2.bias", "layers.6.0.norm2.weight", "layers.6.0.to_kv.weight", "layers.6.0.to_out.weight", "layers.6.0.to_q.weight", "layers.6.1.0.bias", "layers.6.1.0.weight", "layers.6.1.1.weight", "layers.6.1.3.weight", "layers.7.0.norm1.bias", "layers.7.0.norm1.weight", "layers.7.0.norm2.bias", "layers.7.0.norm2.weight", "layers.7.0.to_kv.weight", "layers.7.0.to_out.weight", "layers.7.0.to_q.weight", "layers.7.1.0.bias", "layers.7.1.0.weight", "layers.7.1.1.weight", "layers.7.1.3.weight", "layers.8.0.norm1.bias", "layers.8.0.norm1.weight", "layers.8.0.norm2.bias", "layers.8.0.norm2.weight", "layers.8.0.to_kv.weight", "layers.8.0.to_out.weight", "layers.8.0.to_q.weight", "layers.8.1.0.bias", "layers.8.1.0.weight", "layers.8.1.1.weight", "layers.8.1.3.weight", "layers.9.0.norm1.bias", "layers.9.0.norm1.weight", "layers.9.0.norm2.bias", "layers.9.0.norm2.weight", "layers.9.0.to_kv.weight", "layers.9.0.to_out.weight", "layers.9.0.to_q.weight", "layers.9.1.0.bias", "layers.9.1.0.weight", "layers.9.1.1.weight", "layers.9.1.3.weight", "proj_out".
size mismatch for mapping_0.6.weight: copying a param with shape torch.Size([1024, 1024]) from checkpoint, the shape in current model is torch.Size([2048, 1024]).
size mismatch for mapping_0.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([2048]).
size mismatch for mapping_1.6.weight: copying a param with shape torch.Size([1024, 1024]) from checkpoint, the shape in current model is torch.Size([2048, 1024]).
size mismatch for mapping_1.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([2048]).
size mismatch for mapping_2.6.weight: copying a param with shape torch.Size([1024, 1024]) from checkpoint, the shape in current model is torch.Size([2048, 1024]).
size mismatch for mapping_2.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([2048]).
size mismatch for mapping_3.6.weight: copying a param with shape torch.Size([1024, 1024]) from checkpoint, the shape in current model is torch.Size([2048, 1024]).
size mismatch for mapping_3.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([2048]).
size mismatch for mapping_4.6.weight: copying a param with shape torch.Size([1024, 1024]) from checkpoint, the shape in current model is torch.Size([2048, 1024]).
size mismatch for mapping_4.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([2048]).
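For what it's worth, the key names in the traceback suggest the cause: the checkpoint contains the newer v1.1 IDEncoder layout (`id_embedding_mapping.*`, `layers.*`, `latents`, `proj_out`), while the loader is constructing the older architecture that expects `body.*` and `mapping_patch_*` keys. In other words, the PulidModelLoader node likely only supports the v1.0 checkpoint format and needs an update (or a v1.0 file) to load `ip-adapter_pulidv1.1_sdxl_fp16.safetensors`. A minimal sketch to check which layout a given file uses, based only on the key prefixes in the error above (the function name and the `safetensors` snippet in the comment are illustrative, not part of the node):

```python
def detect_pulid_layout(keys):
    """Guess the PuLID IDEncoder layout from a checkpoint's key names.

    The prefixes are taken from the error message above:
    v1.1 files carry `id_embedding_mapping.*` / `layers.*` keys,
    v1.0 files carry `body.*` / `mapping_patch_*` keys.
    """
    keys = set(keys)
    if any(k.startswith(("id_embedding_mapping.", "layers.")) for k in keys):
        return "v1.1"
    if any(k.startswith(("body.", "mapping_patch_")) for k in keys):
        return "v1.0"
    return "unknown"


# To get the key list from an actual .safetensors file you could use the
# safetensors library, e.g.:
#   from safetensors import safe_open
#   with safe_open("ip-adapter_pulidv1.1_sdxl_fp16.safetensors",
#                  framework="pt", device="cpu") as f:
#       print(detect_pulid_layout(f.keys()))
```

If the function reports "v1.1" but the loader error lists `body.*` among the missing keys, the node and the checkpoint are simply mismatched versions.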

brucehung changed discussion status to closed
