Update README.md

README.md CHANGED
@@ -4,4 +4,34 @@ license: apache-2.0

 https://github.com/mit-han-lab/ComfyUI-nunchaku/issues/340

-https://gist.github.com/akedia/e0a132b587e30413665d299ad893a60e
+https://gist.github.com/akedia/e0a132b587e30413665d299ad893a60e
+
+```
+Running convert_to_comfyui_lora.py
+🔍 Universal final_layer.adaLN LoRA patcher (.safetensors)
+Enter path to input LoRA .safetensors file: Flux_kontext_deblur.safetensors
+Enter path to save patched LoRA .safetensors file: flux_kontext_deblur_comfyui.safetensors
+
+✅ Loaded 610 tensors from: Flux_kontext_deblur.safetensors
+
+🔍 Found these final_layer-related keys:
+  lora_unet_final_layer_linear.lora_down.weight
+  lora_unet_final_layer_linear.lora_up.weight
+
+🔍 Checking for final_layer keys with prefix: 'lora_unet_final_layer'
+  Linear down: lora_unet_final_layer_linear.lora_down.weight
+  Linear up: lora_unet_final_layer_linear.lora_up.weight
+✅ Has final_layer.linear: True
+✅ Has final_layer.adaLN_modulation_1: False
+✅ Added dummy adaLN weights:
+  lora_unet_final_layer_adaLN_modulation_1.lora_down.weight (shape: torch.Size([16, 3072]))
+  lora_unet_final_layer_adaLN_modulation_1.lora_up.weight (shape: torch.Size([64, 16]))
+
+✅ Patched file saved to: flux_kontext_deblur_comfyui.safetensors
+  Total tensors now: 612
+
+🔍 Verifying patched keys:
+  lora_unet_final_layer_adaLN_modulation_1.lora_down.weight
+  lora_unet_final_layer_adaLN_modulation_1.lora_up.weight
+  lora_unet_final_layer_linear.lora_down.weight
+  lora_unet_final_layer_linear.lora_up.weight
+✅ Contains adaLN after patch: True
+```
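The transcript shows what the patcher does: when a LoRA has `final_layer.linear` weights but no `final_layer.adaLN_modulation_1` weights, it inserts a dummy down/up pair with the shapes printed above so loaders that expect the adaLN keys accept the file. A minimal sketch of that step, assuming zero-initialized dummy tensors (the function name and defaults here are illustrative, not taken from the original `convert_to_comfyui_lora.py`):

```python
# Sketch only: add dummy zero-valued final_layer adaLN LoRA weights.
# Zero weights change nothing numerically (up @ down == 0); they only
# satisfy a key-presence check in the loader. Names/defaults are assumptions.
import torch

ADALN_PREFIX = "lora_unet_final_layer_adaLN_modulation_1"

def add_dummy_adaln(tensors: dict, rank: int = 16,
                    hidden: int = 3072, out_features: int = 64) -> dict:
    """Add zero lora_down/lora_up tensors for final_layer.adaLN_modulation_1
    if they are missing, matching the shapes from the transcript:
    down = [rank, hidden] = [16, 3072], up = [out_features, rank] = [64, 16]."""
    down_key = f"{ADALN_PREFIX}.lora_down.weight"
    up_key = f"{ADALN_PREFIX}.lora_up.weight"
    if down_key not in tensors:
        tensors[down_key] = torch.zeros(rank, hidden)
        tensors[up_key] = torch.zeros(out_features, rank)
    return tensors
```

In practice the dict would come from `safetensors.torch.load_file(...)` and be written back with `safetensors.torch.save_file(...)`, which matches the 610 → 612 tensor count in the transcript (two keys added).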