---
license: apache-2.0
---
## Official Fix (2025-07-28)
The official fix is being tested: https://github.com/nunchaku-tech/nunchaku/pull/557
---
## ✅ Patched LoRAs
| Patched LoRAs | Original LoRAs |
|-----------------|----------------|
| [flux_kontext_deblur_nunchaku.safetensors](https://huggingface.co/lym00/comfyui_nunchaku_lora_patch/blob/main/flux_kontext_deblur_nunchaku.safetensors) | [civitai](https://civitai.com/models/1737381) or [civitaiarchive](https://civitaiarchive.com/models/1737381) |
| [flux_kontext_face_detailer_nunchaku.safetensors](https://huggingface.co/lym00/comfyui_nunchaku_lora_patch/blob/main/flux_kontext_face_detailer_nunchaku.safetensors) | [civitai](https://civitai.com/models/1752776) or [civitaiarchive](https://civitaiarchive.com/models/1752776) |
## ⚡ Usage
- These patched LoRAs are **compatible with** [ComfyUI-nunchaku](https://github.com/mit-han-lab/ComfyUI-nunchaku).
- Use the **Nunchaku FLUX LoRA Loader** node to load LoRA modules for **SVDQuant FLUX** models.
## 🛠️ Patch References
Some original FLUX LoRA files are missing the `final_layer.adaLN` weights that **ComfyUI-nunchaku's FLUX LoRA Loader** expects.
The patch script adds **dummy adaLN tensors** so these LoRAs load correctly with **SVDQuant FLUX** models.
**Script:** [patch_comfyui_nunchaku_lora.py](https://huggingface.co/lym00/comfyui_nunchaku_lora_patch/blob/main/patch_comfyui_nunchaku_lora.py)
**Based on:**
- **Nunchaku Issue:** [ComfyUI-nunchaku #340](https://github.com/mit-han-lab/ComfyUI-nunchaku/issues/340)
> Node Type: `NunchakuFluxLoraLoader`
> Exception Type: `KeyError`
> Exception Message: `'lora_unet_final_layer_adaLN_modulation_1.lora_down.weight'`
- **Example Gist:** [akedia/e0a132b5...](https://gist.github.com/akedia/e0a132b587e30413665d299ad893a60e)
---
## π Example Patch Log
```text
Running patch_comfyui_nunchaku_lora.py
🔧 Universal final_layer.adaLN LoRA patcher (.safetensors)
Enter path to input LoRA .safetensors file: Flux_kontext_deblur.safetensors
Enter path to save patched LoRA .safetensors file: flux_kontext_deblur_nunchaku.safetensors
✅ Loaded 610 tensors from: Flux_kontext_deblur.safetensors
🔍 Found final_layer-related keys:
  - lora_unet_final_layer_linear.lora_down.weight
  - lora_unet_final_layer_linear.lora_up.weight
🔍 Checking for final_layer keys with prefix 'lora_unet_final_layer'
  Linear down: lora_unet_final_layer_linear.lora_down.weight
  Linear up:   lora_unet_final_layer_linear.lora_up.weight
✅ Has final_layer.linear: True
✅ Has final_layer.adaLN_modulation_1: False
✅ Added dummy adaLN weights:
  - lora_unet_final_layer_adaLN_modulation_1.lora_down.weight (shape: torch.Size([16, 3072]))
  - lora_unet_final_layer_adaLN_modulation_1.lora_up.weight (shape: torch.Size([64, 16]))
✅ Patched file saved to: flux_kontext_deblur_nunchaku.safetensors
   Total tensors now: 612
🔍 Verifying patched keys:
  - lora_unet_final_layer_adaLN_modulation_1.lora_down.weight
  - lora_unet_final_layer_adaLN_modulation_1.lora_up.weight
  - lora_unet_final_layer_linear.lora_down.weight
  - lora_unet_final_layer_linear.lora_up.weight
✅ Contains adaLN after patch: True
```
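To reproduce the verification step on your own patched file, a small check like the following can be used. This is an assumed helper, not part of the official script; it mirrors the booleans printed in the log above.

```python
# Prefixes of the two final_layer LoRA weight groups checked by the log.
ADA_PREFIX = "lora_unet_final_layer_adaLN_modulation_1"
LIN_PREFIX = "lora_unet_final_layer_linear"

def check_final_layer(keys) -> dict:
    """Return the same booleans the patch log prints for a key list."""
    keys = set(keys)
    return {
        "has_linear": any(k.startswith(LIN_PREFIX) for k in keys),
        "has_adaLN": any(k.startswith(ADA_PREFIX) for k in keys),
    }

# Key names can be read from a .safetensors file without loading
# tensor data, e.g.:
# from safetensors import safe_open
# with safe_open("flux_kontext_deblur_nunchaku.safetensors", framework="pt") as f:
#     print(check_final_layer(f.keys()))
```

A patched file should report both booleans as `True`; an unpatched one reports `has_adaLN` as `False`, matching the "Contains adaLN after patch" line in the log.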