gguf quantized models of Flux-Krea for use in ComfyUI
how to use
- put the gguf model in `ComfyUI/models/diffusion_models`
- git clone the city96 ComfyUI-GGUF custom node into the `custom_nodes` folder of your ComfyUI install
- cd into `ComfyUI-GGUF` inside `custom_nodes` and run `pip install -r requirements.txt`
- load the model with the GGUF unet loader node in any flux workflow
warning: q4, and especially q2, versions may have noticeable quality loss compared to the full-precision model.
available quantizations: 2-bit, 4-bit, 6-bit, 8-bit