---
license: apache-2.0
language:
  - en
pipeline_tag: text-to-image
tags:
  - comfyui
  - diffusion-single-file
base_model:
  - black-forest-labs/FLUX.2-klein-base-4B
base_model_relation: quantized
---

Note: It appears that the DF11 FLUX.2 [klein] models are bugged, often producing outputs that differ slightly from BF16 outputs. I have confirmed that the DF11 weights are 100% identical to the BF16 weights when decompressed, so this is likely an inference bug that I am still struggling to diagnose. In the meantime, I recommend using this model with the default KSampler node instead of the SamplerCustomAdvanced node. Also, if you have the VRAM to run this model in BF16, I recommend using the newly added DFloat11Decompressor node to load it instead. That node fully decompresses the DF11 weights into BF16 and outputs a model that is 100% identical to one loaded with the Load Diffusion Model node. I apologize for the inconvenience while I investigate this issue.
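Since DF11 is a lossless compression of BF16, the "weights are 100% identical when decompressed" claim can be checked with a bit-exact state-dict comparison. Below is a minimal sketch of such a check; the `state_dicts_bitwise_equal` helper and the dummy tensors are illustrative stand-ins, not part of the DFloat11 tooling — in practice you would pass the decompressed DF11 state dict and the original BF16 state dict.

```python
import torch

def state_dicts_bitwise_equal(sd_a, sd_b):
    """Return True iff both state dicts have identical keys and
    bit-exact tensor values (torch.equal uses no tolerance)."""
    if sd_a.keys() != sd_b.keys():
        return False
    return all(torch.equal(sd_a[k], sd_b[k]) for k in sd_a)

# Dummy tensors standing in for decompressed-DF11 vs. original BF16 weights.
sd_ref = {"w": torch.randn(4, 4).to(torch.bfloat16)}
sd_dec = {"w": sd_ref["w"].clone()}
print(state_dicts_bitwise_equal(sd_ref, sd_dec))  # True
```

Note that a check like this only validates the stored weights; it cannot catch the sampler-side discrepancy described above, which occurs at inference time.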