mingyi456 committed on
Commit
6b99db4
verified
1 Parent(s): 70f0b5c

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -10,4 +10,4 @@ base_model:
   - black-forest-labs/FLUX.2-klein-base-4B
  base_model_relation: quantized
  ---
- ## Note: It appears that the DF11 FLUX.2 [klein] models are bugged, often producing outputs that are slightly different from BF16 outputs. I have confirmed that the DF11 weights are 100% identical to the BF16 weights when decompressed, so this is likely a bug in inference that I am still struggling to diagnose. In the meantime, I recommend using this model with the default KSampler node instead of the CustomSamplerAdvanced node. Also, if you have the VRAM to run this model in BF16, I recommend using the newly added DFloat11Decompressor node to load this model instead. This node fully decompresses the DF11 weights into BF16 weights and outputs a model that is 100% identical to one loaded with the Load Diffusion Model node.
+ ## Note: It appears that the DF11 FLUX.2 [klein] models are bugged, often producing outputs that are slightly different from BF16 outputs. I have confirmed that the DF11 weights are 100% identical to the BF16 weights when decompressed, so this is likely a bug in inference that I am still struggling to diagnose. In the meantime, I recommend using this model with the default KSampler node instead of the CustomSamplerAdvanced node. Also, if you have the VRAM to run this model in BF16, I recommend using the newly added DFloat11Decompressor node to load this model instead. This node fully decompresses the DF11 weights into BF16 weights and outputs a model that is 100% identical to one loaded with the Load Diffusion Model node. I apologize for the inconvenience while I investigate this issue.
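The "100% identical when decompressed" claim in the note is a bit-level statement, not a numerical one. As a minimal sketch of how such a check can be done (the weight values and the roundtrip step here are stand-ins, not the actual DF11 pipeline; NumPy has no native bfloat16, so float16 is used purely to illustrate the bit-pattern comparison):

```python
import numpy as np

def bit_identical(a: np.ndarray, b: np.ndarray) -> bool:
    """True iff two float16 arrays have exactly the same bit patterns.
    Stricter than ==, which treats -0.0 == 0.0 and NaN != NaN."""
    if a.shape != b.shape or a.dtype != b.dtype:
        return False
    # Reinterpret the raw bits as integers and compare those.
    return bool(np.array_equal(a.view(np.uint16), b.view(np.uint16)))

# Stand-in for a real weight tensor (hypothetical values for illustration).
weights = np.array([0.0, -0.0, 1.5, np.nan], dtype=np.float16)
roundtrip = weights.copy()  # stand-in for decompress(compress(weights))

print(bit_identical(weights, roundtrip))  # True: lossless roundtrip
print(bit_identical(weights, -weights))   # False: -0.0 and 0.0 differ bitwise
```

A plain `==` comparison would miss sign-of-zero and NaN-payload differences, which is why comparing the reinterpreted integer bits is the right test for a lossless format.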