FLUX.2 [klein] 4B — mflux 4-bit quantized

4-bit quantized weights of FLUX.2 [klein] 4B by Black Forest Labs, optimized for mflux on Apple Silicon.

|           | Full precision    | This repo (4-bit)    |
|-----------|-------------------|----------------------|
| Size      | ~8 GB             | 4.3 GB               |
| Framework | diffusers / mflux | mflux only           |
| Hardware  | CUDA / MLX        | Apple Silicon (MLX)  |

Quickstart

Install mflux

pip install mflux

Generate an image

mflux-generate-flux2 \
  --model RunPod/FLUX.2-klein-4B-mflux-4bit \
  --prompt "A cute robot standing in a field of flowers, digital art" \
  --width 1024 \
  --height 1024 \
  --steps 4 \
  --seed 42 \
  --output output.png
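To compare results across seeds, the same CLI invocation can be templated in a small shell function. This is a hedged sketch that reuses only the flags shown in the Quickstart above; it prints one command per seed (pipe the output to `sh` to actually run them, which requires mflux to be installed):

```shell
# Hypothetical helper: build the mflux-generate-flux2 command for a given
# seed, using only the flags documented in the Quickstart above.
gen_cmd() {
  echo mflux-generate-flux2 \
    --model RunPod/FLUX.2-klein-4B-mflux-4bit \
    --prompt "'A cute robot standing in a field of flowers, digital art'" \
    --width 1024 --height 1024 --steps 4 \
    --seed "$1" \
    --output "robot_seed$1.png"
}

# Print one command per seed; pipe to `sh` to execute them.
for seed in 42 43 44; do
  gen_cmd "$seed"
done
```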

Python usage

from mflux import Flux2

flux = Flux2(
    model="RunPod/FLUX.2-klein-4B-mflux-4bit",
    base_model="flux2-klein-4b",
)

image = flux.generate_image(
    prompt="A cute robot standing in a field of flowers, digital art",
    width=1024,
    height=1024,
    num_inference_steps=4,
    seed=42,
)

image.save("output.png")

Details

  • Base model: black-forest-labs/FLUX.2-klein-4B (Apache 2.0)
  • Quantization: 4-bit via MLX nn.quantize (group_size=64), created with mflux-save --quantize 4
  • Requirements: mflux v0.16.0+, Apple Silicon Mac
  • Performance: ~11s for 512x512 (4 steps) on M3 Pro 18GB
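The quantization scheme above is group-wise: every run of 64 weights shares one scale and offset, and each weight is stored as a 4-bit code. The sketch below is an illustrative NumPy reimplementation of that idea, not the MLX `nn.quantize` kernel itself (MLX's exact packing and affine parameters may differ):

```python
import numpy as np

def quantize_4bit(w, group_size=64):
    """Group-wise 4-bit affine quantization (illustrative, not MLX's kernel).

    Each group of `group_size` weights gets its own scale/offset; values
    are mapped to integer codes in 0..15 (4 bits)."""
    groups = w.reshape(-1, group_size)
    lo = groups.min(axis=1, keepdims=True)
    hi = groups.max(axis=1, keepdims=True)
    scale = (hi - lo) / 15.0                 # 16 representable levels
    scale = np.where(scale == 0, 1.0, scale)  # guard constant groups
    q = np.round((groups - lo) / scale).astype(np.uint8)
    return q, scale, lo

def dequantize_4bit(q, scale, lo):
    """Reconstruct approximate float weights from codes + group params."""
    return (q.astype(np.float32) * scale + lo).reshape(-1)

rng = np.random.default_rng(0)
w = rng.normal(size=4096).astype(np.float32)
q, scale, lo = quantize_4bit(w)
w_hat = dequantize_4bit(q, scale, lo)
print(f"max code: {q.max()}, max reconstruction error: {np.abs(w - w_hat).max():.4f}")
```

The per-weight rounding error is bounded by half a group's scale, which is why smaller groups (like 64) trade a little extra storage for noticeably better fidelity than per-tensor quantization.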

How this was created

pip install mflux
mflux-save \
  --path ./FLUX.2-klein-4B-mflux-4bit \
  --model flux2-klein-4b \
  --quantize 4

About FLUX.2 [klein]

FLUX.2 [klein] 4B is a 4-billion-parameter rectified flow transformer by Black Forest Labs for fast image generation and editing. It delivers state-of-the-art quality with fast, few-step inference on consumer hardware.
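Few-step sampling works because rectified flow models follow near-straight probability paths from noise to data, so a coarse Euler solver loses little accuracy. The toy below is a hypothetical 1-D sketch where the velocity field is known in closed form rather than learned (a real model approximates it with the transformer); it shows a 4-step Euler integration landing on the endpoint when the path is straight:

```python
import numpy as np

# Rectified flow interpolates x_t = (1 - t) * x0 + t * x1 (noise -> data).
# Toy 1-D case with the conditional velocity toward a fixed endpoint known
# exactly: v(x, t) = (x1 - x) / (1 - t). In FLUX.2 [klein] this field is
# what the transformer learns to predict.
def sample(x0, x1, steps=4):
    x, t = float(x0), 0.0
    dt = 1.0 / steps
    for _ in range(steps):
        v = (x1 - x) / (1.0 - t)  # velocity along the straight path
        x = x + v * dt            # Euler step
        t += dt
    return x

x0 = 3.7    # "noise" sample
x1 = -1.2   # "data" endpoint
out = sample(x0, x1, steps=4)
print(out)  # converges to the endpoint despite only 4 solver steps
```

In the real model the learned velocity field is only approximately straight, so a few steps (here 4, matching the `--steps 4` default above) are still needed rather than one.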

  • Ultra-fast inference (4 steps)
  • Text-to-image and multi-reference image editing
  • Apache 2.0 — fully open for commercial use
  • Blog post | GitHub

License

Apache 2.0, inherited from the original model.
