Instructions to use HighCWu/FLUX.1-dev-4bit with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Diffusers
How to use HighCWu/FLUX.1-dev-4bit with Diffusers:
pip install -U diffusers transformers accelerate
import torch
from diffusers import DiffusionPipeline

# switch to "mps" for apple devices
pipe = DiffusionPipeline.from_pretrained(
    "HighCWu/FLUX.1-dev-4bit",
    dtype=torch.bfloat16,
    device_map="cuda",
)

prompt = "Astronaut in a jungle, cold color palette, muted colors, detailed, 8k"
image = pipe(prompt).images[0]
- Notebooks
- Google Colab
- Kaggle
Checkpoint (#2), opened by MANOFAi94
Could you please make this into a checkpoint? At about 8 GB it's the perfect size. I need this because I want to run it on CPU on my phone, which has 12 GB of RAM. Is this already quantized, or does it get shrunk further? If so, please turn the quantized version into a checkpoint model. I've been waiting for a 4-bit FLUX.1 model, but this one you made requires a GPU. Please make a CPU version.