For GPUs with 12GB VRAM, try this setup

#1
by madtune - opened


This configuration uses device_map="balanced" with explicit per-device memory limits so FLUX can be sharded across 12GB GPUs without out-of-memory (OOM) errors. The max_memory dict below assumes two GPUs (indices 0 and 1), each capped at 11GB to leave headroom for activations.

import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "kpsss34/FHDR_Uncensored",
    device_map="balanced",              # shard the model across available devices
    max_memory={0: "11GB", 1: "11GB"},  # cap each GPU below 12GB to leave headroom
    torch_dtype=torch.bfloat16,
)
# pipe.enable_model_cpu_offload()  # single-GPU alternative; do not combine with device_map

prompt = "a women..."
image = pipe(
    prompt,
    height=1024,
    width=1024,
    guidance_scale=4.0,
    num_inference_steps=40,
    max_sequence_length=512,
    generator=torch.Generator("cpu").manual_seed(0)
).images[0]

image.save("outputs.png")
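If you only have a single 12GB GPU, device sharding won't help; a lower-VRAM alternative is sequential CPU offload, which keeps submodules on the CPU and moves them to the GPU one at a time. Below is a minimal sketch assuming diffusers with accelerate installed; the prompt string and output filename are placeholders, not from the original post.

```python
import torch
from diffusers import FluxPipeline

# Single-GPU variant: no device_map, rely on offloading instead.
pipe = FluxPipeline.from_pretrained(
    "kpsss34/FHDR_Uncensored",
    torch_dtype=torch.bfloat16,
)
# Sequential offload uses the least VRAM but is slower than
# model-level offload (enable_model_cpu_offload).
pipe.enable_sequential_cpu_offload()

image = pipe(
    "a mountain lake at sunrise",  # placeholder prompt
    height=1024,
    width=1024,
    guidance_scale=4.0,
    num_inference_steps=40,
    generator=torch.Generator("cpu").manual_seed(0),
).images[0]
image.save("output_single_gpu.png")
```

Expect noticeably longer generation times with sequential offload; it trades speed for fitting in VRAM.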

thx bro,
