Dataset: huggan/smithsonian_butterflies_subset
How to use faverogian/Smithsonian128ControlNet with Diffusers:

```bash
pip install -U diffusers transformers accelerate
```
```python
import torch
from diffusers import DiffusionPipeline

# switch "cuda" to "mps" for Apple devices
pipe = DiffusionPipeline.from_pretrained(
    "faverogian/Smithsonian128ControlNet",
    torch_dtype=torch.bfloat16,
).to("cuda")

prompt = "Astronaut in a jungle, cold color palette, muted colors, detailed, 8k"
# as a ControlNet, the pipeline may also expect a Canny conditioning image;
# see the repository for full usage details
image = pipe(prompt).images[0]
```

A pre-trained ControlNet on the Smithsonian Butterflies 128x128 dataset (HuggingFace) using Canny images as conditioning.
An EMA model saved at the conclusion of training (1000 epochs) of an unconditional diffusion model on the Smithsonian Butterflies 128x128 dataset, trained using the Simple Diffusion paradigm.
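Since the model conditions on Canny-style edge images, a conditioning image must be prepared from a source photo before calling the pipeline. A minimal sketch of that step, assuming only NumPy and Pillow (a simple gradient-magnitude edge map stands in for a true Canny detector, and the helper name is illustrative, not from the repository):

```python
import numpy as np
from PIL import Image

def make_edge_conditioning(img: Image.Image, size=(128, 128), threshold=32) -> Image.Image:
    """Build a binary edge image to use as ControlNet conditioning.

    A gradient-magnitude edge map stands in for a full Canny detector
    (no smoothing, non-maximum suppression, or hysteresis).
    """
    gray = np.asarray(img.convert("L").resize(size), dtype=np.float32)
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    gx[:, 1:-1] = gray[:, 2:] - gray[:, :-2]   # horizontal central difference
    gy[1:-1, :] = gray[2:, :] - gray[:-2, :]   # vertical central difference
    mag = np.hypot(gx, gy)                      # gradient magnitude
    edges = (mag > threshold).astype(np.uint8) * 255
    return Image.fromarray(edges)               # single-channel "L" edge image

# usage: a synthetic test image with a sharp square edge
test = Image.fromarray(np.pad(np.full((64, 64), 255, np.uint8), 32))
cond = make_edge_conditioning(test)
```

The 128x128 output size matches the dataset resolution; the resulting image can be passed to the pipeline as the conditioning input per the repository's usage instructions.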
Full usage details can be found at the GitHub repository page.
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).