---
license: apache-2.0
library_name: diffusers
pipeline_tag: text-to-image
base_model:
- stabilityai/stable-diffusion-3.5-large
---
# Scale-wise Distillation SD3.5 Large

Scale-wise Distillation (SwD) is a framework for accelerating diffusion models (DMs) by progressively increasing the spatial resolution during the generation process. SwD achieves significant speedups (2.5× to 10×) over full-resolution models while maintaining or even improving image quality.
Project page: https://yandex-research.github.io/swd <br>
GitHub: https://github.com/yandex-research/swd <br>
Demo: https://huggingface.co/spaces/dbaranchuk/Scale-wise-Distillation
## Usage

Upgrade to the latest versions of [🧨 diffusers](https://github.com/huggingface/diffusers) and [🤗 peft](https://github.com/huggingface/peft):
```
pip install -U diffusers
pip install -U peft
```

and then run:
```py
import torch
from diffusers import StableDiffusion3Pipeline
from peft import PeftModel

# Load the base SD3.5 Large weights with the custom SwD sampling pipeline
pipe = StableDiffusion3Pipeline.from_pretrained(
    "stabilityai/stable-diffusion-3.5-large",
    torch_dtype=torch.float16,
    custom_pipeline="quickjkee/swd_pipeline",
).to("cuda")

# Attach the 4-step SwD LoRA adapter to the transformer
lora_path = "yresearch/swd-large-4-steps"
pipe.transformer = PeftModel.from_pretrained(
    pipe.transformer,
    lora_path,
)

prompt = "Cute winter dragon baby, kawaii, Pixar, ultra detailed, glacial background, extremely realistic."
# Noise levels for the 4 sampling steps (the trailing 0.0 marks the end of sampling)
sigmas = [1.0000, 0.8956, 0.7363, 0.6007, 0.0000]
# Latent resolution used at each step; the image is upscaled as sampling proceeds
scales = [64, 80, 96, 128]

image = pipe(
    prompt,
    sigmas=sigmas,
    timesteps=torch.tensor(sigmas[:-1], device="cuda") * 1000,
    scales=scales,
    guidance_scale=1.0,
    height=int(scales[0] * 8),
    width=int(scales[0] * 8),
).images[0]
```
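As a sanity check on the schedule above: each entry of `scales` is a latent-space resolution, and the snippet multiplies by 8 (the VAE's spatial downsampling factor, the same factor it applies to `height` and `width`) to get pixel sizes. A minimal sketch of the resulting pixel resolution at each of the four generation steps:

```python
# Map each latent scale in the SwD schedule to its pixel resolution,
# assuming the 8x VAE downsampling factor used in the snippet above.
VAE_FACTOR = 8

scales = [64, 80, 96, 128]
pixel_sizes = [s * VAE_FACTOR for s in scales]
print(pixel_sizes)  # [512, 640, 768, 1024]
```

So generation starts at 512×512 latents-decoded resolution and finishes at 1024×1024, which is why most of the denoising work happens at cheaper low resolutions.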
<p align="center">
<img src="large.jpg" width="512px"/>
</p>
## Citation

```bibtex
@inproceedings{starodubcev2026scalewise,
  title={Scale-wise Distillation of Diffusion Models},
  author={Nikita Starodubcev and Ilya Drobyshevskiy and Denis Kuznedelev and Artem Babenko and Dmitry Baranchuk},
  booktitle={The Fourteenth International Conference on Learning Representations},
  year={2026},
  url={https://openreview.net/forum?id=Z06LNjqU1g}
}
```