Update README.md
pipeline_tag: text-to-image
library_name: diffusers
---

# AMD Nitro-1
## Introduction
Nitro-1 is a series of efficient text-to-image generation models that are distilled from popular diffusion models on AMD Instinct™ GPUs. The release consists of:
* [Nitro-1-SD](https://huggingface.co/amd/SD2.1-Nitro): a UNet-based one-step model distilled from [Stable Diffusion 2.1](https://huggingface.co/stabilityai/stable-diffusion-2-1-base).
* [Nitro-1-PixArt](https://huggingface.co/amd/PixArt-Sigma-Nitro): a high-resolution, transformer-based one-step model distilled from [PixArt-Sigma](https://pixart-alpha.github.io/PixArt-sigma-project/).
⚡️ [Open-source code](https://github.com/AMD-AIG-AIMA/AMD-Diffusion-Distillation)! The models are based on our re-implementation of [Latent Adversarial Diffusion Distillation](https://arxiv.org/abs/2403.12015), the method used to build the popular Stable Diffusion 3 Turbo model. Since the original authors did not provide training code, we release our re-implementation to help advance further research in the field.
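For readers unfamiliar with the method, the core of latent adversarial diffusion distillation is a one-step student generator trained against a discriminator, rather than against a step-by-step denoising target. The toy sketch below illustrates only the adversarial loss structure with stand-in numpy "networks" — it is not the Nitro-1 training code (see the linked repository for that), and in the real method the discriminator operates on teacher-network features of re-noised latents rather than on raw samples:

```python
import numpy as np

# Toy illustration of an adversarial distillation objective.
# All networks below are placeholder functions, not real models.
rng = np.random.default_rng(0)

def student(noise):
    # One-step generator: maps pure noise directly to a "latent image".
    return np.tanh(noise)

def discriminator_logits(latents):
    # Placeholder discriminator; real LADD scores teacher features.
    return latents.mean(axis=1)

def hinge_d_loss(real_logits, fake_logits):
    # Hinge loss commonly used for the discriminator in adversarial distillation.
    return (np.mean(np.maximum(0.0, 1.0 - real_logits))
            + np.mean(np.maximum(0.0, 1.0 + fake_logits)))

noise = rng.standard_normal((4, 16))
teacher_samples = rng.standard_normal((4, 16))  # stand-in for teacher outputs

fake = student(noise)
d_loss = hinge_d_loss(discriminator_logits(teacher_samples),
                      discriminator_logits(fake))
adv_loss = -np.mean(discriminator_logits(fake))  # generator (student) loss
```

In training, `d_loss` updates the discriminator and `adv_loss` updates the student, so the student learns to match the teacher's output distribution in a single step.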
## Details

* **Model architecture**: Nitro-1-PixArt has the same architecture as PixArt-Sigma and is compatible with the diffusers pipeline.
* **Inference steps**: This model is distilled to perform inference in just a single step. However, the training code also supports distilling a model for 2, 4 or 8 steps.
* **Hardware**: We use a single node consisting of 4 AMD Instinct™ MI250 GPUs for distilling Nitro-1-PixArt.
* **Dataset**: We use 1M prompts from [DiffusionDB](https://huggingface.co/datasets/poloclub/diffusiondb) and generate the corresponding images from the base PixArt-Sigma model.
* **Training cost**: The distillation process achieves reasonable results in less than 2 days on a single node.
| Model | FID ↓ | CLIP ↑ | FLOPs | Latency on AMD Instinct MI250 (sec) |
| :---: | :---: | :---: | :---: | :---: |
| PixArt-Sigma, 20 steps | 34.14 | 0.3289 | 187.96 | 7.46 |
| **Nitro-1-PixArt**, 1 step | 37.75 | 0.3167 | 17.04 | 0.53 |
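The FLOPs and latency columns imply the one-step model is roughly 11× cheaper in compute and about 14× faster on MI250 than the 20-step teacher. A quick arithmetic check, with the values copied from the table:

```python
# Values from the benchmark table above.
teacher_flops, student_flops = 187.96, 17.04
teacher_latency, student_latency = 7.46, 0.53

flops_reduction = teacher_flops / student_flops    # compute ratio
speedup = teacher_latency / student_latency        # wall-clock ratio
print(f"FLOPs reduction: {flops_reduction:.1f}x, latency speedup: {speedup:.1f}x")
```

Note that the latency speedup slightly exceeds the FLOPs reduction, consistent with multi-step sampling carrying per-step overhead beyond raw compute.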