---
license: apache-2.0
---
# FLUX-Makeup: High-Fidelity, Identity-Consistent, and Robust Makeup Transfer via Diffusion Transformer

[![arXiv](https://img.shields.io/badge/arXiv-2508.05069-B31B1B?style=flat)](https://arxiv.org/abs/2508.05069)

![examples](example.png)

We propose **FLUX-Makeup**, a high-fidelity, identity-consistent, and robust makeup transfer framework that eliminates the need for any auxiliary face-control components.

## 💪 Highlight Features

- **Core Strengths:** FLUX-Makeup delivers high-fidelity, identity-consistent, and robust makeup transfer from only a source image and a reference image, the most natural interaction format, without any additional facial-control modules.
- **Data Engine & HQMT Dataset:** We developed an extensible, filterable, and quality-controlled data-generation pipeline, and curated HQMT, a high-quality paired makeup dataset containing over 50,000 samples.
- **Architecture (Decoupled Feature Injection):** Through RefLoRAInjector, we define two sets of low-rank projections that precisely extract makeup-related information while effectively preventing identity collapse and background distortion.
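
As a rough illustration of the decoupled-injection idea (not the actual FLUX-Makeup code; the module name, gating scheme, and parameters below are invented for the sketch), a LoRA-style injector with two independent low-rank branches might look like:

```python
import torch
import torch.nn as nn

class DualLoRAInjector(nn.Module):
    """Sketch of a LoRA-style injector with two decoupled low-rank branches:
    one reads the reference features (makeup information), the other reads the
    source features (identity/background preservation). Illustrative only."""

    def __init__(self, dim: int, rank: int = 16):
        super().__init__()
        # Branch 1: low-rank projection intended to carry makeup information.
        self.makeup_down = nn.Linear(dim, rank, bias=False)
        self.makeup_up = nn.Linear(rank, dim, bias=False)
        # Branch 2: a second, independent low-rank projection for the source path.
        self.id_down = nn.Linear(dim, rank, bias=False)
        self.id_up = nn.Linear(rank, dim, bias=False)
        # Standard LoRA-style init: the up-projections start at zero, so the
        # injector is a no-op before training.
        nn.init.zeros_(self.makeup_up.weight)
        nn.init.zeros_(self.id_up.weight)

    def forward(self, hidden: torch.Tensor, ref: torch.Tensor) -> torch.Tensor:
        # The two low-rank updates are computed from different inputs,
        # keeping makeup and identity signals decoupled.
        return hidden + self.makeup_up(self.makeup_down(ref)) \
                      + self.id_up(self.id_down(hidden))

x = torch.randn(1, 77, 64)   # source tokens (toy sizes)
r = torch.randn(1, 77, 64)   # reference (makeup) tokens
out = DualLoRAInjector(dim=64)(x, r)
print(out.shape)  # torch.Size([1, 77, 64])
```

With the zero-initialized up-projections, the module initially returns the source features unchanged, so it can be dropped into a pretrained backbone without disturbing it.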

## 💡 Github
[FLUX-Makeup](https://github.com/360CVGroup/FLUX-Makeup)

## 🧩 Environment Setup

```bash
pip install -r requirements.txt
```

## 📂 Preparation of Pretrained Models

We provide pretrained weights for evaluation and deployment. Please download `checkpoint.pt` from [here](https://huggingface.co/qihoo360/FLUX-Makeup) and place it in the model directory. Likewise, download `79999_iter.pth` from the same repository and place it in the model directory.
In addition, download the pretrained weights of **FLUX.1-Kontext-dev** from [here](https://huggingface.co/black-forest-labs/FLUX.1-Kontext-dev) to serve as the backbone model.
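
For convenience, the downloads above can be scripted with `huggingface-cli` (a sketch; the `models` target directory is an assumption, so adjust the paths to wherever your setup expects the weights):

```shell
# Local target directory (the name "models" is an assumption).
mkdir -p models

# FLUX-Makeup weights and the face-parsing checkpoint.
huggingface-cli download qihoo360/FLUX-Makeup checkpoint.pt --local-dir models || true
huggingface-cli download qihoo360/FLUX-Makeup 79999_iter.pth --local-dir models || true

# FLUX.1-Kontext-dev backbone (full repository snapshot; may require
# accepting the license and logging in with `huggingface-cli login`).
huggingface-cli download black-forest-labs/FLUX.1-Kontext-dev \
  --local-dir models/FLUX.1-Kontext-dev || true
```

The trailing `|| true` lets the script continue past a failed download (e.g. before the gated backbone license is accepted) so you can retry individual files.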

## 📂 ComfyUI

A ComfyUI integration is available in `Flux_Makeup_ComfyUI`, where the model folders need to be placed in:

`Flux_Makeup_ComfyUI/models/stable_makeup`
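
Assuming the weights were downloaded to a local `models/` directory (an assumption; substitute your actual paths), copying them into place might look like:

```shell
# Create the folder the ComfyUI integration expects.
mkdir -p Flux_Makeup_ComfyUI/models/stable_makeup

# Copy the downloaded checkpoints into it.
# "|| true": skip silently if the weights are not present yet.
cp models/checkpoint.pt models/79999_iter.pth \
   Flux_Makeup_ComfyUI/models/stable_makeup/ 2>/dev/null || true
```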
## ⏳ Inference Pipeline

Here we provide the inference code for our FLUX-Makeup:

```bash
sh eval.sh
```


## 🌸 Acknowledgement

This code is mainly built upon the [Diffusers](https://github.com/huggingface/diffusers/tree/main), [Flux](https://github.com/huggingface/diffusers/tree/main/src/diffusers/pipelines/flux), [Stable-Makeup](https://github.com/Xiaojiu-z/Stable-Makeup), and [ComfyUI_Stable_Makeup](https://github.com/smthemex/ComfyUI_Stable_Makeup) repositories. Thanks so much for their solid work!

If you find this repository useful, please consider citing our paper:
```
@article{zhu2025flux,
  title={FLUX-Makeup: High-Fidelity, Identity-Consistent, and Robust Makeup Transfer via Diffusion Transformer},
  author={Zhu, Jian and Liu, Shanyuan and Li, Liuzhuozheng and Gong, Yue and Wang, He and Cheng, Bo and Ma, Yuhang and Wu, Liebucha and Wu, Xiaoyu and Leng, Dawei and others},
  journal={arXiv preprint arXiv:2508.05069},
  year={2025}
}
```