---
license: apache-2.0
language:
- en
base_model:
- Wan-AI/Wan2.1-T2V-1.3B
- Qwen/Qwen-Image-Edit-2509
pipeline_tag: any-to-any
tags:
- video
- image
- stylization
---
<div align="center">

<h1><b>TeleStyle: Content-Preserving Style Transfer in Images and Videos</b></h1>

</div>

<div align="center">
Shiwen Zhang, Xiaoyan Yang, Bojia Zi, Haibin Huang, Chi Zhang, Xuelong Li
  <p>
    Institute of Artificial Intelligence, China Telecom (TeleAI)&nbsp;&nbsp;
  </p>
</div>

<p align="center">
  <a href='https://tele-ai.github.io/TeleStyle/'><img src='https://img.shields.io/badge/Project-Page-Green'></a>
  &nbsp;
  <a href="http://arxiv.org/abs/2601.20175"><img src="https://img.shields.io/static/v1?label=Arxiv&message=TeleStyle&color=red&logo=arxiv"></a>
  &nbsp;
  <a href='https://huggingface.co/Tele-AI/TeleStyle'><img src='https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Model-orange'></a>
  &nbsp;
  <a href="https://github.com/Tele-AI/TeleStyle">
    <img src="https://img.shields.io/badge/GitHub-Code-red?logo=github">
  </a>
</p>


## 🔔 News

- [2026-01-28]: Released [Code](https://github.com/Tele-AI/TeleStyle), [Model](https://huggingface.co/Tele-AI/TeleStyle), [Paper](http://arxiv.org/abs/2601.20175).

## How to use

- Please refer to [**🔗 GitHub**](https://github.com/Tele-AI/TeleStyle) for usage.
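
As a starting point, the checkpoint can be fetched from this Hub repo with `huggingface_hub` (a minimal sketch; the `allow_patterns` filter here only pulls the small markdown files for a quick check — drop it to download the full weights, and see the GitHub repo for the actual inference code):

```python
# Fetch the TeleStyle repo from the Hugging Face Hub.
import os

from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="Tele-AI/TeleStyle",
    allow_patterns=["*.md"],  # demo only: restrict to small files; remove for full weights
)
print(local_dir)  # local cache directory holding the downloaded files
```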


## 🌟 Citation

If you find TeleStyle useful for your research and applications, please cite it using this BibTeX:

```bibtex
@article{teleai2026telestyle,
    title={TeleStyle: Content-Preserving Style Transfer in Images and Videos}, 
    author={Shiwen Zhang and Xiaoyan Yang and Bojia Zi and Haibin Huang and Chi Zhang and Xuelong Li},
    journal={arXiv preprint arXiv:2601.20175},
    year={2026}
}
```