---
license: apache-2.0
datasets:
- KlingTeam/Emo-CFG
language:
- en
base_model:
- Qwen/Qwen2.5-VL-7B-Instruct
pipeline_tag: video-text-to-text
---
<div align=center>
<img src="assets/logo.png" width=15%>
<h1>VidEmo: Affective-Tree Reasoning for Emotion-Centric Video Foundation Models</h1>
<div class="is-size-5 publication-authors">
<span class="author-block">
<a href="https://zzcheng.top/" target="_blank">Zhicheng Zhang</a><sup>1,†</sup>,
</span>
<span class="author-block">
Weicheng Wang<sup>1</sup>,
</span>
<span class="author-block">
<a href="https://yongjie-zhu.github.io/" target="_blank">Yongjie Zhu</a><sup>3,‡</sup>,
</span>
<span class="author-block">
Wenyu Qin<sup>3</sup>,
</span>
<span class="author-block">
<a href="https://scholar.google.com/citations?user=P6MraaYAAAAJ&hl=en/" target="_blank">Pengfei Wan</a><sup>3</sup>,
</span>
<span class="author-block">
Di Zhang<sup>3</sup>,
</span>
<span class="author-block">
<a href="https://cv.nankai.edu.cn/" target="_blank">Jufeng Yang</a><sup>1,2,✉</sup>
</span>
</div>
<!-- Institution -->
<div class="is-size-5 publication-authors">
<sup>1</sup><span class="author-block">Nankai University</span>
<sup>2</sup><span class="author-block">Pengcheng Laboratory</span>
<sup>3</sup><span class="author-block">Kuaishou Technology</span>
</div>
<div class="is-size-5 publication-authors">
<sup>†</sup><span class="author-block">Work done at KlingAI</span>
<sup>‡</sup><span class="author-block">Project Leader</span>
<sup>✉</sup><span class="author-block">Corresponding Author</span>
</div>
**🎉 Accepted by [NeurIPS 2025](https://neurips.cc/virtual/2025/loc/san-diego/poster/115267) 🎉**
<a href="https://arxiv.org/abs/2511.02712" target="_blank">
<img alt="arXiv" src="https://img.shields.io/badge/arXiv-Kling--VidEmo-red?logo=arxiv" height="25" />
</a>
<a href="https://zzcheng.top/VidEmo" target="_blank">
<img alt="Website" src="https://img.shields.io/badge/🌐_Website-Homepage-blue.svg" height="25" />
</a>
<a href="https://github.com/KlingTeam/VidEmo" target="_blank">
<img alt="Github" src="https://img.shields.io/badge/⚙️_Github-Code-white.svg" height="25" />
</a>
<a href="https://github.com/nku-zhichengzhang/Awesome-emotion_llm_and_mllm" target="_blank">
<img alt="Awesome" src="https://awesome.re/badge.svg" height="25" />
</a>
<a href="https://zzcheng.top/assets/pdf/2025_NeurIPS_VidEmo_poster.pdf" target="_blank">
<img alt="Poster" src="https://img.shields.io/badge/📄-Poster-green.svg" height="25" />
</a>
<br>
<a href="https://huggingface.co/KlingTeam/VidEmo-3B" target="_blank">
<img alt="HF Model: VidEmo-3B" src="https://img.shields.io/badge/%F0%9F%A4%97%20_Model-Kling--VidEmo--3B-ffc107?color=ffc107&logoColor=white" height="25" />
</a>
<a href="https://huggingface.co/KlingTeam/VidEmo-7B" target="_blank">
<img alt="HF Model: VidEmo-7B" src="https://img.shields.io/badge/%F0%9F%A4%97%20_Model-Kling--VidEmo--7B-ffc107?color=ffc107&logoColor=white" height="25" />
</a>
<a href="https://huggingface.co/datasets/KlingTeam/Emo-CFG" target="_blank">
<img alt="HF Dataset: Emo-CFG 2.1M" src="https://img.shields.io/badge/%F0%9F%A4%97%20_Data-Emo--CFG--2.1M-ffc107?color=ffc107&logoColor=white" height="25" />
</a>