---
license: mit
---
# LongD-CLIP

📄 **CVPR 2025**

This repository provides resources for our CVPR 2025 paper:  
**"Retaining Knowledge and Enhancing Long-Text Representations in CLIP through Dual-Teacher Distillation"**

---

## 🔍 Introduction

Our work focuses on **improving CLIP's ability to handle long-text inputs** while retaining its original knowledge.  
We propose a **Dual-Teacher Distillation** framework that:
- retains the knowledge of the original CLIP model, and
- enhances long-text representations through teacher guidance.

This work **extends the research line of [Long-CLIP](https://github.com/beichenzbc/Long-CLIP)** and further advances long-text representation learning in multimodal models.  
👉 The implementation is available in the [LongD-CLIP](https://github.com/yourname/LongD-CLIP) repository.
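
As a rough conceptual sketch of such a dual-teacher objective (this is **not** the paper's actual loss; the function names, the cosine-alignment terms, and the mixing weight `alpha` are illustrative assumptions):

```python
import numpy as np

def l2_normalize(x, axis=-1):
    """Normalize feature vectors to unit length."""
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def dual_teacher_distill_loss(student_short, student_long,
                              teacher_orig, teacher_long, alpha=0.5):
    """Illustrative dual-teacher distillation objective (hypothetical).

    - teacher_orig: features from the frozen original CLIP (knowledge retention)
    - teacher_long: features from a long-text-capable teacher (enhancement)
    Each term is a (1 - mean cosine similarity) alignment loss.
    """
    retain = 1.0 - np.mean(np.sum(l2_normalize(student_short) *
                                  l2_normalize(teacher_orig), axis=-1))
    enhance = 1.0 - np.mean(np.sum(l2_normalize(student_long) *
                                   l2_normalize(teacher_long), axis=-1))
    # alpha trades off retaining original CLIP knowledge vs. the long-text objective
    return alpha * retain + (1.0 - alpha) * enhance
```

When the student features exactly match both teachers, the loss is zero; during training, `alpha` would balance knowledge retention against long-text enhancement.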

---

## 🚀 Resources

- **Paper**: [CVPR 2025 proceedings](https://openaccess.thecvf.com/content/CVPR2025/papers/Feng_Retaining_Knowledge_and_Enhancing_Long-Text_Representations_in_CLIP_through_Dual-Teacher_CVPR_2025_paper.pdf)  
- **Model Weights**: [Hugging Face – LongD-CLIP](https://huggingface.co/BruceFeng98/LongD-CLIP/tree/main)  
- **Related Codebase**: [Long-CLIP](https://github.com/beichenzbc/Long-CLIP)

---