Add metadata and improve model card

#1
by nielsr (HF Staff) - opened

Files changed (1): README.md (+21 -6)
README.md CHANGED
````diff
@@ -1,19 +1,21 @@
 ---
+base_model:
+- Qwen/Qwen2-VL-2B-Instruct
 datasets:
 - VLM2Vec/MMEB-V2
 language:
 - en
-base_model:
-- Qwen/Qwen2-VL-2B-Instruct
+library_name: transformers
+pipeline_tag: feature-extraction
 ---
+
 # PLUME-Qwen2-VL-2B
 
 **PLUME: Latent Reasoning Based Universal Multimodal Embedding**
 
-PLUME is a latent reasoning framework for universal multimodal embedding (UME). It replaces explicit chain-of-thought (CoT) generation with a short autoregressive rollout of continuous latent states, achieving stronger retrieval
-performance while delivering **over 30x faster inference** compared to explicit-CoT methods.
+PLUME is a latent reasoning framework for universal multimodal embedding (UME). It replaces explicit chain-of-thought (CoT) generation with a short autoregressive rollout of continuous latent states, achieving stronger retrieval performance while delivering **over 30x faster inference** compared to explicit-CoT methods.
 
-**[Project Page](https://haoxiangzhao12138.github.io/PLUME/)** | **[Paper](https://arxiv.org/abs/2507.00001)** | **[Code](https://github.com/haoxiangzhao12138/PLUME)**
+**[Project Page](https://haoxiangzhao12138.github.io/PLUME/)** | **[Paper](https://arxiv.org/abs/2604.02073)** | **[Code](https://github.com/haoxiangzhao12138/PLUME)**
 
 ## Highlights
 
@@ -43,5 +45,18 @@ huggingface-cli download CUDAOUTOFMEMORY/PLUME-Qwen2-VL-2B --local-dir /path/to/
 # Option 2: git clone (requires git-lfs)
 git lfs install
 git clone https://huggingface.co/CUDAOUTOFMEMORY/PLUME-Qwen2-VL-2B
-
+```
+
+## Citation
+
+```bibtex
+@misc{he2026plumelatentreasoningbased,
+  title={PLUME: Latent Reasoning Based Universal Multimodal Embedding},
+  author={Chenwei He and Xiangzhao Hao and Tianyu Yang and Yuxiang Ma and Yuheng Jia and Lingxiang Wu and Chaoyang Zhao and Haiyun Guo and Jinqiao Wang},
+  year={2026},
+  eprint={2604.02073},
+  archivePrefix={arXiv},
+  primaryClass={cs.CV},
+  url={https://arxiv.org/abs/2604.02073},
+}
 ```
````
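The added `pipeline_tag: feature-extraction` metadata marks this as an embedding model: it outputs vectors that are compared by similarity, rather than generated text. As a minimal, generic sketch of that downstream usage (this is not PLUME's actual pooling or the repo's API; `mean_pool` and `cosine_sim` are hypothetical helpers operating on dummy arrays), scoring candidates against a query embedding looks like:

```python
import numpy as np

def mean_pool(hidden_states: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Masked mean pooling: average token states where the mask is 1."""
    mask = attention_mask[..., None].astype(hidden_states.dtype)  # (batch, seq, 1)
    summed = (hidden_states * mask).sum(axis=1)                   # (batch, dim)
    counts = mask.sum(axis=1).clip(min=1e-9)                      # avoid divide-by-zero
    return summed / counts

def cosine_sim(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity between the row vectors of a and b."""
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return a @ b.T

# Dummy "hidden states" for one query and two candidates (seq_len=4, dim=3).
rng = np.random.default_rng(0)
states = rng.normal(size=(3, 4, 3))
mask = np.array([[1, 1, 1, 0], [1, 1, 0, 0], [1, 1, 1, 1]])

emb = mean_pool(states, mask)          # (3, 3) embeddings
scores = cosine_sim(emb[:1], emb[1:])  # query vs. candidates, shape (1, 2)
```

In a retrieval setting the candidate with the highest score is returned; the mask ensures padding tokens do not dilute the average.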