arxiv:2604.28142

Efficient Multivector Retrieval with Token-Aware Clustering and Hierarchical Indexing

Published on Apr 30
Abstract

TACHIOM is a multivector retrieval system that accelerates clustering and retrieval through token-level structure exploitation and graph-based indexing, achieving significant speedups over traditional k-means while maintaining effectiveness.

AI-generated summary

Multivector retrieval models achieve state-of-the-art effectiveness through fine-grained token-level representations, but their deployment incurs substantial computational and memory costs. Current solutions, based on the well-known k-means clustering algorithm, group similar vectors together to enable both effective compression and efficient retrieval. However, standard k-means scales poorly with the number of clusters and the dataset size, and favours frequent tokens during training while underrepresenting rare, discriminative ones. In this work, we introduce TACHIOM, a multivector retrieval system that exploits token-level structure to significantly accelerate both clustering and retrieval. By accounting for the distribution of tokens during centroid allocation, TACHIOM easily scales to millions of centroids, enabling highly accurate document scoring using centroids alone and avoiding expensive token-level computation. TACHIOM combines a graph-based index over centroids with an optimized Product Quantization layout for efficient final scoring. Experiments on MS-MARCOv1 and LoTTE show that TACHIOM achieves up to 247× faster clustering than k-means and up to 9.8× retrieval speedup over state-of-the-art systems while maintaining comparable or superior effectiveness.
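The summary notes that TACHIOM accounts for the distribution of tokens when allocating centroids, so that rare but discriminative tokens are not underrepresented as they are under plain k-means. The paper does not spell out the allocation rule here; the sketch below is only a hypothetical illustration of the idea, distributing a global centroid budget across tokens with a sublinear (square-root) weighting and a per-token floor. The function name, the weighting, and the floor are assumptions, not the authors' method.

```python
import numpy as np

def allocate_centroids(token_counts, total_centroids, min_per_token=1):
    """Distribute a global centroid budget across tokens.

    Hypothetical scheme: each token receives a share proportional to the
    square root of its frequency, damping the advantage of very frequent
    tokens, with a floor so rare tokens still get at least one centroid.
    """
    counts = np.asarray(token_counts, dtype=float)
    weights = np.sqrt(counts)                      # sublinear in frequency
    raw = weights / weights.sum() * total_centroids
    # Floor the fractional shares, then enforce the per-token minimum.
    alloc = np.maximum(np.floor(raw).astype(int), min_per_token)
    return alloc

# Example: one very frequent, one moderate, and one rare token.
alloc = allocate_centroids([10_000, 100, 1], total_centroids=100)
```

Under a frequency-proportional split the rare token would receive roughly 0.01 centroids and be dropped entirely; the sublinear weighting plus floor keeps it represented while the frequent token still dominates the budget.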

