---
license: cc-by-nc-sa-4.0
tags:
  - sentence-transformers
  - splade
  - sparse-encoder
  - code
pipeline_tag: feature-extraction
---

# SPLADE-Code-06B

SPLADE-Code-06B is a sparse retrieval model designed for code retrieval tasks. At the time of writing (February 2026), it is the top-performing model on MTEB among models below 1B parameters.

## Usage

### Using Sentence Transformers

First, install Sentence Transformers:

```bash
pip install sentence-transformers
```

Then load the model and encode queries:
```python
from sentence_transformers import SparseEncoder

model = SparseEncoder("naver/splade-code-06B", trust_remote_code=True)

queries = [
    "SELECT *\nFROM Student\nWHERE Age = (\nSELECT MAX(Age)\nFROM Student\nWHERE Group = 'specific_group'\n)\nAND Group = 'specific_group';"
]

query_embeddings = model.encode(queries)
print(query_embeddings.shape)
# torch.Size([1, 151936])

sparsity = model.sparsity(query_embeddings)
print(sparsity)
# {'active_dims': 1231.0, 'sparsity_ratio': 0.991897904380792}

decoded = model.decode(query_embeddings, top_k=10)
print(decoded)
# [[
#     ("Ġgroup", 2.34375),
#     ("Ġage", 2.34375),
#     ("ĠAge", 2.34375),
#     ("ĠStudent", 2.296875),
#     ("Ġspecific", 2.296875),
#     ("_group", 2.296875),
#     ("ĠMax", 2.21875),
#     ("Ġmax", 2.21875),
#     ("Ġstudent", 2.203125),
#     ("ĠGroup", 2.1875),
# ]]
```
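Sparse embeddings like these are compared with a plain dot product over the vocabulary axis, which is what makes them compatible with inverted-index retrieval. The following is an illustrative sketch with synthetic NumPy vectors, not outputs of the model itself; the toy 16-term vocabulary and all values are made up for demonstration:

```python
import numpy as np

# Synthetic stand-ins for sparse embeddings; the real model's vectors
# have 151,936 dimensions (the tokenizer vocabulary), mostly zero.
vocab_size = 16
query = np.zeros(vocab_size)
query[[2, 5, 9]] = [2.3, 1.8, 0.7]  # a few active query terms

docs = np.zeros((2, vocab_size))
docs[0, [2, 5]] = [1.9, 2.1]    # shares terms 2 and 5 with the query
docs[1, [11, 14]] = [2.5, 1.2]  # no term overlap with the query

# Relevance score = dot product over the vocabulary axis.
scores = docs @ query
print(scores)  # [8.15 0.  ]  -- overlapping doc scores high, the other zero

# The sparsity ratio reported above is the fraction of zero dimensions:
# for the real query embedding, 1 - 1231 / 151936 ≈ 0.99190.
print(1 - np.count_nonzero(query) / vocab_size)  # 0.8125
```

Because only a handful of dimensions are active, these dot products reduce to summing weights over the few overlapping terms, which is exactly the operation an inverted index performs efficiently.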