---
license: cc-by-nc-4.0
language:
- en
base_model:
- Qwen/Qwen3-4B
pipeline_tag: text-ranking
tags:
- finance
- legal
- code
- stem
- medical
library_name: sentence-transformers
---
<img src="https://i.imgur.com/oxvhvQu.png"/>
# Releasing zeroentropy/zerank-2
In search engines, [rerankers are crucial](https://www.zeroentropy.dev/blog/what-is-a-reranker-and-do-i-need-one) for improving the accuracy of your retrieval system.
However, SOTA rerankers are typically closed-source and proprietary. At ZeroEntropy, we've trained a SOTA reranker that outperforms these closed-source competitors, and we're launching it here on HuggingFace.
This reranker [outperforms proprietary rerankers](https://huggingface.co/zeroentropy/zerank-2#evaluations) such as `cohere-rerank-v3.5` and `gemini-2.5-flash` across a wide variety of domains, including finance, legal, code, STEM, medical, and conversational data.
At ZeroEntropy, we've developed an innovative multi-stage pipeline that models query-document relevance scores as adjusted [Elo ratings](https://en.wikipedia.org/wiki/Elo_rating_system). See our Technical Report (coming soon!) for more details.
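Since the report isn't out yet, the exact adjustment isn't public. As a rough illustration of the underlying idea only (not ZeroEntropy's actual parameterization), standard Elo converts pairwise "which document better answers this query?" judgments into scalar ratings:
```python
def elo_expected_score(rating_a: float, rating_b: float) -> float:
    """Standard Elo: probability that item A is preferred over item B."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

# A document rated 200 Elo points above another is preferred in roughly
# 76% of pairwise comparisons.
print(elo_expected_score(1400, 1200))  # ~0.76
```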
Since we're a small company, this model is released under a non-commercial license (CC BY-NC 4.0). If you'd like a commercial license, please contact us at founders@zeroentropy.dev and we'll get you a license ASAP.
## How to Use
```python
from sentence_transformers import CrossEncoder
model = CrossEncoder("zeroentropy/zerank-2", trust_remote_code=True)
query_documents = [
("What is 2+2?", "4"),
("What is 2+2?", "The answer is definitely 1 million"),
]
scores = model.predict(query_documents)
print(scores)
# [0.7531883 0.28894895]
```
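In practice, you usually want the documents back in ranked order rather than raw scores. Continuing the example above, one straightforward way:
```python
# Sort the (query, document) pairs by descending relevance score.
ranked = sorted(zip(scores, query_documents), key=lambda pair: pair[0], reverse=True)
for score, (query, document) in ranked:
    print(f"{score:.4f}  {document}")
```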
The model can also be run through ZeroEntropy's hosted [/models/rerank](https://docs.zeroentropy.dev/api-reference/models/rerank) endpoint (sketched below), and on [AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-o7avk66msiukc).
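A minimal sketch of calling the hosted endpoint follows. The base URL and payload field names here are assumptions based on typical rerank APIs; consult the linked API reference for the authoritative request and response schema.
```python
import os
import requests

# Hypothetical request shape; see the /models/rerank API reference for
# the exact field names and response format.
response = requests.post(
    "https://api.zeroentropy.dev/v1/models/rerank",  # assumed base URL
    headers={"Authorization": f"Bearer {os.environ['ZEROENTROPY_API_KEY']}"},
    json={
        "model": "zerank-2",
        "query": "What is 2+2?",
        "documents": ["4", "The answer is definitely 1 million"],
    },
)
print(response.json())
```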
## Evaluations
NDCG@10 scores for `zerank-2` against competing closed-source proprietary rerankers. Since we are evaluating rerankers rather than first-stage retrieval, OpenAI's `text-embedding-3-small` is used as the initial retriever to fetch the top 100 candidate documents, which each reranker then reorders. (A minimal NDCG@10 computation is sketched below the results.)
| Domain | OpenAI embeddings | ZeroEntropy zerank-2 | ZeroEntropy zerank-1 | Gemini 2.5 Flash (Listwise) | Cohere rerank-3.5 |
|------------------|-------------------|----------------------|----------------------|-----------------------------|-------------------|
| Web | 0.3819 | **0.6346** | 0.6069 | 0.5765 | 0.5594 |
| Conversational | 0.4305 | **0.6140** | 0.5801 | 0.6021 | 0.5648 |
| STEM & Logic | 0.3744 | **0.6521** | 0.6283 | 0.5447 | 0.5418 |
| Code | 0.4582 | **0.6528** | 0.6310 | 0.6128 | 0.5364 |
| Legal | 0.4101 | **0.6644** | 0.6222 | 0.5565 | 0.5257 |
| Biomedical | 0.4783 | **0.7217** | 0.6967 | 0.5371 | 0.6246 |
| Finance | 0.6232 | 0.7600 | 0.7539 | **0.7694** | 0.7402 |
| **Average**      | 0.4509            | **0.6714**           | 0.6456               | 0.5999                      | 0.5847            |
<img src="https://cdn-uploads.huggingface.co/production/uploads/65ec60ccfc59f6e77ecc9ccb/UiDp8LsY4XIdRK5i3CAdD.png" alt="Graph showing the same table" width="1000"/>
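For reference, NDCG@10 itself is simple to compute once a reranker has ordered the candidates. A minimal sketch (the relevance labels below are made-up placeholders, not benchmark data):
```python
import math

def ndcg_at_k(ranked_relevances: list[float], k: int = 10) -> float:
    """NDCG@k for relevance labels listed in the order the reranker produced."""
    def dcg(rels):
        return sum(rel / math.log2(i + 2) for i, rel in enumerate(rels))
    ideal = dcg(sorted(ranked_relevances, reverse=True)[:k])
    return dcg(ranked_relevances[:k]) / ideal if ideal > 0 else 0.0

# Placeholder labels: 2 = highly relevant, 1 = partially relevant, 0 = irrelevant.
print(ndcg_at_k([2, 0, 1, 0, 0]))  # one reranker's ordering, ~0.95
print(ndcg_at_k([0, 0, 2, 1, 0]))  # a worse ordering scores lower, ~0.54
```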