Instructions to use Qwen/Qwen3-Reranker-4B with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Qwen/Qwen3-Reranker-4B with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen3-Reranker-4B")
model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen3-Reranker-4B")
```
- sentence-transformers
How to use Qwen/Qwen3-Reranker-4B with sentence-transformers:
```python
from sentence_transformers import CrossEncoder

model = CrossEncoder("Qwen/Qwen3-Reranker-4B")

query = "Which planet is known as the Red Planet?"
passages = [
    "Venus is often called Earth's twin because of its similar size and proximity.",
    "Mars, known for its reddish appearance, is often referred to as the Red Planet.",
    "Jupiter, the largest planet in our solar system, has a prominent red spot.",
    "Saturn, famous for its rings, is sometimes mistaken for the Red Planet.",
]

# Score each (query, passage) pair; higher scores indicate higher relevance.
scores = model.predict([(query, passage) for passage in passages])
print(scores)
```
- Notebooks
- Google Colab
- Kaggle
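Once per-passage relevance scores are available (from either snippet above), reranking is just a descending sort by score. A minimal, model-free sketch; the score values below are illustrative stand-ins, not actual model output:

```python
# Passages from the example above.
passages = [
    "Venus is often called Earth's twin because of its similar size and proximity.",
    "Mars, known for its reddish appearance, is often referred to as the Red Planet.",
    "Jupiter, the largest planet in our solar system, has a prominent red spot.",
    "Saturn, famous for its rings, is sometimes mistaken for the Red Planet.",
]
# Made-up scores for illustration; in practice these come from model.predict(...).
scores = [0.02, 0.97, 0.34, 0.11]

# Pair each passage with its score, then sort descending to get the reranked order.
reranked = sorted(zip(scores, passages), key=lambda pair: pair[0], reverse=True)
for score, passage in reranked:
    print(f"{score:.2f}  {passage}")
```

In a retrieval pipeline, only the top-k entries of `reranked` would typically be kept and passed downstream.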
update model type and add paper link
#3 by thenlper

README.md CHANGED
````diff
@@ -3,6 +3,7 @@ license: apache-2.0
 base_model:
 - Qwen/Qwen3-4B-Base
 library_name: transformers
+pipeline_tag: text-ranking
 ---
 # Qwen3-Reranker-4B
 
@@ -237,11 +238,10 @@ destroy_model_parallel()
 If you find our work helpful, feel free to give us a cite.
 
 ```
-@
-
-
-
-
-year = {2025}
+@article{qwen3embedding,
+  title={Qwen3 Embedding: Advancing Text Embedding and Reranking Through Foundation Models},
+  author={Zhang, Yanzhao and Li, Mingxin and Long, Dingkun and Zhang, Xin and Lin, Huan and Yang, Baosong and Xie, Pengjun and Yang, An and Liu, Dayiheng and Lin, Junyang and Huang, Fei and Zhou, Jingren},
+  journal={arXiv preprint arXiv:2506.05176},
+  year={2025}
 }
 ```
````