---
license: cc-by-4.0
annotations_creators:
- NIST
task_categories:
- text-retrieval
- text-ranking
language:
- en
- zh
multilinguality:
- multilingual
pretty_name: NeuCLIRTech
size_categories:
- n<1K
task_ids:
- document-retrieval
configs:
- config_name: queries
  default: true
  data_files:
  - split: eng
    path: data/eng.tsv
  - split: zho
    path: data/zho.tsv
  format: csv
  sep: "\t"
  header: null
  names: ["id", "query"]
  dataset_info:
    features:
    - name: id
      dtype: string
    - name: query
      dtype: string
- config_name: qrels
  default: false
  data_files: data/qrels.gains.txt
  format: csv
  sep: " "
  header: null
  names: ["id", "ignore", "docid", "relevance"]
  dataset_info:
    features:
    - name: id
      dtype: string
    - name: ignore
      dtype: string
    - name: docid
      dtype: string
    - name: relevance
      dtype: int64
---

# NeuCLIRTech Topics and Queries

NeuCLIRTech is an evaluation benchmark for monolingual and cross-language retrieval over technical documents.

The document collection can be found at [neuclir/csl](https://huggingface.co/datasets/neuclir/csl).

## Supporting Tasks and Corresponding Data

NeuCLIRTech supports two tasks: Chinese monolingual retrieval and English-Chinese cross-language retrieval.
The following specifies the documents, queries, and qrels (labels) that should be used for each task. 

Please report nDCG@20 for all tasks. 
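In practice, nDCG@20 is usually computed with standard tools such as `trec_eval` or `ir_measures`. As a reference only (not the official implementation), a minimal sketch of the metric itself:

```python
import math

def ndcg_at_k(ranked_gains, all_gains, k=20):
    """nDCG@k with the standard log2 discount.

    ranked_gains: relevance gains of documents in system ranking order.
    all_gains:    every judged gain for the topic (used for the ideal DCG).
    """
    def dcg(gains):
        # Rank 1 is discounted by log2(2), rank 2 by log2(3), etc.
        return sum(g / math.log2(rank + 2) for rank, g in enumerate(gains[:k]))

    ideal = dcg(sorted(all_gains, reverse=True))
    return dcg(ranked_gains) / ideal if ideal > 0 else 0.0
```

A perfect ranking scores 1.0; a ranking that retrieves no relevant documents scores 0.0.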

*We use `:` to indicate a subset within a dataset.*

| Task | Documents | Queries | Qrels |
| --- | --- | --- | --- |
| Monolingual    | `neuclir/csl:csl` | `zho` split of `neuclir/tech:queries` | `neuclir/tech:qrels` |
| Cross-Language | `neuclir/csl:csl` | `eng` split of `neuclir/tech:queries` | `neuclir/tech:qrels` |
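The qrels file uses the space-separated four-column layout declared in the YAML header above (`id ignore docid relevance`). A minimal sketch for reading it into a nested dict, assuming that layout:

```python
def read_qrels(path):
    """Read a TREC-style qrels file into {topic_id: {doc_id: gain}}.

    Each line is space-separated: topic_id, an ignored column, doc_id, gain.
    """
    qrels = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            topic_id, _ignored, doc_id, gain = line.split()
            qrels.setdefault(topic_id, {})[doc_id] = int(gain)
    return qrels
```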


## Baseline Retrieval Results and Run Files
We also provide all baseline retrieval results reported in the NeuCLIRBench paper.
Please refer to the paper for detailed descriptions of each model.

![image](https://cdn-uploads.huggingface.co/production/uploads/63a0c07a3c8841cfe2cd1e70/ywTDl59DIIs_9S2pSGdeU.png)


### Run Names
All runs are available in the `./runs` directory of this dataset.
Files follow the naming scheme `{run_handle}_{task:mono/clir/mlir}_{lang}.trec`; see the task section above for details.
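The `.trec` extension suggests the standard six-column TREC run format (`topic Q0 doc_id rank score tag`); that layout is an assumption here, so inspect a file before relying on it. A minimal reader sketch:

```python
def read_run(path):
    """Read a TREC run file into {topic_id: [doc_id, ...]}, best score first.

    Assumed space-separated columns: topic Q0 doc_id rank score tag.
    """
    run = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            topic_id, _q0, doc_id, _rank, score, _tag = line.split()
            run.setdefault(topic_id, []).append((float(score), doc_id))
    # Sort by score descending rather than trusting the rank column.
    return {topic: [doc for _, doc in sorted(pairs, reverse=True)]
            for topic, pairs in run.items()}
```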

| Run Handle        | Model Type         | Model Name            |
|:------------------|:-------------------|:----------------------|
| bm25              | Lexical            | BM25                  |
| bm25dt            | Lexical            | BM25 w/ DT            |
| bm25qt            | Lexical            | BM25 w/ QT            |
| milco             | Bi-Encoder         | MILCO                 |
| plaidx            | Bi-Encoder         | PLAID-X               |
| qwen8b            | Bi-Encoder         | Qwen3 8B Embed        |
| qwen4b            | Bi-Encoder         | Qwen3 4B Embed        |
| qwen600m          | Bi-Encoder         | Qwen3 0.6B Embed      |
| arctic            | Bi-Encoder         | Arctic-Embed Large v2 |
| splade            | Bi-Encoder         | SPLADEv3              |
| fusion3           | Bi-Encoder         | Fusion                |
| repllama          | Bi-Encoder         | RepLlama              |
| me5large          | Bi-Encoder         | e5 Large              |
| jinav3            | Bi-Encoder         | JinaV3                |
| bgem3sparse       | Bi-Encoder         | BGE-M3 Sparse         |
| mt5               | Pointwise Reranker | Mono-mT5XXL           |
| qwen3-0.6b-rerank | Pointwise Reranker | Qwen3 0.6B Rerank     |
| qwen3-4b-rerank   | Pointwise Reranker | Qwen3 4B Rerank       |
| qwen3-8b-rerank   | Pointwise Reranker | Qwen3 8B Rerank       |
| jina-rerank       | Pointwise Reranker | Jina Reranker         |
| searcher-rerank   | Pointwise Reranker | SEARCHER Reranker     |
| rank1             | Pointwise Reranker | Rank1                 |
| qwq               | Listwise Reranker  | Rank-K (QwQ)          |
| rankzephyr        | Listwise Reranker  | RankZephyr 7B         |
| firstqwen         | Listwise Reranker  | FIRST Qwen3 8B        |
| rq32b             | Listwise Reranker  | RankQwen-32B          |


## Citation
TBA