---
language:
- en
tags:
- sentence-transformers
- cross-encoder
- reranker
- generated_from_trainer
- dataset_size:10000
- loss:MSELoss
datasets:
- sentence-transformers/msmarco
pipeline_tag: text-ranking
library_name: sentence-transformers
metrics:
- map
- mrr@10
- ndcg@10
model-index:
- name: CrossEncoder
results:
- task:
type: cross-encoder-reranking
name: Cross Encoder Reranking
dataset:
name: NanoMSMARCO R100
type: NanoMSMARCO_R100
metrics:
- type: map
value: 0.0579
name: Map
- type: mrr@10
value: 0.0329
name: Mrr@10
- type: ndcg@10
value: 0.0479
name: Ndcg@10
- task:
type: cross-encoder-reranking
name: Cross Encoder Reranking
dataset:
name: NanoNFCorpus R100
type: NanoNFCorpus_R100
metrics:
- type: map
value: 0.2867
name: Map
- type: mrr@10
value: 0.4222
name: Mrr@10
- type: ndcg@10
value: 0.2546
name: Ndcg@10
- task:
type: cross-encoder-reranking
name: Cross Encoder Reranking
dataset:
name: NanoNQ R100
type: NanoNQ_R100
metrics:
- type: map
value: 0.0326
name: Map
- type: mrr@10
value: 0.01
name: Mrr@10
- type: ndcg@10
value: 0.0229
name: Ndcg@10
- task:
type: cross-encoder-nano-beir
name: Cross Encoder Nano BEIR
dataset:
name: NanoBEIR R100 mean
type: NanoBEIR_R100_mean
metrics:
- type: map
value: 0.1257
name: Map
- type: mrr@10
value: 0.155
name: Mrr@10
- type: ndcg@10
value: 0.1084
name: Ndcg@10
---
# CrossEncoder
This is a [Cross Encoder](https://www.sbert.net/docs/cross_encoder/usage/usage.html) model trained on the [msmarco](https://huggingface.co/datasets/sentence-transformers/msmarco) dataset using the [sentence-transformers](https://www.SBERT.net) library. It computes scores for pairs of texts, which can be used for text reranking and semantic search.
## Model Details
### Model Description
- **Model Type:** Cross Encoder
- **Maximum Sequence Length:** 512 tokens
- **Number of Output Labels:** 1 label
- **Training Dataset:**
- [msmarco](https://huggingface.co/datasets/sentence-transformers/msmarco)
- **Language:** en
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Documentation:** [Cross Encoder Documentation](https://www.sbert.net/docs/cross_encoder/usage/usage.html)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Cross Encoders on Hugging Face](https://huggingface.co/models?library=sentence-transformers&other=cross-encoder)
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import CrossEncoder
# Download from the 🤗 Hub
model = CrossEncoder("kselight/123BERT")
# Get scores for pairs of texts
pairs = [
['what is ivana trump', 'The need for an independent investigation. As it stands, all three men in charge of the investigations into the Trump campaign are Republicans, and two of the three are vociferous Trump allies. Burr, the third, also tied himself to Trump during his close 2016 reelection campaign.'],
["hogan's goat meaning", 'hogan’s goat. The phrase like Hogan’s goat refers to something that is faulty, messed up, or stinks like a goat. The phrase is a reference to R.F. Outcault’s seminal newspaper comic Hogan’s Alley, which debuted in 1895. The title of the strip changed to The Yellow Kid the following year.'],
['who made tokyo ghoul', "Tokyo Ghoul (Japanese: 東京喰種（トーキョーグール）, Hepburn: Tōkyō Gūru) is a Japanese manga series by Sui Ishida. It was serialized in Shueisha's seinen manga magazine Weekly Young Jump between September 2011 and September 2014 and has been collected in fourteen tankōbon volumes as of August 2014."],
['neck of the scottie dog', 'Classical guitars. The classical guitar neck blank is relatively small compared to what is needed for construction. This is because a classical neck is constructed differently than most other neck designs. The heel of the neck is built up by stacking blocks of wood to achieve the necessary height.'],
['what does bicameral mean in government', 'Top 10 amazing movie makeup transformations. In government, bicameralism is the practice of having two legislative or parliamentary chambers. The relationship between the two chambers of a bicameral legislature can vary. In some cases, they have equal power, and in others, one chamber is clearly superior to the other. It is commonplace in most federal systems to have a bicameral legislature.'],
]
scores = model.predict(pairs)
print(scores.shape)
# (5,)
# Or rank different texts based on similarity to a single text
ranks = model.rank(
'what is ivana trump',
[
'The need for an independent investigation. As it stands, all three men in charge of the investigations into the Trump campaign are Republicans, and two of the three are vociferous Trump allies. Burr, the third, also tied himself to Trump during his close 2016 reelection campaign.',
'hogan’s goat. The phrase like Hogan’s goat refers to something that is faulty, messed up, or stinks like a goat. The phrase is a reference to R.F. Outcault’s seminal newspaper comic Hogan’s Alley, which debuted in 1895. The title of the strip changed to The Yellow Kid the following year.',
"Tokyo Ghoul (Japanese: 東京喰種（トーキョーグール）, Hepburn: Tōkyō Gūru) is a Japanese manga series by Sui Ishida. It was serialized in Shueisha's seinen manga magazine Weekly Young Jump between September 2011 and September 2014 and has been collected in fourteen tankōbon volumes as of August 2014.",
'Classical guitars. The classical guitar neck blank is relatively small compared to what is needed for construction. This is because a classical neck is constructed differently than most other neck designs. The heel of the neck is built up by stacking blocks of wood to achieve the necessary height.',
'Top 10 amazing movie makeup transformations. In government, bicameralism is the practice of having two legislative or parliamentary chambers. The relationship between the two chambers of a bicameral legislature can vary. In some cases, they have equal power, and in others, one chamber is clearly superior to the other. It is commonplace in most federal systems to have a bicameral legislature.',
]
)
# [{'corpus_id': ..., 'score': ...}, {'corpus_id': ..., 'score': ...}, ...]
```
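Conceptually, `rank()` is `predict()` on (query, document) pairs followed by a sort on the scores. A minimal sketch of that logic, using a hypothetical stand-in scorer (`overlap_score` is an illustrative placeholder, not the model):

```python
# Sketch of what rank() does: score each (query, document) pair,
# then return documents ordered by descending score.
# score_fn stands in for model.predict on a single pair.
def rank(query, documents, score_fn):
    scores = [score_fn(query, doc) for doc in documents]
    order = sorted(range(len(documents)), key=lambda i: scores[i], reverse=True)
    return [{"corpus_id": i, "score": scores[i]} for i in order]

def overlap_score(query, doc):
    # Toy scorer: number of shared lowercase tokens.
    return len(set(query.lower().split()) & set(doc.lower().split()))

results = rank(
    "what is ivana trump",
    ["Ivana Trump is a businesswoman.", "Classical guitars have small necks."],
    overlap_score,
)
# results[0] is the best-matching document's corpus_id and score
```

The real model replaces the toy scorer with a learned relevance score from the full query-document pair.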
## Evaluation
### Metrics
#### Cross Encoder Reranking
* Datasets: `NanoMSMARCO_R100`, `NanoNFCorpus_R100` and `NanoNQ_R100`
* Evaluated with [CrossEncoderRerankingEvaluator](https://sbert.net/docs/package_reference/cross_encoder/evaluation.html#sentence_transformers.cross_encoder.evaluation.CrossEncoderRerankingEvaluator) with these parameters:
```json
{
"at_k": 10,
"always_rerank_positives": true
}
```
| Metric | NanoMSMARCO_R100 | NanoNFCorpus_R100 | NanoNQ_R100 |
|:------------|:---------------------|:---------------------|:---------------------|
| map | 0.0579 (-0.4317) | 0.2867 (+0.0257) | 0.0326 (-0.3870) |
| mrr@10 | 0.0329 (-0.4446) | 0.4222 (-0.0777) | 0.0100 (-0.4167) |
| **ndcg@10** | **0.0479 (-0.4925)** | **0.2546 (-0.0705)** | **0.0229 (-0.4778)** |
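The reported metrics follow the standard rank-based definitions: mrr@10 is the reciprocal rank of the first relevant document within the top 10, and ndcg@10 is the discounted cumulative gain normalized by the ideal ordering. A self-contained sketch over a single ranked list of relevance labels:

```python
import math

# rel[i] is the relevance label of the document at rank i+1.
def mrr_at_k(rel, k=10):
    # Reciprocal rank of the first relevant hit in the top k, else 0.
    for i, r in enumerate(rel[:k]):
        if r > 0:
            return 1.0 / (i + 1)
    return 0.0

def ndcg_at_k(rel, k=10):
    # DCG discounts gains logarithmically by rank; NDCG divides by the
    # DCG of the ideal (descending-relevance) ordering.
    dcg = sum(r / math.log2(i + 2) for i, r in enumerate(rel[:k]))
    ideal = sorted(rel, reverse=True)
    idcg = sum(r / math.log2(i + 2) for i, r in enumerate(ideal[:k]))
    return dcg / idcg if idcg > 0 else 0.0

print(mrr_at_k([0, 1, 0]))   # 0.5
print(ndcg_at_k([1, 0, 0]))  # 1.0
```

The table's mean metrics are these values averaged over all queries in each dataset.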
#### Cross Encoder Nano BEIR
* Dataset: `NanoBEIR_R100_mean`
* Evaluated with [CrossEncoderNanoBEIREvaluator](https://sbert.net/docs/package_reference/cross_encoder/evaluation.html#sentence_transformers.cross_encoder.evaluation.CrossEncoderNanoBEIREvaluator) with these parameters:
```json
{
"dataset_names": [
"msmarco",
"nfcorpus",
"nq"
],
"rerank_k": 100,
"at_k": 10,
"always_rerank_positives": true
}
```
| Metric | Value |
|:------------|:---------------------|
| map | 0.1257 (-0.2643) |
| mrr@10 | 0.1550 (-0.3130) |
| **ndcg@10** | **0.1084 (-0.3469)** |
## Training Details
### Training Dataset
#### msmarco
* Dataset: [msmarco](https://huggingface.co/datasets/sentence-transformers/msmarco) at [9e329ed](https://huggingface.co/datasets/sentence-transformers/msmarco/tree/9e329ed2e649c9d37b0d91dd6b764ff6fe671d83)
* Size: 10,000 training samples
* Columns: score, query, and passage
* Approximate statistics based on the first 1000 samples:
  | | score | query | passage |
  |:--------|:------|:-------|:--------|
  | type | float | string | string |
* Samples:
  | score | query | passage |
  |:------|:------|:--------|
  | 6.720487356185913 | modern definition of democracy | Links. A Short Definition of Democracy U.S. president Abraham Lincoln (1809-1865) defined democracy as: «Government of the people, by the people, for the people» Democracy is by far the most challenging form of government-both for politicians and for the people.The term democracy comes from the Greek language and means rule by the (simple) people. The so-called democracies in classical antiquity (Athens and Rome) represent precursors of modern democracies.Like modern democracy, they were created as a reaction to a concentration and abuse of power by the rulers.he term democracy comes from the Greek language and means rule by the (simple) people. The so-called democracies in classical antiquity (Athens and Rome) represent precursors of modern democracies. |
  | 1.6529417037963867 | is celexa and fluoxetine same | Celexa (citalopram hydrobromide) is a type of antidepressant called a selective serotonin reuptake inhibitor (SSRI) indicated for the treatment of depression. Celexa is available in generic form. Common side effects of Celexa include. constipation, nausea, diarrhea, upset stomach, decreased sexual desire, |
  | -9.121654828389486 | what are 2 examples of nonpoint pollution | Concept of pollution tax. All such measures are compensatory in nature and it is not called pollution tax. The concept of pollution tax is something different. It entails that instead of doing offsetting work by yourself wherever you hurt environment either willfully or without any intention you have to pay for it. |
* Loss: [MSELoss](https://sbert.net/docs/package_reference/cross_encoder/losses.html#mseloss) with these parameters:
```json
{
"activation_fn": "torch.nn.modules.linear.Identity"
}
```
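With an Identity activation, MSELoss regresses the model's raw logit directly onto the target score carried in the dataset's score column (a distillation-style objective). A minimal sketch of the objective over a batch:

```python
# Mean squared error between predicted logits and target scores.
# With an Identity activation the raw logit is compared to the
# target score as-is, with no squashing.
def mse_loss(predicted, target):
    assert len(predicted) == len(target)
    return sum((p - t) ** 2 for p, t in zip(predicted, target)) / len(predicted)

# Example: logits vs. target scores from the score column.
loss = mse_loss([6.5, 1.2], [6.7204, 1.6529])
```

Because the targets are unbounded real scores, the model's output scores are likewise unbounded, which is why the usage example above prints raw scores rather than probabilities.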
### Evaluation Dataset
#### msmarco
* Dataset: [msmarco](https://huggingface.co/datasets/sentence-transformers/msmarco) at [9e329ed](https://huggingface.co/datasets/sentence-transformers/msmarco/tree/9e329ed2e649c9d37b0d91dd6b764ff6fe671d83)
* Size: 1,000 evaluation samples
* Columns: score, query, and passage
* Approximate statistics based on the first 1000 samples:
  | | score | query | passage |
  |:--------|:------|:-------|:--------|
  | type | float | string | string |
* Samples:
  | score | query | passage |
  |:------|:------|:--------|
  | -11.078993638356527 | what is ivana trump | The need for an independent investigation. As it stands, all three men in charge of the investigations into the Trump campaign are Republicans, and two of the three are vociferous Trump allies. Burr, the third, also tied himself to Trump during his close 2016 reelection campaign. |
  | 8.86651055018107 | hogan's goat meaning | hogan’s goat. The phrase like Hogan’s goat refers to something that is faulty, messed up, or stinks like a goat. The phrase is a reference to R.F. Outcault’s seminal newspaper comic Hogan’s Alley, which debuted in 1895. The title of the strip changed to The Yellow Kid the following year. |
  | 8.381712992986044 | who made tokyo ghoul | Tokyo Ghoul (Japanese: 東京喰種（トーキョーグール）, Hepburn: Tōkyō Gūru) is a Japanese manga series by Sui Ishida. It was serialized in Shueisha's seinen manga magazine Weekly Young Jump between September 2011 and September 2014 and has been collected in fourteen tankōbon volumes as of August 2014. |
* Loss: [MSELoss](https://sbert.net/docs/package_reference/cross_encoder/losses.html#mseloss) with these parameters:
```json
{
"activation_fn": "torch.nn.modules.linear.Identity"
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `learning_rate`: 8e-06
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `seed`: 12
- `dataloader_num_workers`: 4
- `load_best_model_at_end`: True
#### All Hyperparameters