metadata
language:
- en
tags:
- sentence-transformers
- cross-encoder
- reranker
- generated_from_trainer
- dataset_size:10000
- loss:MSELoss
datasets:
- sentence-transformers/msmarco
pipeline_tag: text-ranking
library_name: sentence-transformers
metrics:
- map
- mrr@10
- ndcg@10
model-index:
- name: CrossEncoder
results:
- task:
type: cross-encoder-reranking
name: Cross Encoder Reranking
dataset:
name: NanoMSMARCO R100
type: NanoMSMARCO_R100
metrics:
- type: map
value: 0.0579
name: Map
- type: mrr@10
value: 0.0329
name: Mrr@10
- type: ndcg@10
value: 0.0479
name: Ndcg@10
- task:
type: cross-encoder-reranking
name: Cross Encoder Reranking
dataset:
name: NanoNFCorpus R100
type: NanoNFCorpus_R100
metrics:
- type: map
value: 0.2867
name: Map
- type: mrr@10
value: 0.4222
name: Mrr@10
- type: ndcg@10
value: 0.2546
name: Ndcg@10
- task:
type: cross-encoder-reranking
name: Cross Encoder Reranking
dataset:
name: NanoNQ R100
type: NanoNQ_R100
metrics:
- type: map
value: 0.0326
name: Map
- type: mrr@10
value: 0.01
name: Mrr@10
- type: ndcg@10
value: 0.0229
name: Ndcg@10
- task:
type: cross-encoder-nano-beir
name: Cross Encoder Nano BEIR
dataset:
name: NanoBEIR R100 mean
type: NanoBEIR_R100_mean
metrics:
- type: map
value: 0.1257
name: Map
- type: mrr@10
value: 0.155
name: Mrr@10
- type: ndcg@10
value: 0.1084
name: Ndcg@10
CrossEncoder
This is a Cross Encoder model trained on the msmarco dataset using the sentence-transformers library. It computes scores for pairs of texts, which can be used for text reranking and semantic search.
Model Details
Model Description
- Model Type: Cross Encoder
- Maximum Sequence Length: 512 tokens
- Number of Output Labels: 1 label
- Training Dataset: msmarco
- Language: en
Model Sources
- Documentation: Sentence Transformers Documentation
- Documentation: Cross Encoder Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Cross Encoders on Hugging Face
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import CrossEncoder
# Download from the 🤗 Hub
model = CrossEncoder("kselight/123BERT")
# Get scores for pairs of texts
pairs = [
['what is ivana trump', 'The need for an independent investigation. As it stands, all three men in charge of the investigations into the Trump campaign are Republicans, and two of the three are vociferous Trump allies. Burr, the third, also tied himself to Trump during his close 2016 reelection campaign.'],
["hogan's goat meaning", 'hogan’s goat. The phrase like Hogan’s goat refers to something that is faulty, messed up, or stinks like a goat. The phrase is a reference to R.F. Outcault’s seminal newspaper comic Hogan’s Alley, which debuted in 1895. The title of the strip changed to The Yellow Kid the following year.'],
['who made tokyo ghoul', "Tokyo Ghoul (Japanese: 東京喰種（トーキョーグール）, Hepburn: Tōkyō Gūru) is a Japanese manga series by Sui Ishida. It was serialized in Shueisha's seinen manga magazine Weekly Young Jump between September 2011 and September 2014 and has been collected in fourteen tankōbon volumes as of August 2014."],
['neck of the scottie dog', 'Classical guitars. The classical guitar neck blank is relatively small compared to what is needed for construction. This is because a classical neck is constructed differently than most other neck designs. The heel of the neck is built up by stacking blocks of wood to achieve the necessary height.'],
['what does bicameral mean in government', 'Top 10 amazing movie makeup transformations. In government, bicameralism is the practice of having two legislative or parliamentary chambers. The relationship between the two chambers of a bicameral legislature can vary. In some cases, they have equal power, and in others, one chamber is clearly superior to the other. It is commonplace in most federal systems to have a bicameral legislature.'],
]
scores = model.predict(pairs)
print(scores.shape)
# (5,)
# Or rank different texts based on similarity to a single text
ranks = model.rank(
'what is ivana trump',
[
'The need for an independent investigation. As it stands, all three men in charge of the investigations into the Trump campaign are Republicans, and two of the three are vociferous Trump allies. Burr, the third, also tied himself to Trump during his close 2016 reelection campaign.',
'hogan’s goat. The phrase like Hogan’s goat refers to something that is faulty, messed up, or stinks like a goat. The phrase is a reference to R.F. Outcault’s seminal newspaper comic Hogan’s Alley, which debuted in 1895. The title of the strip changed to The Yellow Kid the following year.',
"Tokyo Ghoul (Japanese: 東京喰種（トーキョーグール）, Hepburn: Tōkyō Gūru) is a Japanese manga series by Sui Ishida. It was serialized in Shueisha's seinen manga magazine Weekly Young Jump between September 2011 and September 2014 and has been collected in fourteen tankōbon volumes as of August 2014.",
'Classical guitars. The classical guitar neck blank is relatively small compared to what is needed for construction. This is because a classical neck is constructed differently than most other neck designs. The heel of the neck is built up by stacking blocks of wood to achieve the necessary height.',
'Top 10 amazing movie makeup transformations. In government, bicameralism is the practice of having two legislative or parliamentary chambers. The relationship between the two chambers of a bicameral legislature can vary. In some cases, they have equal power, and in others, one chamber is clearly superior to the other. It is commonplace in most federal systems to have a bicameral legislature.',
]
)
# [{'corpus_id': ..., 'score': ...}, {'corpus_id': ..., 'score': ...}, ...]
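Conceptually, the dictionaries returned by `model.rank` are just the pairwise scores sorted in descending order. A minimal pure-Python sketch of that reduction, where `toy_scores` stands in for the output of `model.predict` (the numbers are made up for illustration):

```python
# Illustrative only: reranking reduces to sorting candidate passages by their
# cross-encoder score. `toy_scores` stands in for model.predict() output; the
# real model returns one relevance score per (query, passage) pair.
def rerank(scores):
    """Return [{'corpus_id': ..., 'score': ...}] sorted by descending score."""
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return [{"corpus_id": i, "score": scores[i]} for i in order]

toy_scores = [-4.2, 1.3, 7.9, -0.5]  # pretend model.predict() output
ranking = rerank(toy_scores)
print(ranking[0])  # {'corpus_id': 2, 'score': 7.9}
```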
Evaluation
Metrics
Cross Encoder Reranking
- Datasets: NanoMSMARCO_R100, NanoNFCorpus_R100 and NanoNQ_R100
- Evaluated with CrossEncoderRerankingEvaluator with these parameters:
  { "at_k": 10, "always_rerank_positives": true }
| Metric | NanoMSMARCO_R100 | NanoNFCorpus_R100 | NanoNQ_R100 |
|---|---|---|---|
| map | 0.0579 (-0.4317) | 0.2867 (+0.0257) | 0.0326 (-0.3870) |
| mrr@10 | 0.0329 (-0.4446) | 0.4222 (-0.0777) | 0.0100 (-0.4167) |
| ndcg@10 | 0.0479 (-0.4925) | 0.2546 (-0.0705) | 0.0229 (-0.4778) |
Cross Encoder Nano BEIR
- Dataset: NanoBEIR_R100_mean
- Evaluated with CrossEncoderNanoBEIREvaluator with these parameters:
  { "dataset_names": ["msmarco", "nfcorpus", "nq"], "rerank_k": 100, "at_k": 10, "always_rerank_positives": true }
| Metric | Value |
|---|---|
| map | 0.1257 (-0.2643) |
| mrr@10 | 0.1550 (-0.3130) |
| ndcg@10 | 0.1084 (-0.3469) |
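For reference, mrr@10 rewards the rank of the first relevant document in the reranked top 10, while ndcg@10 discounts gains logarithmically by rank. A toy sketch for a single query with one relevant document (function names are illustrative, not part of the library):

```python
import math

# Toy illustration of the reported metrics: given the 1-based rank of the
# single relevant document in the reranked list, compute MRR@10 and NDCG@10.
def mrr_at_10(rank):
    return 1.0 / rank if rank <= 10 else 0.0

def ndcg_at_10(rank):
    # One relevant document: DCG = 1/log2(rank + 1), ideal DCG = 1/log2(2) = 1.
    return 1.0 / math.log2(rank + 1) if rank <= 10 else 0.0

print(mrr_at_10(1))             # 1.0 -> relevant doc ranked first
print(mrr_at_10(4))             # 0.25
print(round(ndcg_at_10(4), 4))  # 1/log2(5) ~ 0.4307
```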
Training Details
Training Dataset
msmarco
- Dataset: msmarco at 9e329ed
- Size: 10,000 training samples
- Columns: score, query, and passage
- Approximate statistics based on the first 1000 samples:
| | score | query | passage |
|---|---|---|---|
| type | float | string | string |
| min | -11.79 | 9 characters | 70 characters |
| mean | 0.58 | 34.21 characters | 342.2 characters |
| max | 11.1 | 140 characters | 894 characters |
- Samples:
| score | query | passage |
|---|---|---|
| 6.720487356185913 | modern definition of democracy | Links. A Short Definition of Democracy U.S. president Abraham Lincoln (1809-1865) defined democracy as: «Government of the people, by the people, for the people» Democracy is by far the most challenging form of government-both for politicians and for the people.The term democracy comes from the Greek language and means rule by the (simple) people. The so-called democracies in classical antiquity (Athens and Rome) represent precursors of modern democracies.Like modern democracy, they were created as a reaction to a concentration and abuse of power by the rulers.he term democracy comes from the Greek language and means rule by the (simple) people. The so-called democracies in classical antiquity (Athens and Rome) represent precursors of modern democracies. |
| 1.6529417037963867 | is celexa and fluoxetine same | Celexa (citalopram hydrobromide) is a type of antidepressant called a selective serotonin reuptake inhibitor (SSRI) indicated for the treatment of depression. Celexa is available in generic form. Common side effects of Celexa include constipation, nausea, diarrhea, upset stomach, decreased sexual desire. |
| -9.121654828389486 | what are 2 examples of nonpoint pollution | Concept of pollution tax. All such measures are compensatory in nature and it is not called pollution tax. The concept of pollution tax is something different. It entails that instead of doing offsetting work by yourself wherever you hurt environment either willfully or without any intention you have to pay for it. |
- Loss: MSELoss with these parameters:
  { "activation_fn": "torch.nn.modules.linear.Identity" }
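The MSELoss objective can be read as plain score regression (as in knowledge distillation): the model's raw output for each pair is regressed onto the dataset's score column. A minimal sketch with hypothetical numbers:

```python
# Sketch of the MSELoss objective (identity activation): the cross-encoder
# emits one raw score per (query, passage) pair, and training minimizes the
# mean squared error against a teacher-provided target score.
def mse_loss(predicted, target):
    assert len(predicted) == len(target)
    return sum((p - t) ** 2 for p, t in zip(predicted, target)) / len(predicted)

# Hypothetical model outputs; `target` plays the role of the `score` column.
predicted = [5.1, -2.0, 0.3]
target = [6.72, -9.12, 1.65]
print(round(mse_loss(predicted, target), 4))  # 18.3804
```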
Evaluation Dataset
msmarco
- Dataset: msmarco at 9e329ed
- Size: 1,000 evaluation samples
- Columns: score, query, and passage
- Approximate statistics based on the first 1000 samples:
| | score | query | passage |
|---|---|---|---|
| type | float | string | string |
| min | -11.85 | 7 characters | 76 characters |
| mean | 1.11 | 34.0 characters | 343.66 characters |
| max | 11.15 | 186 characters | 944 characters |
- Samples:
| score | query | passage |
|---|---|---|
| -11.078993638356527 | what is ivana trump | The need for an independent investigation. As it stands, all three men in charge of the investigations into the Trump campaign are Republicans, and two of the three are vociferous Trump allies. Burr, the third, also tied himself to Trump during his close 2016 reelection campaign. |
| 8.86651055018107 | hogan's goat meaning | hogan’s goat. The phrase like Hogan’s goat refers to something that is faulty, messed up, or stinks like a goat. The phrase is a reference to R.F. Outcault’s seminal newspaper comic Hogan’s Alley, which debuted in 1895. The title of the strip changed to The Yellow Kid the following year. |
| 8.381712992986044 | who made tokyo ghoul | Tokyo Ghoul (Japanese: 東京喰種（トーキョーグール）, Hepburn: Tōkyō Gūru) is a Japanese manga series by Sui Ishida. It was serialized in Shueisha's seinen manga magazine Weekly Young Jump between September 2011 and September 2014 and has been collected in fourteen tankōbon volumes as of August 2014. |
- Loss: MSELoss with these parameters:
  { "activation_fn": "torch.nn.modules.linear.Identity" }
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- learning_rate: 8e-06
- num_train_epochs: 1
- warmup_ratio: 0.1
- seed: 12
- dataloader_num_workers: 4
- load_best_model_at_end: True
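The schedule these settings imply can be sanity-checked by hand. The sketch below assumes a single device, no gradient accumulation, and ceiling rounding for the warmup fraction; the exact step counts depend on the Trainer's internals:

```python
import math

# Derive the linear-warmup schedule implied by the hyperparameters above,
# assuming one GPU, no gradient accumulation, and ceil rounding throughout.
train_samples = 10_000
batch_size = 16
epochs = 1
warmup_ratio = 0.1

steps_per_epoch = math.ceil(train_samples / batch_size)
total_steps = steps_per_epoch * epochs
warmup_steps = math.ceil(total_steps * warmup_ratio)

print(total_steps)   # 625
print(warmup_steps)  # 63
```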
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 8e-06
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 12
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 4
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: True
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
- router_mapping: {}
- learning_rate_mapping: {}
Training Logs
| Epoch | Step | Training Loss | NanoMSMARCO_R100_ndcg@10 | NanoNFCorpus_R100_ndcg@10 | NanoNQ_R100_ndcg@10 | NanoBEIR_R100_mean_ndcg@10 |
|---|---|---|---|---|---|---|
| -1 | -1 | - | 0.0479 (-0.4925) | 0.2546 (-0.0705) | 0.0229 (-0.4778) | 0.1084 (-0.3469) |
| 0.0064 | 1 | 53.6175 | - | - | - | - |
Framework Versions
- Python: 3.11.6
- Sentence Transformers: 5.1.1
- Transformers: 4.47.1
- PyTorch: 2.4.0+cu124
- Accelerate: 1.5.1
- Datasets: 3.3.2
- Tokenizers: 0.21.1
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}