cross-encoder-RoBERTa-Hinge


This model is a cross-encoder based on FacebookAI/roberta-base. It was trained on MS MARCO with a hinge loss as part of a reproducibility paper on training cross-encoders, "Reproducing and Comparing Distillation Techniques for Cross-Encoders"; see the paper for more details.


Model Description

This model is intended for re-ranking the top results returned by a first-stage retrieval system (such as BM25, a bi-encoder, or SPLADE).

  • Training Data: MS MARCO Passage
  • Language: English
  • Loss: Hinge loss

Training can be reproduced using the associated repository. The exact training configuration used for this model is detailed in config.yaml.
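
For intuition, a pairwise hinge objective scores a (query, positive passage, negative passage) triple and penalizes the model when the positive does not outscore the negative by a margin: max(0, margin - (s_pos - s_neg)). Below is a minimal sketch of such an objective using torch.nn.MarginRankingLoss; the example triple, the margin value, and the single-logit head are illustrative assumptions, not the exact setup from config.yaml.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("FacebookAI/roberta-base")
# Fresh classification head with a single relevance logit (assumption).
model = AutoModelForSequenceClassification.from_pretrained("FacebookAI/roberta-base", num_labels=1)

# One illustrative (query, positive, negative) triple.
query = "what is a cross-encoder?"
positive = "A cross-encoder jointly encodes the query and passage to produce a relevance score."
negative = "The Eiffel Tower is located in Paris, France."

pos_inputs = tokenizer(query, positive, truncation=True, return_tensors="pt")
neg_inputs = tokenizer(query, negative, truncation=True, return_tensors="pt")

pos_score = model(**pos_inputs).logits.squeeze(-1)
neg_score = model(**neg_inputs).logits.squeeze(-1)

# Hinge (margin ranking) loss: zero once the positive outscores the
# negative by at least the margin. margin=1.0 is an assumption.
loss_fn = torch.nn.MarginRankingLoss(margin=1.0)
target = torch.ones_like(pos_score)  # the positive should rank higher
loss = loss_fn(pos_score, neg_score, target)
loss.backward()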

Usage

Quick Start:

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# The model uses the tokenizer of its RoBERTa base checkpoint.
tokenizer = AutoTokenizer.from_pretrained("FacebookAI/roberta-base")
model = AutoModelForSequenceClassification.from_pretrained("xpmir/cross-encoder-RoBERTa-Hinge")

# A cross-encoder scores a (query, passage) pair jointly.
features = tokenizer(
    "What is experimaestro?",
    "Experimaestro is a powerful framework for ML experiments management...",
    padding=True,
    truncation=True,
    return_tensors="pt",
)

model.eval()
with torch.no_grad():
    # A higher logit means the passage is judged more relevant to the query.
    scores = model(**features).logits
    print(scores)
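
For re-ranking, the same model can score one query against several candidate passages in a single batch and sort by score. The candidate passages below are placeholder examples, and the code assumes the classification head outputs a single relevance logit per pair.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("FacebookAI/roberta-base")
model = AutoModelForSequenceClassification.from_pretrained("xpmir/cross-encoder-RoBERTa-Hinge")
model.eval()

query = "What is experimaestro?"
passages = [
    "Experimaestro is a powerful framework for ML experiments management...",
    "RoBERTa is a robustly optimized BERT pretraining approach.",
    "MS MARCO is a large-scale passage ranking dataset.",
]

# Tokenize the query paired with every candidate passage at once.
features = tokenizer([query] * len(passages), passages, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    # Assumes a single relevance logit per pair.
    scores = model(**features).logits.squeeze(-1)

# Sort passages from most to least relevant.
ranking = scores.argsort(descending=True)
for rank, idx in enumerate(ranking.tolist(), start=1):
    print(f"{rank}. ({scores[idx].item():.2f}) {passages[idx]}")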

Evaluations

We report evaluations of this cross-encoder re-ranking the top 1000 documents retrieved by naver/splade-v3-distilbert.

| Dataset | RR@10 | nDCG@10 |
|---|---|---|
| msmarco_dev | 37.36 | 43.79 |
| trec2019 | 91.98 | 70.80 |
| trec2020 | 92.96 | 69.29 |
| fever | 79.66 | 79.64 |
| arguana | 20.78 | 30.82 |
| climate_fever | 28.22 | 20.91 |
| dbpedia | 75.65 | 45.01 |
| fiqa | 46.61 | 38.75 |
| hotpotqa | 86.93 | 70.52 |
| nfcorpus | 51.61 | 30.94 |
| nq | 51.91 | 56.84 |
| quora | 75.27 | 77.80 |
| scidocs | 27.30 | 15.25 |
| scifact | 64.78 | 67.82 |
| touche | 60.31 | 32.87 |
| trec_covid | 91.07 | 70.13 |
| robust04 | 66.00 | 44.23 |
| lotte_writing | 67.75 | 57.69 |
| lotte_recreation | 61.13 | 56.06 |
| lotte_science | 46.16 | 38.26 |
| lotte_technology | 53.46 | 44.13 |
| lotte_lifestyle | 72.64 | 63.23 |
| Mean In Domain | 74.10 | 61.29 |
| BEIR 13 | 58.47 | 49.02 |
| LoTTE (OOD) | 61.19 | 50.60 |
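
For reference, RR@10 (reciprocal rank at 10) rewards placing the first relevant document high in the re-ranked top 10, while nDCG@10 discounts relevance gains logarithmically by rank. A minimal sketch of RR@10 for a single query, with hypothetical document IDs:

def rr_at_10(ranked_doc_ids, relevant_doc_ids):
    """Reciprocal rank of the first relevant document in the top 10 (0 if none)."""
    for rank, doc_id in enumerate(ranked_doc_ids[:10], start=1):
        if doc_id in relevant_doc_ids:
            return 1.0 / rank
    return 0.0

# Hypothetical re-ranked list and relevance judgments for one query.
print(rr_at_10(["d7", "d3", "d9"], {"d3"}))  # 0.5: first relevant doc at rank 2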