Dataset: redis/langcache-sentencepairs-v3
How to use redis/langcache-reranker-v2-softmnrl-triplet with sentence-transformers:
from sentence_transformers import CrossEncoder
model = CrossEncoder("redis/langcache-reranker-v2-softmnrl-triplet")
query = "Which planet is known as the Red Planet?"
passages = [
"Venus is often called Earth's twin because of its similar size and proximity.",
"Mars, known for its reddish appearance, is often referred to as the Red Planet.",
"Jupiter, the largest planet in our solar system, has a prominent red spot.",
"Saturn, famous for its rings, is sometimes mistaken for the Red Planet."
]
scores = model.predict([(query, passage) for passage in passages])
print(scores)

This is a Cross Encoder model finetuned from Alibaba-NLP/gte-reranker-modernbert-base on the LangCache Sentence Pairs (subsets=['all'], train+val=True) dataset using the sentence-transformers library. It computes scores for pairs of texts, which can be used for sentence pair classification.
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import CrossEncoder
# Download from the 🤗 Hub
model = CrossEncoder("redis/langcache-reranker-v2-softmnrl-triplet")
# Get scores for pairs of texts
pairs = [
[' What high potential jobs are there other than computer science?', ' What high potential jobs are there other than computer science?'],
[' Would India ever be able to develop a missile system like S300 or S400 missile?', ' Would India ever be able to develop a missile system like S300 or S400 missile?'],
[' water from the faucet is being drunk by a yellow dog', 'A yellow dog is drinking water from the faucet'],
[' water from the faucet is being drunk by a yellow dog', 'The yellow dog is drinking water from a bottle'],
['! colspan = `` 14 `` `` Players who appeared for Colchester who left during the season ``', '! colspan = `` 14 `` `` Players who appeared for Colchester who left during the season ``'],
]
scores = model.predict(pairs)
print(scores.shape)
# (5,)
# Or rank different texts based on similarity to a single text
ranks = model.rank(
' What high potential jobs are there other than computer science?',
[
' What high potential jobs are there other than computer science?',
' Would India ever be able to develop a missile system like S300 or S400 missile?',
'A yellow dog is drinking water from the faucet',
'The yellow dog is drinking water from a bottle',
'! colspan = `` 14 `` `` Players who appeared for Colchester who left during the season ``',
]
)
# [{'corpus_id': ..., 'score': ...}, {'corpus_id': ..., 'score': ...}, ...]
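In a semantic-caching setting like LangCache, these scores are typically compared against a cutoff to decide whether a cached entry is close enough to reuse. A minimal sketch of that decision step (the `best_match` helper and the 0.9 threshold are illustrative assumptions, not part of this model's API):

```python
def best_match(scores, threshold=0.9):
    """Hypothetical helper: return the index of the highest-scoring
    candidate, or None if even the best score falls below `threshold`
    (i.e. treat the query as a cache miss). The 0.9 cutoff is
    illustrative only; tune it on your own data."""
    best_idx = max(range(len(scores)), key=lambda i: scores[i])
    return best_idx if scores[best_idx] >= threshold else None

# With a sigmoid scoring head the scores land in (0, 1), so a fixed
# cutoff is a workable starting point:
print(best_match([0.12, 0.97, 0.41]))  # -> 1 (second candidate accepted)
print(best_match([0.12, 0.35, 0.41]))  # -> None (cache miss)
```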
anchor, positive, and negative_1

| | anchor | positive | negative_1 |
|---|---|---|---|
| type | string | string | string |

| anchor | positive | negative_1 |
|---|---|---|
| | | Where can I get a wide variety of wedding dresses in Gold Coast? |
| | | What's it like having siblings? |
| | | How do you convince the upcoming generation that "Education is The Key of Success" when we are surrounded by poor graduates and rich criminals? |
MultipleNegativesRankingLoss with these parameters:

{
    "scale": 20.0,
    "num_negatives": 1,
    "activation_fn": "torch.nn.modules.activation.Sigmoid"
}
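To see what these parameters do, the loss can be sketched in plain Python: each raw pair logit is passed through the Sigmoid activation and multiplied by `scale`, and the results feed a softmax cross-entropy that pushes each anchor's positive above its negatives. This is an illustrative re-derivation under those assumptions, not the sentence-transformers implementation:

```python
import math

def mnrl_loss(batch_logits, scale=20.0):
    """Sketch of MultipleNegativesRankingLoss with a Sigmoid activation.
    batch_logits[i] holds one anchor's raw scores: index 0 is its
    positive, the remaining entries are negatives. Illustrative only."""
    total = 0.0
    for logits in batch_logits:
        # activation_fn = Sigmoid, then multiply by scale
        z = [scale / (1.0 + math.exp(-s)) for s in logits]
        # softmax cross-entropy with the positive (index 0) as the target
        m = max(z)
        log_prob_pos = (z[0] - m) - math.log(sum(math.exp(v - m) for v in z))
        total -= log_prob_pos
    return total / len(batch_logits)

# Confident batch (positive logit well above the negatives): near-zero loss
print(mnrl_loss([[4.0, -3.0, -2.5]]))
# Uninformative batch (all logits equal): log(3) ~= 1.0986
print(mnrl_loss([[0.0, 0.0, 0.0]]))
```

The large `scale` sharpens the softmax, so even modest gaps between the positive and the negatives translate into a strong training signal.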
anchor, positive, and negative_1

| | anchor | positive | negative_1 |
|---|---|---|---|
| type | string | string | string |

| anchor | positive | negative_1 |
|---|---|---|
| What high potential jobs are there other than computer science? | What high potential jobs are there other than computer science? | Why IT or Computer Science jobs are being over rated than other Engineering jobs? |
| Would India ever be able to develop a missile system like S300 or S400 missile? | Would India ever be able to develop a missile system like S300 or S400 missile? | Should India buy the Russian S400 air defence missile system? |
| water from the faucet is being drunk by a yellow dog | A yellow dog is drinking water from the faucet | Childlessness is low in Eastern European countries. |
MultipleNegativesRankingLoss with these parameters:

{
    "scale": 20.0,
    "num_negatives": 1,
    "activation_fn": "torch.nn.modules.activation.Sigmoid"
}
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
Base model: answerdotai/ModernBERT-base