---
tags:
- sentence-transformers
- cross-encoder
- reranker
- generated_from_trainer
- dataset_size:87398
- loss:CrossEntropyLoss
base_model: deepvk/USER-bge-m3
pipeline_tag: text-classification
library_name: sentence-transformers
metrics:
- f1_macro
- f1_micro
- f1_weighted
model-index:
- name: CrossEncoder based on deepvk/USER-bge-m3
results:
- task:
type: cross-encoder-softmax-accuracy
name: Cross Encoder Softmax Accuracy
dataset:
name: softmax accuracy eval
type: softmax_accuracy_eval
metrics:
- type: f1_macro
value: 0.9715485242270209
name: F1 Macro
- type: f1_micro
value: 0.9743012183884509
name: F1 Micro
- type: f1_weighted
value: 0.974262256621189
name: F1 Weighted
---
# CrossEncoder based on deepvk/USER-bge-m3
This is a [Cross Encoder](https://www.sbert.net/docs/cross_encoder/usage/usage.html) model finetuned from [deepvk/USER-bge-m3](https://huggingface.co/deepvk/USER-bge-m3) using the [sentence-transformers](https://www.SBERT.net) library. It computes scores for pairs of texts, which can be used for text pair classification.
## Model Details
### Model Description
- **Model Type:** Cross Encoder
- **Base model:** [deepvk/USER-bge-m3](https://huggingface.co/deepvk/USER-bge-m3)
- **Maximum Sequence Length:** 8192 tokens
- **Number of Output Labels:** 2 labels
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Documentation:** [Cross Encoder Documentation](https://www.sbert.net/docs/cross_encoder/usage/usage.html)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Cross Encoders on Hugging Face](https://huggingface.co/models?library=sentence-transformers&other=cross-encoder)
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import CrossEncoder
# Download from the 🤗 Hub
model = CrossEncoder("Chimalpopoka/CrossEncoderRanker")
# Get scores for pairs of texts
pairs = [
['Панель аллергенов пыли № 1 IgE (домашняя пыль (Greer), клещ-дерматофаг перинный, клещ-дерматофаг мучной, таракан)', 'Смесь аллергенов пыли - hm1, Состав: домашняя пыль, Dermatophagoides pteronyssinus, Dermatophagoides farinae, таракан-прусак, IgE. Метод: ИФА'],
['Жидкостная цитология РШМ', 'Жидкостная цитология. Исследование соскоба шейки матки и цервикального канала (окрашивание по Папаниколау)'],
['Посев на возбудителей кишечной инфекции (сальмонеллы, шигеллы) с определением чувствительности к основному спектру антибиотиков', 'Посев кала на патогенную флору (дизентерийная и тифопаратифозная группы): С определением чувствительности к антибиотикам. Метод: культуральный'],
['Молекулярно-генетическое исследование мутации в гене V617F (замена 617-ой аминокислоты с валина на фенилаланин) JAK2 (янус тирозин-киназа второго типа / Качественная оценка наличия соматической мутации V617F в 14 экзоне гена JAK2 (Qualitative assessment of presence of gene JAK2 617F somatic mutation)', 'Анализ мутации V617F гена JAK2 (замена валин на фенилаланин). Метод: ПЦР'],
['Водородно-метановый дыхательный тест с лактулозой (СИБР ТЕСТ, синдром избыточного бактериального роста в тонкой кишке, СИБР) (самостоятельное взятие проб)', 'Дыхательный водородный тест на СИБР'],
]
scores = model.predict(pairs)
print(scores.shape)
# (5, 2)
```
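With two output labels, `model.predict` returns one value per label for each pair; these are raw logits (assuming no activation function has been configured on the model). To turn one row into class probabilities and a predicted label, apply a softmax. A minimal stdlib sketch with placeholder logits; the index-to-meaning mapping (index 1 = "match") is an assumption, not something this card specifies:

```python
import math

def softmax(logits):
    # Numerically stable softmax over one pair's label logits.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Placeholder values standing in for one row of model.predict(pairs).
row = [-1.2, 3.4]
probs = softmax(row)                                   # probabilities summing to 1
pred_label = max(range(len(probs)), key=probs.__getitem__)  # argmax
```

In practice you would apply this row by row to the `(5, 2)` array returned above.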
## Evaluation
### Metrics
#### Cross Encoder Softmax Accuracy
* Dataset: `softmax_accuracy_eval`
* Evaluated with [CESoftmaxAccuracyEvaluator](https://sbert.net/docs/package_reference/cross_encoder/evaluation.html#sentence_transformers.cross_encoder.evaluation.CESoftmaxAccuracyEvaluator)
| Metric | Value |
|:-------------|:-----------|
| **f1_macro** | **0.9715** |
| f1_micro | 0.9743 |
| f1_weighted | 0.9743 |
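The three F1 variants above differ only in how per-class scores are combined: macro averages them equally, weighted averages them by class support, and micro pools all decisions globally (for single-label multiclass data, micro-F1 equals accuracy). An illustrative pure-Python sketch of the averaging, not the evaluator's actual implementation:

```python
def f1_scores(y_true, y_pred):
    """Return (macro, micro, weighted) F1 for integer class labels."""
    classes = sorted(set(y_true) | set(y_pred))
    per_class, support, tp_all = [], [], 0
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        denom = 2 * tp + fp + fn
        per_class.append(2 * tp / denom if denom else 0.0)
        support.append(sum(t == c for t in y_true))
        tp_all += tp
    macro = sum(per_class) / len(classes)
    micro = tp_all / len(y_true)  # single-label micro-F1 == accuracy
    weighted = sum(f * s for f, s in zip(per_class, support)) / len(y_true)
    return macro, micro, weighted

# Toy labels, unrelated to this model's evaluation set:
macro, micro, weighted = f1_scores([0, 1, 1, 0], [0, 1, 0, 0])
```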
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 87,398 training samples
* Columns: sentence_0, sentence_1, and label
* Approximate statistics based on the first 1000 samples:
| | sentence_0 | sentence_1 | label |
|:--------|:-----------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------|:------------------------------------------------|
| type | string | string | int |
* Samples:
  | sentence_0 | sentence_1 | label |
  |:-----------|:-----------|:------|
  | Панель аллергенов пыли № 1 IgE (домашняя пыль (Greer), клещ-дерматофаг перинный, клещ-дерматофаг мучной, таракан) | Смесь аллергенов пыли - hm1, Состав: домашняя пыль, Dermatophagoides pteronyssinus, Dermatophagoides farinae, таракан-прусак, IgE. Метод: ИФА | 1 |
  | Жидкостная цитология РШМ | Жидкостная цитология. Исследование соскоба шейки матки и цервикального канала (окрашивание по Папаниколау) | 1 |
  | Посев на возбудителей кишечной инфекции (сальмонеллы, шигеллы) с определением чувствительности к основному спектру антибиотиков | Посев кала на патогенную флору (дизентерийная и тифопаратифозная группы): С определением чувствительности к антибиотикам. Метод: культуральный | 1 |
* Loss: [CrossEntropyLoss](https://sbert.net/docs/package_reference/cross_encoder/losses.html#crossentropyloss)
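CrossEntropyLoss compares the softmax over the two per-pair logits against the integer label. The per-pair quantity being minimized is standard softmax cross-entropy; a stdlib sketch of that formula (the library itself operates on PyTorch tensors):

```python
import math

def cross_entropy(logits, label):
    # -log(softmax(logits)[label]), computed stably via log-sum-exp.
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(z - m) for z in logits))
    return log_sum - logits[label]

# A confident correct prediction is penalized far less than a wrong one.
loss_right = cross_entropy([-1.2, 3.4], 1)
loss_wrong = cross_entropy([-1.2, 3.4], 0)
```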
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `num_train_epochs`: 1
#### All Hyperparameters