metadata
language:
  - en
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - dense
  - generated_from_trainer
  - dataset_size:6300
  - loss:MatryoshkaLoss
  - loss:MultipleNegativesRankingLoss
base_model: nomic-ai/modernbert-embed-base
widget:
  - source_sentence: >-
      How much is the company's obligations for non-cancellable operating leases
      for minimum rent payments throughout the future fiscal years as of January
      28, 2024?
    sentences:
      - >-
        While we do not expect to repatriate cash to the U.S. to satisfy
        domestic liquidity needs, if these amounts were distributed to the U.S.,
        in the form of dividends or otherwise, we may be subject to additional
        foreign withholding taxes and U.S. state income taxes, which could be
        material.
      - Operating leases (minimum rent) | $ | 1,645,318 |
      - >-
        the top three exposures being to issuers and counterparties domiciled in
        France at $5.1 billion, the United Kingdom at $4.8 billion, and Canada
        at $1.7 billion.
  - source_sentence: What does the Selected Drug list published by CMS include?
    sentences:
      - >-
        Trodelvy product sales were $680 million in 2022 and increased by 56% to
        $1.1 billion in 2023.
      - >-
        Medicaid Services (“CMS”) published the first “Selected Drug” list,
        which includes XARELTO and STELARA as well as IMBRUVICA.
      - >-
        Cost of revenues decreased by 9%, or $123.9 million, in the year ended
        December 31, 2023, as compared to the same period in 2022.
  - source_sentence: >-
      What was the dividend per share paid in 2023 and how did it change from
      the previous year?
    sentences:
      - >-
        Dividends of $4.52 per share and $3.92 per share were paid in 2023 and
        2022, respectively.
      - >-
        We design, develop, manufacture, sell and lease high-performance fully
        electric vehicles and energy generation and storage systems, and offer
        services related to our products.
      - >-
        As of December 31, 2023, AMC operated 217 IMAX screens and 169 Dolby
        Cinema screens, according to the large screen format detailing from the
        data provided.
  - source_sentence: What generation technology does the 40 Series graphics cards feature?
    sentences:
      - >-
        We believe that a diverse, equitable, and inclusive workplace is a
        strategic business imperative and we take a comprehensive view of
        diversity, equity, and inclusion. We conduct annual pay equity analyses
        and support many employee-led inclusion networks.
      - >-
        HP records revenue from the sale of equipment under sales-type leases as
        revenue at the commencement of the lease. This method is applied unless
        certain conditions such as customer acceptance remain uncertain or
        significant obligations to the customer remain unfulfilled.
      - >-
        The 40 Series features our third generation RTX technology, third
        generation NVIDIA DLSS, and fourth generation Tensor Cores to deliver up
        to 4X the performance of the previous generation.
  - source_sentence: >-
      What typical reimbursement methods are used in the company's contracts
      with hospitals for inpatient and outpatient services?
    sentences:
      - >-
        We typically contract with hospitals on either (1) a per diem rate,
        which is an all-inclusive rate per day, (2) a case rate for
        diagnosis-related groups (DRG), which is an all-inclusive rate per
        admission, or (3) a discounted charge for inpatient hospital services.
        Outpatient hospital services generally are contracted at a flat rate by
        type of service, ambulatory payment classifications, or APCs, or at a
        discounted charge.
      - >-
        In IBM’s 2023 Annual Report to Stockholders, the Financial Statements
        and Supplementary Data are detailed on pages 44 through 121.
      - >-
        Certain joint venture agreements in China allow for the contractual
        right to report vehicle sales of non-GM trademarked vehicles by those
        joint ventures, which are included in the total vehicle sales General
        Motors reports for China.
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
  - cosine_accuracy@1
  - cosine_accuracy@3
  - cosine_accuracy@5
  - cosine_accuracy@10
  - cosine_precision@1
  - cosine_precision@3
  - cosine_precision@5
  - cosine_precision@10
  - cosine_recall@1
  - cosine_recall@3
  - cosine_recall@5
  - cosine_recall@10
  - cosine_ndcg@10
  - cosine_mrr@10
  - cosine_map@100
model-index:
  - name: ModernBERT Embed base Finance 10k Matryoshka
    results:
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 768
          type: dim_768
        metrics:
          - type: cosine_accuracy@1
            value: 0.7244285714285714
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8554285714285714
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.8902857142857142
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9271428571428572
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7244285714285714
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.28514285714285714
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.17805714285714286
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09271428571428571
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7244285714285714
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8554285714285714
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.8902857142857142
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9271428571428572
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8286211304635027
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.7967674603174593
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.7999047933786212
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 512
          type: dim_512
        metrics:
          - type: cosine_accuracy@1
            value: 0.7238571428571429
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8532857142857143
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.8874285714285715
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.927
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7238571428571429
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.2844285714285714
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1774857142857143
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09269999999999999
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7238571428571429
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8532857142857143
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.8874285714285715
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.927
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8272739780211489
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.7951572562358264
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.7982776212491204
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 256
          type: dim_256
        metrics:
          - type: cosine_accuracy@1
            value: 0.7231428571428572
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8512857142857143
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.886
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.924
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7231428571428572
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.28376190476190477
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.17720000000000002
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.0924
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7231428571428572
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8512857142857143
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.886
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.924
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.825556207455572
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.7937802721088419
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.797126360319449
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 128
          type: dim_128
        metrics:
          - type: cosine_accuracy@1
            value: 0.7017142857142857
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8362857142857143
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.8725714285714286
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9165714285714286
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7017142857142857
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.27876190476190477
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1745142857142857
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09165714285714284
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7017142857142857
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8362857142857143
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.8725714285714286
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9165714285714286
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8107684115571783
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.7767397392290242
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.7803112683678572
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 64
          type: dim_64
        metrics:
          - type: cosine_accuracy@1
            value: 0.6702857142857143
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.8052857142857143
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.8491428571428571
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.8958571428571429
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.6702857142857143
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.2684285714285714
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1698285714285714
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.08958571428571427
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.6702857142857143
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.8052857142857143
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.8491428571428571
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.8958571428571429
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.7834046997868768
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.7473670068027197
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.7515757256525822
            name: Cosine Map@100

ModernBERT Embed base Finance 10k Matryoshka

This is a sentence-transformers model finetuned from nomic-ai/modernbert-embed-base on the json dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: nomic-ai/modernbert-embed-base
  • Maximum Sequence Length: 8192 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset:
    • json
  • Language: en

Model Sources

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False, 'architecture': 'ModernBertModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
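The pipeline above amounts to mean-pooling the transformer's token embeddings over the attention mask and then L2-normalizing the result. A minimal NumPy sketch of those last two stages, using random arrays as stand-ins for real transformer output:

```python
import numpy as np

def mean_pool_and_normalize(token_embeddings, attention_mask):
    """token_embeddings: (batch, seq, dim); attention_mask: (batch, seq) of 0/1."""
    mask = attention_mask[:, :, None].astype(token_embeddings.dtype)
    summed = (token_embeddings * mask).sum(axis=1)   # sum over real tokens only
    counts = np.clip(mask.sum(axis=1), 1e-9, None)   # avoid divide-by-zero on empty rows
    pooled = summed / counts                         # the Pooling (mean) module
    norms = np.linalg.norm(pooled, axis=1, keepdims=True)
    return pooled / norms                            # the Normalize() module

# Toy example: batch of 2 sequences, length 4, hidden dim 8
rng = np.random.default_rng(1)
tok = rng.normal(size=(2, 4, 8))
mask = np.array([[1, 1, 1, 0], [1, 1, 0, 0]])
out = mean_pool_and_normalize(tok, mask)
print(out.shape)  # (2, 8)
```

Padding positions are excluded from the mean, matching `pooling_mode_mean_tokens: True` above.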

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("rya23/modernbert-embed-finance-matryoshka")
# Run inference
sentences = [
    "What typical reimbursement methods are used in the company's contracts with hospitals for inpatient and outpatient services?",
    'We typically contract with hospitals on either (1) a per diem rate, which is an all-inclusive rate per day, (2) a case rate for diagnosis-related groups (DRG), which is an all-inclusive rate per admission, or (3) a discounted charge for inpatient hospital services. Outpatient hospital services generally are contracted at a flat rate by type of service, ambulatory payment classifications, or APCs, or at a discounted charge.',
    'In IBM’s 2023 Annual Report to Stockholders, the Financial Statements and Supplementary Data are detailed on pages 44 through 121.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.6756, 0.0659],
#         [0.6756, 1.0000, 0.0087],
#         [0.0659, 0.0087, 1.0000]])
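Because the model was trained with MatryoshkaLoss, embeddings can also be truncated to the leading 512, 256, 128, or 64 dimensions and re-normalized, trading a modest quality drop (see the Evaluation tables) for smaller vectors. A minimal sketch of manual truncation, using random unit vectors as a stand-in for real `model.encode` output:

```python
import numpy as np

def truncate_and_renormalize(embeddings, dim):
    """Keep the leading `dim` Matryoshka dimensions and rescale to unit length."""
    truncated = embeddings[:, :dim]
    norms = np.linalg.norm(truncated, axis=1, keepdims=True)
    return truncated / norms

# Stand-in for model.encode(...) output: 3 unit-norm 768-dim embeddings
rng = np.random.default_rng(0)
full = rng.normal(size=(3, 768))
full /= np.linalg.norm(full, axis=1, keepdims=True)

small = truncate_and_renormalize(full, 256)
print(small.shape)  # (3, 256)
```

Recent Sentence Transformers releases also accept a `truncate_dim` argument in the `SentenceTransformer` constructor, which applies the same truncation automatically at encode time.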

Evaluation

Metrics

Information Retrieval (dim_768)

Metric Value
cosine_accuracy@1 0.7244
cosine_accuracy@3 0.8554
cosine_accuracy@5 0.8903
cosine_accuracy@10 0.9271
cosine_precision@1 0.7244
cosine_precision@3 0.2851
cosine_precision@5 0.1781
cosine_precision@10 0.0927
cosine_recall@1 0.7244
cosine_recall@3 0.8554
cosine_recall@5 0.8903
cosine_recall@10 0.9271
cosine_ndcg@10 0.8286
cosine_mrr@10 0.7968
cosine_map@100 0.7999

Information Retrieval (dim_512)

Metric Value
cosine_accuracy@1 0.7239
cosine_accuracy@3 0.8533
cosine_accuracy@5 0.8874
cosine_accuracy@10 0.927
cosine_precision@1 0.7239
cosine_precision@3 0.2844
cosine_precision@5 0.1775
cosine_precision@10 0.0927
cosine_recall@1 0.7239
cosine_recall@3 0.8533
cosine_recall@5 0.8874
cosine_recall@10 0.927
cosine_ndcg@10 0.8273
cosine_mrr@10 0.7952
cosine_map@100 0.7983

Information Retrieval (dim_256)

Metric Value
cosine_accuracy@1 0.7231
cosine_accuracy@3 0.8513
cosine_accuracy@5 0.886
cosine_accuracy@10 0.924
cosine_precision@1 0.7231
cosine_precision@3 0.2838
cosine_precision@5 0.1772
cosine_precision@10 0.0924
cosine_recall@1 0.7231
cosine_recall@3 0.8513
cosine_recall@5 0.886
cosine_recall@10 0.924
cosine_ndcg@10 0.8256
cosine_mrr@10 0.7938
cosine_map@100 0.7971

Information Retrieval (dim_128)

Metric Value
cosine_accuracy@1 0.7017
cosine_accuracy@3 0.8363
cosine_accuracy@5 0.8726
cosine_accuracy@10 0.9166
cosine_precision@1 0.7017
cosine_precision@3 0.2788
cosine_precision@5 0.1745
cosine_precision@10 0.0917
cosine_recall@1 0.7017
cosine_recall@3 0.8363
cosine_recall@5 0.8726
cosine_recall@10 0.9166
cosine_ndcg@10 0.8108
cosine_mrr@10 0.7767
cosine_map@100 0.7803

Information Retrieval (dim_64)

Metric Value
cosine_accuracy@1 0.6703
cosine_accuracy@3 0.8053
cosine_accuracy@5 0.8491
cosine_accuracy@10 0.8959
cosine_precision@1 0.6703
cosine_precision@3 0.2684
cosine_precision@5 0.1698
cosine_precision@10 0.0896
cosine_recall@1 0.6703
cosine_recall@3 0.8053
cosine_recall@5 0.8491
cosine_recall@10 0.8959
cosine_ndcg@10 0.7834
cosine_mrr@10 0.7474
cosine_map@100 0.7516

Training Details

Training Dataset

json

  • Dataset: json
  • Size: 6,300 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor: string; min: 9 tokens, mean: 20.55 tokens, max: 43 tokens
    • positive: string; min: 5 tokens, mean: 46.3 tokens, max: 243 tokens
  • Samples:
    • anchor: How many shares of class A common stock were authorized for grant under Visa's Equity Incentive Compensation Plan?
      positive: Under the Company’s 2007 Amended and Restated Equity Incentive Compensation Plan (EIP), the compensation committee of the board of directors was authorized to grant up to 198 million shares of class A common stock to its employees and non-employee directors.
    • anchor: What was Garmin Ltd.'s net income for the fiscal year ended December 30, 2023?
      positive: Garmin Ltd. reported a net income of $1,289,636 for the fiscal year ended December 30, 2023.
    • anchor: Why are some device sales revenue at AT&T not immediately recognized upon the device sale?
      positive: AT&T recognizes revenue from device sales with promotions or installment payments differently. For promotional discounts, revenue is deferred and amortized over the contract term. Meanwhile, installment sales involve recognizing revenue upfront but deferring the cash receipt until payments are made, resulting in a recorded contract asset to be amortized over time.
  • Loss: MatryoshkaLoss with these parameters:
    {
        "loss": "MultipleNegativesRankingLoss",
        "matryoshka_dims": [
            768,
            512,
            256,
            128,
            64
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": -1
    }
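Restated as code, a hedged sketch of constructing this loss with the Sentence Transformers API (here `model` is assumed to be the base model being fine-tuned; `matryoshka_weights` defaults to 1 per dimension, matching the parameters above):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("nomic-ai/modernbert-embed-base")
inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],  # each prefix is trained jointly
)
```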
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_eval_batch_size: 4
  • gradient_accumulation_steps: 48
  • learning_rate: 2e-05
  • num_train_epochs: 4
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • warmup_steps: 0.1
  • fp16: True
  • load_best_model_at_end: True
  • batch_sampler: no_duplicates
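For reference, these non-default settings imply an effective train batch size of 8 × 48 = 384 (per_device_train_batch_size × gradient_accumulation_steps). A hedged sketch of how they might be passed via SentenceTransformerTrainingArguments (the output directory is hypothetical):

```python
from sentence_transformers.training_args import (
    BatchSamplers,
    SentenceTransformerTrainingArguments,
)

args = SentenceTransformerTrainingArguments(
    output_dir="models/modernbert-embed-finance-matryoshka",  # hypothetical path
    num_train_epochs=4,
    per_device_train_batch_size=8,   # 8 * 48 accumulation = 384 effective batch size
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=48,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    fp16=True,
    eval_strategy="epoch",
    load_best_model_at_end=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoids duplicate texts per batch
)
```

The no_duplicates batch sampler matters for MultipleNegativesRankingLoss, which treats all other in-batch texts as negatives.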

All Hyperparameters

Click to expand
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 8
  • per_device_eval_batch_size: 4
  • gradient_accumulation_steps: 48
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 4
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: None
  • warmup_ratio: 0.1
  • warmup_steps: 0.1
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • enable_jit_checkpoint: False
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • use_cpu: False
  • seed: 42
  • data_seed: None
  • bf16: False
  • fp16: True
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: -1
  • ddp_backend: None
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • group_by_length: False
  • length_column_name: length
  • project: huggingface
  • trackio_space_id: trackio
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • auto_find_batch_size: False
  • full_determinism: False
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_num_input_tokens_seen: no
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: True
  • use_cache: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss dim_768_cosine_ndcg@10 dim_512_cosine_ndcg@10 dim_256_cosine_ndcg@10 dim_128_cosine_ndcg@10 dim_64_cosine_ndcg@10
0.6091 10 0.3092 - - - - -
1.0 17 - 0.8155 0.8138 0.8104 0.7948 0.7647
1.1827 20 0.0958 - - - - -
1.7919 30 0.0675 - - - - -
2.0 34 - 0.8257 0.8245 0.8219 0.8045 0.7757
2.3655 40 0.0458 - - - - -
2.9746 50 0.0505 - - - - -
3.0 51 - 0.8277 0.8259 0.8243 0.8087 0.7819
3.5482 60 0.0593 - - - - -
4.0 68 - 0.8286 0.8273 0.8256 0.8108 0.7834
  • The bold row (epoch 4.0, step 68, with the best evaluation scores) denotes the saved checkpoint.

Framework Versions

  • Python: 3.12.12
  • Sentence Transformers: 5.2.2
  • Transformers: 5.0.0
  • PyTorch: 2.9.0+cu128
  • Accelerate: 1.12.0
  • Datasets: 4.0.0
  • Tokenizers: 0.22.2

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}