metadata
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - dense
  - generated_from_trainer
  - dataset_size:111470
  - loss:MultipleNegativesRankingLoss
base_model: sentence-transformers/all-MiniLM-L6-v2
widget:
  - source_sentence: why are some rocks radioactive
    sentences:
      - >-
        Radioactive accessory minerals such as zircon may contribute to the
        radioactivity of a mineral which is otherwise non-radioactive by
        calculation. Many granites or other igneous rocks contain some
        radioactivity because of minor, but highly radioactive, accessory
        minerals.re = mineral density (S Atomic number / Molecular Weight) where
        re is the electron density in grams/cc.efinition. Radioactivity in
        minerals are caused by the inclusion of naturally-occurring radioactive
        elements in the mineral's composition. The degree of radioactivity is
        dependent on the concentration and isotope present in the mineral.
      - >-
        Taking B-complex vitamins, which include vitamin B12, can cause urine to
        have a bright yellow or even orange color, but check with your doctor to
        be sure that's what is going on in your case. B vitamins are
        water-soluble vitamins, which means that what your body doesn't use is
        excreted in your urine. Riboflavin (vitamin B2) is especially likely to
        cause this color change in urine. Several medications can also turn
        urine a bright yellow or orange color. Changes in urine color may also
        signal certain health problems.
      - >-
        Radioactive material is just another name for a group of unstable atoms
        that emit ionizing radiation. These groups of unstable atoms emit
        radiation because they try to become stable. Radioactive materials emit
        radiation in a process called radioactive decay.
  - source_sentence: How was your experience of Lucid dreaming at home?
    sentences:
      - How was your experience of Lucid dreaming at home?
      - How was your experience of Lucid dreaming outside the home?
      - "Bournemouth /ˈbɔərnməθ/ is a large coastal resort town on the south coast of England directly to the east of the Jurassic Coast, a 96-mile (155 km) World Heritage Site. According to the 2011 census, the town has a population of 183,491 making it the largest settlement in Dorset.he Bournemouth Eye is a helium-filled balloon attached to a steel cable in the town's lower gardens. The spherical balloon is 69 m (226 ft) in circumference and carries an enclosed, steel gondola. Rising to a height of 150 m (492 ft), it provides a panoramic view of the surrounding area for up to 28 passengers."
  - source_sentence: what is iraq's dominant religion
    sentences:
      - >-
        If you are working, consider taking maternity leave as early as you can.
        This makes sense anyway because carrying twins is hard work, and most
        twins arrive earlier than single babies (NCCWCH 2011: 128) . More than
        half of twins arrive early, before 37 weeks (NCCWCH 2011: 120, Tamba
        2012) .Talk to your midwife or doctor if you are feeling down about your
        pregnancy (NICE 2011) .f you are working, consider taking maternity
        leave as early as you can. This makes sense anyway because carrying
        twins is hard work, and most twins arrive earlier than single babies
        (NCCWCH 2011: 128) . More than half of twins arrive early, before 37
        weeks (NCCWCH 2011: 120, Tamba 2012) .
      - "Introduction. Although Iran’s state religion is Shiite Islam and the majority of its population is ethnically Persian, millions of minorities from various ethnic, religious, and linguistic backgrounds also reside in Iran. Among these groups are ethnic Kurds, Baluchis, and Azeris.lthough Iran’s state religion is Shiite Islam and the majority of its population is ethnically Persian, millions of minorities from various ethnic, religious, and linguistic backgrounds also reside in Iran."
      - >-
        In today's Republic of Iraq, where Islam is the state religion and
        claims the beliefs of 95 percent of the population, the majority of
        Iraqis identify with Arab culture. The second-largest cultural group is
        the Kurds, who are in the highlands and mountain valleys of the north in
        a politically autonomous settlement.
  - source_sentence: how many years of education are needed to become a pediatric nurse
    sentences:
      - >-
        In terms of educational background, pediatric nurse requirements include
        either an Associate's or a Bachelor's degree in Nursing. An Associate's
        degree (ADN) typically takes two years to complete, while a Bachelor's
        degree (BSN) typically takes four years. ADN programs are usually
        offered by community colleges.
      - "Photo of Oxford Suites Sonoma County - Rohnert Park - Rohnert Park, CA, United States Photo of Oxford Suites Sonoma County - Rohnert Park - Rohnert Park, CA, United States Living area with king bed by Monique' M. “And there's a complimentary reception with 2 drinks, soup and salad bar nightly.” in 2 reviews"
      - >-
        From there, additional training specific to the care of children is
        required. Pediatric nurses can become certified in the field and may
        choose to further specialize in a particular area. Program Levels:
        Associate's degree, bachelor's degree.
  - source_sentence: >-
      Schliemann recognized five shafts and cleared them like the graves
      mentioned by Pausanias .
    sentences:
      - >-
        IBM banned the usage of the POWER5+ in its System p5 510Q, 520Q, 550Q
        and 560Q servers.
      - >-
        Schliemann cleared five shafts and recognized them as the graves
        mentioned by Pausania .
      - >-
        Schliemann recognized five shafts and cleared them like the graves
        mentioned by Pausanias .
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
  - cosine_accuracy@1
  - cosine_accuracy@3
  - cosine_accuracy@5
  - cosine_accuracy@10
  - cosine_precision@1
  - cosine_precision@3
  - cosine_precision@5
  - cosine_precision@10
  - cosine_recall@1
  - cosine_recall@3
  - cosine_recall@5
  - cosine_recall@10
  - cosine_ndcg@10
  - cosine_mrr@10
  - cosine_map@100
model-index:
  - name: SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2
    results:
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: NanoMSMARCO
          type: NanoMSMARCO
        metrics:
          - type: cosine_accuracy@1
            value: 0.32
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.5
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.56
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.7
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.32
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.16666666666666669
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.11200000000000002
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.07
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.32
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.5
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.56
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.7
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.4962486706422321
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.43346031746031743
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.44415856354878636
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: NanoNQ
          type: NanoNQ
        metrics:
          - type: cosine_accuracy@1
            value: 0.16
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.26
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.32
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.46
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.16
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.08666666666666666
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.068
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.04800000000000001
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.15
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.23
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.3
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.43
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.27247558178705156
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.23207936507936502
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.234397839045408
            name: Cosine Map@100
      - task:
          type: nano-beir
          name: Nano BEIR
        dataset:
          name: NanoBEIR mean
          type: NanoBEIR_mean
        metrics:
          - type: cosine_accuracy@1
            value: 0.24
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.38
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.44000000000000006
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.58
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.24
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.12666666666666668
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.09000000000000001
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.05900000000000001
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.235
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.365
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.43000000000000005
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.565
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.38436212621464183
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.3327698412698412
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.3392782012970972
            name: Cosine Map@100

SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L6-v2. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 128 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
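The Pooling and Normalize modules above are simple tensor operations. A minimal sketch in plain PyTorch, using toy tensors in place of real BERT hidden states (shapes match the architecture; the equivalence to the library's internals is an assumption for illustration):

```python
import torch
import torch.nn.functional as F

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # Pooling with pooling_mode_mean_tokens: average the token embeddings,
    # ignoring padding positions via the attention mask.
    mask = attention_mask.unsqueeze(-1).to(token_embeddings.dtype)
    return (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

# Toy batch: 2 sequences, 4 token positions, 384-dim hidden states
hidden = torch.randn(2, 4, 384)
mask = torch.tensor([[1, 1, 1, 0], [1, 1, 0, 0]])

# Normalize(): unit-length vectors, so a dot product equals cosine similarity
emb = F.normalize(mean_pool(hidden, mask), p=2, dim=1)
print(emb.shape)        # torch.Size([2, 384])
print(emb.norm(dim=1))  # both norms ~1.0
```

Because the final module normalizes every embedding to unit length, `model.similarity` with cosine similarity reduces to a plain matrix product of the embeddings.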

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("redis/model-b-structured")
# Run inference
sentences = [
    'Schliemann recognized five shafts and cleared them like the graves mentioned by Pausanias .',
    'Schliemann recognized five shafts and cleared them like the graves mentioned by Pausanias .',
    'Schliemann cleared five shafts and recognized them as the graves mentioned by Pausania .',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 1.0000, 0.9955],
#         [1.0000, 1.0000, 0.9955],
#         [0.9955, 0.9955, 1.0000]])

Evaluation

Metrics

Information Retrieval

Metric NanoMSMARCO NanoNQ
cosine_accuracy@1 0.32 0.16
cosine_accuracy@3 0.5 0.26
cosine_accuracy@5 0.56 0.32
cosine_accuracy@10 0.7 0.46
cosine_precision@1 0.32 0.16
cosine_precision@3 0.1667 0.0867
cosine_precision@5 0.112 0.068
cosine_precision@10 0.07 0.048
cosine_recall@1 0.32 0.15
cosine_recall@3 0.5 0.23
cosine_recall@5 0.56 0.3
cosine_recall@10 0.7 0.43
cosine_ndcg@10 0.4962 0.2725
cosine_mrr@10 0.4335 0.2321
cosine_map@100 0.4442 0.2344
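For reference, cosine_accuracy@k counts a query as correct when at least one relevant document appears in the top-k results ranked by cosine similarity. A toy sketch of that computation (the function name and data are illustrative, not the evaluator's API):

```python
import numpy as np

def accuracy_at_k(query_emb, doc_emb, relevant, k):
    # query_emb: (Q, D), doc_emb: (N, D), both row-normalized;
    # relevant: one set of relevant doc indices per query.
    scores = query_emb @ doc_emb.T                 # cosine similarity matrix
    topk = np.argsort(-scores, axis=1)[:, :k]      # best k doc indices per query
    hits = [bool(set(row) & rel) for row, rel in zip(topk.tolist(), relevant)]
    return sum(hits) / len(hits)

rng = np.random.default_rng(0)
docs = rng.normal(size=(10, 4))
docs /= np.linalg.norm(docs, axis=1, keepdims=True)

# Self-retrieval sanity check: every doc's nearest neighbor is itself
print(accuracy_at_k(docs, docs, [{i} for i in range(10)], k=1))  # 1.0
```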

Nano BEIR

  • Dataset: NanoBEIR_mean
  • Evaluated with NanoBEIREvaluator with these parameters:
    {
        "dataset_names": [
            "msmarco",
            "nq"
        ],
        "dataset_id": "lightonai/NanoBEIR-en"
    }
    
Metric Value
cosine_accuracy@1 0.24
cosine_accuracy@3 0.38
cosine_accuracy@5 0.44
cosine_accuracy@10 0.58
cosine_precision@1 0.24
cosine_precision@3 0.1267
cosine_precision@5 0.09
cosine_precision@10 0.059
cosine_recall@1 0.235
cosine_recall@3 0.365
cosine_recall@5 0.43
cosine_recall@10 0.565
cosine_ndcg@10 0.3844
cosine_mrr@10 0.3328
cosine_map@100 0.3393

Training Details

Training Dataset

Unnamed Dataset

  • Size: 111,470 training samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:
    Column Type Min tokens Mean tokens Max tokens
    anchor string 4 10.95 60
    positive string 6 67.57 128
    negative string 7 66.64 128
  • Samples:
    anchor positive negative
    how far is sandos caracol eco resort from cancun airport The Sandos Caracol Eco Resort is 2 miles from the Church of Guadalupe and a 45-minute drive from Cancun Cancún. Airport The Gran Coral Golf Riviera maya is located within the same estate as The. Sandos we speak your! Language Hotel: rooms, 680 Hotel: Chain Sandos & Hotels. resorts Featuring a spa, 8 restaurants and 2 outdoor pools, Sandos Caracol Eco Resort is set on Playa del Carmen Beach, overlooking Cozumel Island. Its rooms have balconies overlooking the Caribbean Sea. Sandos Caracol Eco Resort is in beautiful gardens and features bright accommodations.
    can eggs expire Here is a link from Georgia Eggs Commission about eggs and expiration dates. The following is from Swedish Medical Center Eggs: If you ve purchased a carton of eggs before the date expires, you should be able to use them safely for three to five weeks after expiration.ere is a link from Georgia Eggs Commission about eggs and expiration dates. The following is from Swedish Medical Center Eggs: If you ve purchased a carton of eggs before the date expires, you should be able to use them safely for three to five weeks after expiration. The answer to this question may surprise you: while uncooked eggs typically last four to five weeks when properly refrigerated, hard-boiled eggs will only last about a week. This is because egg shells, which are highly porous, are sprayed before sale with a thin coating of mineral oil that seals the egg.
    how old are first graders? First Grade Worksheets Online. 6 and 7 year old kids get their first taste of real schooling in first grade. Help children learn the basics in math, reading, language and science with our printable first grade worksheets. Spelling Worksheets for 1st Grade. Average BMI percentile-for-age values were 59.5 (28.8) for first-graders, 59.5 (30.5) for third-graders, and 62.4 (31.7) for fifth-graders. The number of participants classified as obese was 144 (25.6% of first-graders, 28.5% of third-graders, and 34.5% of fifth-graders). The percentage of students who reported a reasonable height or weight ranged from 20% (first grade, height) to 92% (fifth grade, weight) (Table). In general, self-report ability was better in older children and when self-reporting weight.
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 3.0,
        "similarity_fct": "cos_sim",
        "gather_across_devices": false
    }
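    With scale 3.0 and cos_sim, this loss is cross-entropy over scaled in-batch cosine similarities: each anchor's own positive is the target, and the other positives (plus the explicit negative column) act as negatives. A torch-only sketch of the core computation, without the explicit negatives, assuming unit-normalized embeddings (illustrative, not the library's implementation):

    ```python
    import torch
    import torch.nn.functional as F

    def mnrl(anchors, positives, scale=3.0):
        # Entry (i, j) scores anchor i against positive j; the true pair sits
        # on the diagonal, every other in-batch positive is a negative.
        scores = anchors @ positives.T * scale   # scaled cosine similarities
        labels = torch.arange(scores.size(0))
        return F.cross_entropy(scores, labels)

    torch.manual_seed(0)
    a = F.normalize(torch.randn(8, 384), dim=1)
    p = F.normalize(torch.randn(8, 384), dim=1)
    loss = mnrl(a, p)
    print(float(loss))  # near ln(8) ≈ 2.08 for random, uncorrelated embeddings
    ```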
    

Evaluation Dataset

Unnamed Dataset

  • Size: 12,386 evaluation samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:
    Column Type Min tokens Mean tokens Max tokens
    anchor string 4 11.11 66
    positive string 7 67.99 128
    negative string 7 66.08 128
  • Samples:
    anchor positive negative
    In 1883 , the first schools were built in the vicinity for 400 white and 60 black students . In 1883 , the first schools were built in the vicinity for 400 white and 60 black students . In 1883 , the first schools in the area were built for 400 black students and 60 white students .
    what is the origin of the name haja Haja is a Muslim baby Girl name, it is an Arabic originated name. Haja name meaning is In the heart condition through and the lucky number associated with Haja is 5. Find all the relevant details about the Haja Meaning, Origin, Lucky Number and Religion from this page. Average rating of Haja is 1 stars, based on 0 reviews. Synonomis with the exclamation commonly used in urban circles Holla. Haba is derived from the term, Holla Bitches, which became Haba Litches, which eventually evolved to Habalicious, and finally became just Haba. When seeing a fine female passing by, Russell exclaimed, Haba.
    what causes itch rash A rash is a noticeable change in the texture or color of the skin. The skin may become itchy, bumpy, chapped, scaly, or otherwise irritated. Rashes are caused by a wide range of conditions, including allergies, medication, cosmetics, and various diseases. The rash is often reddish and itchy, with a scaly texture. 2 bug bites: tick bites are of particular concern, as they can transmit disease. 3 psoriasis: a scaly, itchy, red rash that forms along the scalp and joints. 4 dandruff: an itchy, flaky rash on the scalp. Causes of Similar Symptoms to Behind knee rash. Research the causes of these symptoms that are similar to, or related to, the symptom Behind knee rash: 1 Behind knee itch (14 causes). 2 Knee rash (18 causes).3 Knee pain (122 causes). 4 Knee tingling (6 causes). 5 Knee symptoms (149 causes). 6 Skin itch (1068 causes). 7 Skin rash (461 causes). 8 Insect bite.auses of Similar Symptoms to Behind knee rash. Research the causes of these symptoms that are similar to, or related to, the symptom Behind knee rash: 1 Behind knee itch (14 causes). 2 Knee rash (18 causes).
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 3.0,
        "similarity_fct": "cos_sim",
        "gather_across_devices": false
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • learning_rate: 0.0001
  • weight_decay: 0.001
  • max_steps: 5062
  • warmup_ratio: 0.1
  • fp16: True
  • dataloader_drop_last: True
  • dataloader_num_workers: 1
  • dataloader_prefetch_factor: 1
  • load_best_model_at_end: True
  • optim: adamw_torch
  • ddp_find_unused_parameters: False
  • push_to_hub: True
  • hub_model_id: redis/model-b-structured
  • eval_on_start: True
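
The non-default settings above map onto the trainer configuration roughly as follows. This is a config sketch assembled from the listed values, not the exact training script; output_dir is a placeholder:

```python
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output",                 # placeholder, not from the card
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=1e-4,
    weight_decay=0.001,
    max_steps=5062,
    warmup_ratio=0.1,
    fp16=True,
    dataloader_drop_last=True,
    dataloader_num_workers=1,
    load_best_model_at_end=True,
    push_to_hub=True,
    hub_model_id="redis/model-b-structured",
    eval_on_start=True,
)
```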

All Hyperparameters

Click to expand
  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 0.0001
  • weight_decay: 0.001
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 3.0
  • max_steps: 5062
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: True
  • dataloader_num_workers: 1
  • dataloader_prefetch_factor: 1
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • project: huggingface
  • trackio_space_id: trackio
  • ddp_find_unused_parameters: False
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: True
  • resume_from_checkpoint: None
  • hub_model_id: redis/model-b-structured
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: no
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: True
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: True
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss Validation Loss NanoMSMARCO_cosine_ndcg@10 NanoNQ_cosine_ndcg@10 NanoBEIR_mean_cosine_ndcg@10
0 0 - 3.3212 0.5540 0.5931 0.5735
0.2874 250 3.2509 3.0429 0.4590 0.4189 0.4389
0.5747 500 3.1458 3.0222 0.4855 0.3752 0.4303
0.8621 750 3.1119 3.0053 0.4708 0.3715 0.4211
1.1494 1000 3.0646 2.9901 0.4632 0.3600 0.4116
1.4368 1250 3.0381 2.9852 0.5014 0.3426 0.4220
1.7241 1500 3.0301 2.9781 0.4967 0.3029 0.3998
2.0115 1750 3.0238 2.9768 0.4706 0.2717 0.3712
2.2989 2000 2.9739 2.9735 0.4828 0.2734 0.3781
2.5862 2250 2.9709 2.9696 0.4896 0.2257 0.3576
2.8736 2500 2.9652 2.9693 0.4816 0.2553 0.3684
3.1609 2750 2.9475 2.9720 0.4815 0.2618 0.3717
3.4483 3000 2.9313 2.9715 0.5048 0.2831 0.3939
3.7356 3250 2.9309 2.9705 0.4606 0.2879 0.3743
4.0230 3500 2.9264 2.9712 0.5049 0.2774 0.3911
4.3103 3750 2.9056 2.9722 0.4758 0.2532 0.3645
4.5977 4000 2.9056 2.9708 0.5004 0.2724 0.3864
4.8851 4250 2.9038 2.9705 0.5066 0.2675 0.3870
5.1724 4500 2.8932 2.9729 0.4890 0.2627 0.3759
5.4598 4750 2.8884 2.9710 0.5016 0.2822 0.3919
5.7471 5000 2.8876 2.9712 0.4962 0.2725 0.3844

Framework Versions

  • Python: 3.10.18
  • Sentence Transformers: 5.2.0
  • Transformers: 4.57.3
  • PyTorch: 2.9.1+cu128
  • Accelerate: 1.12.0
  • Datasets: 2.21.0
  • Tokenizers: 0.22.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}