all-MiniLM-L6-v17-pair_score

This is a sentence-transformers model fine-tuned from sentence-transformers/all-MiniLM-L6-v2 on the pairs_three_scores_v13_synonyms_added dataset. It maps sentences and paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset: pairs_three_scores_v13_synonyms_added

Model Sources

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
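The forward pass these three modules describe can be sketched in plain NumPy. This is an illustrative approximation, not the library code: random values stand in for the BertModel token embeddings, and the helper name is made up.

```python
import numpy as np

def mean_pool_and_normalize(token_embeddings, attention_mask):
    """Mimic the Pooling (pooling_mode_mean_tokens) and Normalize modules.

    token_embeddings: (seq_len, dim) transformer output
    attention_mask:   (seq_len,) 1 for real tokens, 0 for padding
    """
    mask = attention_mask[:, None].astype(float)    # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)  # sum over real tokens only
    pooled = summed / mask.sum()                    # mean pooling
    return pooled / np.linalg.norm(pooled)          # L2 normalize

# Dummy stand-in for BertModel output: 6 tokens, 384 dims, last 2 are padding
rng = np.random.default_rng(0)
tokens = rng.normal(size=(6, 384))
mask = np.array([1, 1, 1, 1, 0, 0])

emb = mean_pool_and_normalize(tokens, mask)
print(emb.shape)  # (384,)
```

Because padding positions are masked out before the mean, the embedding depends only on real tokens, and the final normalization guarantees unit length.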

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("KhaledReda/all-MiniLM-L6-v17-pair_score")
# Run inference
sentences = [
    'kite',
    'ramadan kaftan clutch',
    'side pocket boardshorts',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.6999, 0.6212],
#         [0.6999, 1.0000, 0.7124],
#         [0.6212, 0.7124, 1.0000]])
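Since the final Normalize module makes every embedding unit-length, the cosine similarity that model.similarity computes reduces to a plain dot product. A small NumPy sketch, with hypothetical random unit vectors standing in for real embeddings:

```python
import numpy as np

# Hypothetical embeddings; real ones would come from model.encode(...)
rng = np.random.default_rng(42)
embeddings = rng.normal(size=(3, 384))
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)  # unit length

# For unit vectors, cosine similarity is just the Gram matrix of dot products
similarities = embeddings @ embeddings.T
print(similarities.shape)  # (3, 3); the diagonal is exactly 1.0
```

This is why the diagonal of the similarity matrix above is 1.0000: each sentence has cosine similarity 1 with itself.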

Training Details

Training Dataset

pairs_three_scores_v13_synonyms_added

  • Dataset: pairs_three_scores_v13_synonyms_added at 10e49f8
  • Size: 9,229,520 training samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:

    |         | sentence1                                        | sentence2                                        | score                           |
    |---------|--------------------------------------------------|--------------------------------------------------|---------------------------------|
    | type    | string                                           | string                                           | float                           |
    | details | min: 3 tokens, mean: 5.65 tokens, max: 18 tokens | min: 3 tokens, mean: 5.69 tokens, max: 16 tokens | min: 0.15, mean: 0.42, max: 1.0 |

  • Samples:

    | sentence1              | sentence2                    | score |
    |------------------------|------------------------------|-------|
    | kettlebell             | bag                          | 0.22  |
    | mixed berry milk shake | elasticized waistband shorts | 0.21  |
    | raw linden honey       | refresher sponge             | 0.22  |

  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }

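The CoSENTLoss objective above can be sketched in NumPy: for every pair of training examples whose gold scores are ordered one way, it penalizes predicted cosine similarities ordered the other way. This is a rough reading of the loss as described in the CoSENT blog post cited at the end of this card, not the sentence-transformers implementation; the function name and toy values are made up.

```python
import numpy as np

def cosent_loss(cos_sims, labels, scale=20.0):
    """Sketch of CoSENT: loss = log(1 + sum over pairs (i, j) with
    labels[i] > labels[j] of exp(scale * (cos_sims[j] - cos_sims[i])))."""
    cos_sims = np.asarray(cos_sims, dtype=float)
    labels = np.asarray(labels, dtype=float)
    terms = []
    for i in range(len(labels)):
        for j in range(len(labels)):
            if labels[i] > labels[j]:
                # penalize the lower-scored pair being more similar
                terms.append(np.exp(scale * (cos_sims[j] - cos_sims[i])))
    return float(np.log1p(np.sum(terms)))

# Toy batch: similarities ordered consistently with the scores -> small loss
print(cosent_loss([0.9, 0.3], [1.0, 0.2]))  # near zero
# Inverted ordering -> large loss
print(cosent_loss([0.3, 0.9], [1.0, 0.2]))
```

The "scale": 20.0 parameter sharpens the penalty, and "pairwise_cos_sim" is the similarity function applied to the two sentence embeddings before this ranking term is computed.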
Evaluation Dataset

pairs_three_scores_v13_synonyms_added

  • Dataset: pairs_three_scores_v13_synonyms_added at 10e49f8
  • Size: 46,380 evaluation samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:

    |         | sentence1                                         | sentence2                                         | score                           |
    |---------|---------------------------------------------------|---------------------------------------------------|---------------------------------|
    | type    | string                                            | string                                            | float                           |
    | details | min: 3 tokens, mean: 5.69 tokens, max: 115 tokens | min: 3 tokens, mean: 5.77 tokens, max: 115 tokens | min: 0.15, mean: 0.42, max: 1.0 |

  • Samples:

    | sentence1             | sentence2             | score |
    |-----------------------|-----------------------|-------|
    | bag                   | nude rocks            | 0.24  |
    | semi natural necklace | 21 kt plated necklace | 1.0   |
    | eco friendly coasters | measuring cup         | 0.23  |

  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • fp16: True
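These hyperparameters pin down the step count: 9,229,520 training samples at a per-device batch size of 128 over one epoch give about 72,106 optimizer steps, and warmup_ratio 0.1 then corresponds to about 7,210 warmup steps (assuming a single device and no gradient accumulation, as in the full hyperparameter list). The training log, which ends near step 72,100 at epoch 0.9999, is consistent with this arithmetic:

```python
import math

num_samples = 9_229_520
batch_size = 128      # per_device_train_batch_size, single device assumed
num_epochs = 1
warmup_ratio = 0.1

steps_per_epoch = math.ceil(num_samples / batch_size)
total_steps = steps_per_epoch * num_epochs
warmup_steps = int(total_steps * warmup_ratio)

print(total_steps, warmup_steps)  # 72106 7210
```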

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}
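The lr_scheduler_type: linear setting combined with warmup_ratio: 0.1 implies the usual warmup-then-linear-decay shape. A minimal sketch, assuming the step counts derived from the dataset size and batch size (the actual transformers scheduler behaves this way up to step indexing details):

```python
def linear_lr(step, total_steps=72_106, warmup_steps=7_210, peak_lr=2e-5):
    """Linear schedule with warmup: ramp from 0 to peak_lr over
    warmup_steps, then decay linearly back to 0 at total_steps."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

print(linear_lr(0), linear_lr(7_210), linear_lr(72_106))  # 0.0 2e-05 0.0
```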

Training Logs

Epoch Step Training Loss Validation Loss
0.0014 100 11.7561 -
0.0028 200 11.739 -
0.0042 300 11.2175 -
0.0055 400 11.0759 -
0.0069 500 10.7749 10.9497
0.0083 600 10.4026 -
0.0097 700 10.2194 -
0.0111 800 9.834 -
0.0125 900 9.6126 -
0.0139 1000 9.3563 9.3834
0.0153 1100 9.0716 -
0.0166 1200 8.9245 -
0.0180 1300 8.7384 -
0.0194 1400 8.6381 -
0.0208 1500 8.6089 8.5228
0.0222 1600 8.5817 -
0.0236 1700 8.5418 -
0.0250 1800 8.532 -
0.0264 1900 8.5107 -
0.0277 2000 8.4917 8.4366
0.0291 2100 8.485 -
0.0305 2200 8.4826 -
0.0319 2300 8.4512 -
0.0333 2400 8.4694 -
0.0347 2500 8.4485 8.3778
0.0361 2600 8.4293 -
0.0374 2700 8.4222 -
0.0388 2800 8.4031 -
0.0402 2900 8.3947 -
0.0416 3000 8.3912 8.3335
0.0430 3100 8.3913 -
0.0444 3200 8.3822 -
0.0458 3300 8.3552 -
0.0472 3400 8.3759 -
0.0485 3500 8.3632 8.2942
0.0499 3600 8.3495 -
0.0513 3700 8.3385 -
0.0527 3800 8.3346 -
0.0541 3900 8.3249 -
0.0555 4000 8.3033 8.2534
0.0569 4100 8.3141 -
0.0582 4200 8.3015 -
0.0596 4300 8.2982 -
0.0610 4400 8.3006 -
0.0624 4500 8.2972 8.2175
0.0638 4600 8.2757 -
0.0652 4700 8.2765 -
0.0666 4800 8.2668 -
0.0680 4900 8.2472 -
0.0693 5000 8.2605 8.1990
0.0707 5100 8.2481 -
0.0721 5200 8.2598 -
0.0735 5300 8.2403 -
0.0749 5400 8.2388 -
0.0763 5500 8.2074 8.1497
0.0777 5600 8.2236 -
0.0791 5700 8.2204 -
0.0804 5800 8.2086 -
0.0818 5900 8.208 -
0.0832 6000 8.1991 8.1357
0.0846 6100 8.2064 -
0.0860 6200 8.1969 -
0.0874 6300 8.1795 -
0.0888 6400 8.1846 -
0.0901 6500 8.188 8.1128
0.0915 6600 8.1902 -
0.0929 6700 8.1624 -
0.0943 6800 8.1527 -
0.0957 6900 8.1589 -
0.0971 7000 8.1624 8.0843
0.0985 7100 8.1705 -
0.0999 7200 8.1362 -
0.1012 7300 8.1419 -
0.1026 7400 8.1564 -
0.1040 7500 8.1422 8.0581
0.1054 7600 8.1214 -
0.1068 7700 8.1369 -
0.1082 7800 8.1024 -
0.1096 7900 8.0974 -
0.1109 8000 8.1316 8.0378
0.1123 8100 8.1185 -
0.1137 8200 8.1148 -
0.1151 8300 8.1015 -
0.1165 8400 8.0851 -
0.1179 8500 8.0881 8.0091
0.1193 8600 8.0734 -
0.1207 8700 8.0644 -
0.1220 8800 8.0802 -
0.1234 8900 8.0827 -
0.1248 9000 8.0934 8.0049
0.1262 9100 8.0544 -
0.1276 9200 8.0828 -
0.1290 9300 8.0844 -
0.1304 9400 8.0598 -
0.1318 9500 8.0575 7.9784
0.1331 9600 8.0476 -
0.1345 9700 8.0617 -
0.1359 9800 8.0632 -
0.1373 9900 8.0398 -
0.1387 10000 8.0455 7.9625
0.1401 10100 8.0441 -
0.1415 10200 8.0462 -
0.1428 10300 8.0429 -
0.1442 10400 8.0332 -
0.1456 10500 8.0087 7.9579
0.1470 10600 8.0374 -
0.1484 10700 8.0243 -
0.1498 10800 8.0445 -
0.1512 10900 8.0155 -
0.1526 11000 8.0161 7.9321
0.1539 11100 8.0092 -
0.1553 11200 8.0041 -
0.1567 11300 8.0165 -
0.1581 11400 8.005 -
0.1595 11500 7.9992 7.9243
0.1609 11600 8.0109 -
0.1623 11700 8.0096 -
0.1636 11800 8.0176 -
0.1650 11900 7.9965 -
0.1664 12000 8.0159 7.9092
0.1678 12100 7.9865 -
0.1692 12200 7.9742 -
0.1706 12300 7.9757 -
0.1720 12400 7.9852 -
0.1734 12500 8.0068 7.8931
0.1747 12600 7.9616 -
0.1761 12700 7.9889 -
0.1775 12800 7.9795 -
0.1789 12900 7.9657 -
0.1803 13000 7.952 7.8785
0.1817 13100 7.9534 -
0.1831 13200 7.9212 -
0.1845 13300 7.9479 -
0.1858 13400 7.9433 -
0.1872 13500 7.9599 7.8757
0.1886 13600 7.9751 -
0.1900 13700 7.9564 -
0.1914 13800 7.9642 -
0.1928 13900 7.9511 -
0.1942 14000 7.9458 7.8580
0.1955 14100 7.9625 -
0.1969 14200 7.9728 -
0.1983 14300 7.9235 -
0.1997 14400 7.9658 -
0.2011 14500 7.9567 7.8480
0.2025 14600 7.9214 -
0.2039 14700 7.8983 -
0.2053 14800 7.9334 -
0.2066 14900 7.9345 -
0.2080 15000 7.9245 7.8334
0.2094 15100 7.9144 -
0.2108 15200 7.9375 -
0.2122 15300 7.9058 -
0.2136 15400 7.9365 -
0.2150 15500 7.9101 7.8291
0.2163 15600 7.9001 -
0.2177 15700 7.8906 -
0.2191 15800 7.9103 -
0.2205 15900 7.8899 -
0.2219 16000 7.8874 7.8157
0.2233 16100 7.9011 -
0.2247 16200 7.92 -
0.2261 16300 7.8933 -
0.2274 16400 7.886 -
0.2288 16500 7.8959 7.8097
0.2302 16600 7.8623 -
0.2316 16700 7.8703 -
0.2330 16800 7.8934 -
0.2344 16900 7.8651 -
0.2358 17000 7.91 7.8047
0.2372 17100 7.8794 -
0.2385 17200 7.8794 -
0.2399 17300 7.8723 -
0.2413 17400 7.9007 -
0.2427 17500 7.8568 7.7977
0.2441 17600 7.8855 -
0.2455 17700 7.8687 -
0.2469 17800 7.8708 -
0.2482 17900 7.8533 -
0.2496 18000 7.87 7.8019
0.2510 18100 7.8364 -
0.2524 18200 7.8901 -
0.2538 18300 7.8782 -
0.2552 18400 7.8817 -
0.2566 18500 7.8794 7.7774
0.2580 18600 7.9031 -
0.2593 18700 7.8897 -
0.2607 18800 7.8741 -
0.2621 18900 7.8774 -
0.2635 19000 7.8771 7.7696
0.2649 19100 7.839 -
0.2663 19200 7.8786 -
0.2677 19300 7.866 -
0.2690 19400 7.868 -
0.2704 19500 7.8804 7.7672
0.2718 19600 7.8398 -
0.2732 19700 7.8662 -
0.2746 19800 7.8341 -
0.2760 19900 7.86 -
0.2774 20000 7.8325 7.7518
0.2788 20100 7.7957 -
0.2801 20200 7.8478 -
0.2815 20300 7.8601 -
0.2829 20400 7.8395 -
0.2843 20500 7.8414 7.7452
0.2857 20600 7.8332 -
0.2871 20700 7.862 -
0.2885 20800 7.8007 -
0.2899 20900 7.8249 -
0.2912 21000 7.8237 7.7456
0.2926 21100 7.8616 -
0.2940 21200 7.865 -
0.2954 21300 7.8226 -
0.2968 21400 7.8245 -
0.2982 21500 7.809 7.7333
0.2996 21600 7.8026 -
0.3009 21700 7.8169 -
0.3023 21800 7.8201 -
0.3037 21900 7.8057 -
0.3051 22000 7.8237 7.7258
0.3065 22100 7.82 -
0.3079 22200 7.819 -
0.3093 22300 7.7972 -
0.3107 22400 7.8141 -
0.3120 22500 7.8135 7.7245
0.3134 22600 7.7951 -
0.3148 22700 7.8051 -
0.3162 22800 7.7968 -
0.3176 22900 7.8256 -
0.3190 23000 7.8407 7.7135
0.3204 23100 7.8241 -
0.3217 23200 7.8195 -
0.3231 23300 7.7964 -
0.3245 23400 7.8166 -
0.3259 23500 7.821 7.6989
0.3273 23600 7.8125 -
0.3287 23700 7.7913 -
0.3301 23800 7.7958 -
0.3315 23900 7.7988 -
0.3328 24000 7.8148 7.7022
0.3342 24100 7.7964 -
0.3356 24200 7.7924 -
0.3370 24300 7.7783 -
0.3384 24400 7.8008 -
0.3398 24500 7.7745 7.6911
0.3412 24600 7.8002 -
0.3426 24700 7.7984 -
0.3439 24800 7.8212 -
0.3453 24900 7.7789 -
0.3467 25000 7.7609 7.6880
0.3481 25100 7.792 -
0.3495 25200 7.8064 -
0.3509 25300 7.7851 -
0.3523 25400 7.784 -
0.3536 25500 7.7905 7.6772
0.3550 25600 7.8252 -
0.3564 25700 7.766 -
0.3578 25800 7.7424 -
0.3592 25900 7.779 -
0.3606 26000 7.7701 7.6759
0.3620 26100 7.774 -
0.3634 26200 7.7752 -
0.3647 26300 7.7928 -
0.3661 26400 7.7525 -
0.3675 26500 7.7783 7.6744
0.3689 26600 7.7618 -
0.3703 26700 7.8067 -
0.3717 26800 7.7771 -
0.3731 26900 7.7936 -
0.3744 27000 7.7499 7.6710
0.3758 27100 7.7629 -
0.3772 27200 7.7843 -
0.3786 27300 7.7735 -
0.3800 27400 7.7662 -
0.3814 27500 7.7453 7.6658
0.3828 27600 7.7417 -
0.3842 27700 7.7793 -
0.3855 27800 7.7535 -
0.3869 27900 7.7695 -
0.3883 28000 7.758 7.6481
0.3897 28100 7.7391 -
0.3911 28200 7.7447 -
0.3925 28300 7.7691 -
0.3939 28400 7.7555 -
0.3953 28500 7.752 7.6460
0.3966 28600 7.7272 -
0.3980 28700 7.7464 -
0.3994 28800 7.7415 -
0.4008 28900 7.7616 -
0.4022 29000 7.7661 7.6477
0.4036 29100 7.7352 -
0.4050 29200 7.7438 -
0.4063 29300 7.7468 -
0.4077 29400 7.768 -
0.4091 29500 7.7581 7.6392
0.4105 29600 7.7374 -
0.4119 29700 7.7307 -
0.4133 29800 7.7292 -
0.4147 29900 7.7543 -
0.4161 30000 7.7435 7.6337
0.4174 30100 7.751 -
0.4188 30200 7.7264 -
0.4202 30300 7.7366 -
0.4216 30400 7.7137 -
0.4230 30500 7.7625 7.6239
0.4244 30600 7.7006 -
0.4258 30700 7.7571 -
0.4271 30800 7.722 -
0.4285 30900 7.7209 -
0.4299 31000 7.7159 7.6189
0.4313 31100 7.7058 -
0.4327 31200 7.7407 -
0.4341 31300 7.7093 -
0.4355 31400 7.7172 -
0.4369 31500 7.7532 7.6187
0.4382 31600 7.7254 -
0.4396 31700 7.716 -
0.4410 31800 7.7231 -
0.4424 31900 7.7272 -
0.4438 32000 7.7214 7.6153
0.4452 32100 7.7325 -
0.4466 32200 7.7268 -
0.4480 32300 7.6801 -
0.4493 32400 7.7209 -
0.4507 32500 7.6958 7.6057
0.4521 32600 7.6903 -
0.4535 32700 7.7379 -
0.4549 32800 7.7245 -
0.4563 32900 7.7506 -
0.4577 33000 7.7095 7.6051
0.4590 33100 7.7148 -
0.4604 33200 7.7182 -
0.4618 33300 7.7307 -
0.4632 33400 7.7381 -
0.4646 33500 7.7214 7.6028
0.4660 33600 7.6882 -
0.4674 33700 7.6864 -
0.4688 33800 7.6718 -
0.4701 33900 7.7201 -
0.4715 34000 7.7173 7.6092
0.4729 34100 7.6805 -
0.4743 34200 7.7264 -
0.4757 34300 7.7013 -
0.4771 34400 7.7074 -
0.4785 34500 7.7044 7.6044
0.4798 34600 7.742 -
0.4812 34700 7.7104 -
0.4826 34800 7.7004 -
0.4840 34900 7.7175 -
0.4854 35000 7.687 7.5947
0.4868 35100 7.7024 -
0.4882 35200 7.6666 -
0.4896 35300 7.6869 -
0.4909 35400 7.7147 -
0.4923 35500 7.7281 7.5804
0.4937 35600 7.6852 -
0.4951 35700 7.6735 -
0.4965 35800 7.7043 -
0.4979 35900 7.6884 -
0.4993 36000 7.7233 7.5851
0.5007 36100 7.6914 -
0.5020 36200 7.7083 -
0.5034 36300 7.6876 -
0.5048 36400 7.6909 -
0.5062 36500 7.679 7.5862
0.5076 36600 7.6884 -
0.5090 36700 7.6697 -
0.5104 36800 7.6625 -
0.5117 36900 7.6881 -
0.5131 37000 7.6859 7.5844
0.5145 37100 7.6624 -
0.5159 37200 7.6932 -
0.5173 37300 7.6851 -
0.5187 37400 7.6941 -
0.5201 37500 7.6473 7.5810
0.5215 37600 7.6619 -
0.5228 37700 7.6789 -
0.5242 37800 7.6842 -
0.5256 37900 7.6686 -
0.5270 38000 7.6677 7.5784
0.5284 38100 7.7113 -
0.5298 38200 7.6863 -
0.5312 38300 7.664 -
0.5325 38400 7.6928 -
0.5339 38500 7.685 7.5819
0.5353 38600 7.6507 -
0.5367 38700 7.6848 -
0.5381 38800 7.6435 -
0.5395 38900 7.6421 -
0.5409 39000 7.6883 7.5664
0.5423 39100 7.6907 -
0.5436 39200 7.6919 -
0.5450 39300 7.6956 -
0.5464 39400 7.6592 -
0.5478 39500 7.6488 7.5738
0.5492 39600 7.6918 -
0.5506 39700 7.6725 -
0.5520 39800 7.6804 -
0.5534 39900 7.6598 -
0.5547 40000 7.6888 7.5581
0.5561 40100 7.6732 -
0.5575 40200 7.7042 -
0.5589 40300 7.6626 -
0.5603 40400 7.7271 -
0.5617 40500 7.6753 7.5562
0.5631 40600 7.6521 -
0.5644 40700 7.667 -
0.5658 40800 7.6823 -
0.5672 40900 7.6635 -
0.5686 41000 7.6609 7.5553
0.5700 41100 7.6609 -
0.5714 41200 7.6712 -
0.5728 41300 7.6687 -
0.5742 41400 7.7182 -
0.5755 41500 7.6335 7.5660
0.5769 41600 7.6791 -
0.5783 41700 7.6509 -
0.5797 41800 7.6497 -
0.5811 41900 7.6514 -
0.5825 42000 7.6288 7.5552
0.5839 42100 7.6699 -
0.5852 42200 7.6824 -
0.5866 42300 7.68 -
0.5880 42400 7.661 -
0.5894 42500 7.6573 7.5487
0.5908 42600 7.6702 -
0.5922 42700 7.6573 -
0.5936 42800 7.6546 -
0.5950 42900 7.6424 -
0.5963 43000 7.6721 7.5504
0.5977 43100 7.6713 -
0.5991 43200 7.6695 -
0.6005 43300 7.6817 -
0.6019 43400 7.6484 -
0.6033 43500 7.6062 7.5481
0.6047 43600 7.6397 -
0.6061 43700 7.6555 -
0.6074 43800 7.6546 -
0.6088 43900 7.6781 -
0.6102 44000 7.6284 7.5399
0.6116 44100 7.666 -
0.6130 44200 7.6597 -
0.6144 44300 7.6651 -
0.6158 44400 7.6475 -
0.6171 44500 7.6565 7.5369
0.6185 44600 7.6336 -
0.6199 44700 7.6421 -
0.6213 44800 7.646 -
0.6227 44900 7.6319 -
0.6241 45000 7.664 7.5368
0.6255 45100 7.6515 -
0.6269 45200 7.6525 -
0.6282 45300 7.6534 -
0.6296 45400 7.655 -
0.6310 45500 7.6712 7.5278
0.6324 45600 7.6342 -
0.6338 45700 7.6077 -
0.6352 45800 7.6476 -
0.6366 45900 7.6412 -
0.6379 46000 7.6546 7.5331
0.6393 46100 7.6378 -
0.6407 46200 7.6572 -
0.6421 46300 7.6284 -
0.6435 46400 7.625 -
0.6449 46500 7.6526 7.5338
0.6463 46600 7.6172 -
0.6477 46700 7.6136 -
0.6490 46800 7.6428 -
0.6504 46900 7.6277 -
0.6518 47000 7.6903 7.5272
0.6532 47100 7.6313 -
0.6546 47200 7.6214 -
0.6560 47300 7.6044 -
0.6574 47400 7.6098 -
0.6588 47500 7.6477 7.5203
0.6601 47600 7.6454 -
0.6615 47700 7.6199 -
0.6629 47800 7.6119 -
0.6643 47900 7.6241 -
0.6657 48000 7.6414 7.5189
0.6671 48100 7.6629 -
0.6685 48200 7.6777 -
0.6698 48300 7.6217 -
0.6712 48400 7.6097 -
0.6726 48500 7.6449 7.5183
0.6740 48600 7.6131 -
0.6754 48700 7.622 -
0.6768 48800 7.6373 -
0.6782 48900 7.6193 -
0.6796 49000 7.6119 7.5209
0.6809 49100 7.6261 -
0.6823 49200 7.626 -
0.6837 49300 7.6232 -
0.6851 49400 7.5951 -
0.6865 49500 7.6368 7.5136
0.6879 49600 7.6641 -
0.6893 49700 7.6046 -
0.6906 49800 7.5923 -
0.6920 49900 7.6119 -
0.6934 50000 7.6301 7.5130
0.6948 50100 7.6288 -
0.6962 50200 7.6338 -
0.6976 50300 7.6137 -
0.6990 50400 7.6473 -
0.7004 50500 7.589 7.5153
0.7017 50600 7.6076 -
0.7031 50700 7.5906 -
0.7045 50800 7.6102 -
0.7059 50900 7.6463 -
0.7073 51000 7.6695 7.5098
0.7087 51100 7.5947 -
0.7101 51200 7.6097 -
0.7115 51300 7.6397 -
0.7128 51400 7.6072 -
0.7142 51500 7.6112 7.5103
0.7156 51600 7.639 -
0.7170 51700 7.6188 -
0.7184 51800 7.6198 -
0.7198 51900 7.6229 -
0.7212 52000 7.6323 7.5050
0.7225 52100 7.6275 -
0.7239 52200 7.6012 -
0.7253 52300 7.6187 -
0.7267 52400 7.6191 -
0.7281 52500 7.6232 7.5109
0.7295 52600 7.6199 -
0.7309 52700 7.5819 -
0.7323 52800 7.6474 -
0.7336 52900 7.6124 -
0.7350 53000 7.622 7.5000
0.7364 53100 7.6184 -
0.7378 53200 7.5761 -
0.7392 53300 7.5943 -
0.7406 53400 7.6209 -
0.7420 53500 7.6065 7.5055
0.7434 53600 7.6065 -
0.7447 53700 7.6285 -
0.7461 53800 7.641 -
0.7475 53900 7.633 -
0.7489 54000 7.6184 7.4995
0.7503 54100 7.6198 -
0.7517 54200 7.6239 -
0.7531 54300 7.6087 -
0.7544 54400 7.6112 -
0.7558 54500 7.6372 7.4957
0.7572 54600 7.5938 -
0.7586 54700 7.6091 -
0.7600 54800 7.622 -
0.7614 54900 7.6052 -
0.7628 55000 7.5775 7.4967
0.7642 55100 7.6484 -
0.7655 55200 7.5911 -
0.7669 55300 7.5966 -
0.7683 55400 7.5708 -
0.7697 55500 7.5905 7.4959
0.7711 55600 7.5858 -
0.7725 55700 7.6255 -
0.7739 55800 7.6169 -
0.7752 55900 7.6159 -
0.7766 56000 7.584 7.4929
0.7780 56100 7.6364 -
0.7794 56200 7.558 -
0.7808 56300 7.6095 -
0.7822 56400 7.6049 -
0.7836 56500 7.6079 7.4934
0.7850 56600 7.584 -
0.7863 56700 7.5543 -
0.7877 56800 7.5971 -
0.7891 56900 7.6395 -
0.7905 57000 7.6006 7.4900
0.7919 57100 7.6199 -
0.7933 57200 7.5938 -
0.7947 57300 7.602 -
0.7961 57400 7.6317 -
0.7974 57500 7.6125 7.4891
0.7988 57600 7.6031 -
0.8002 57700 7.6153 -
0.8016 57800 7.6141 -
0.8030 57900 7.5877 -
0.8044 58000 7.6051 7.4896
0.8058 58100 7.6065 -
0.8071 58200 7.5677 -
0.8085 58300 7.6035 -
0.8099 58400 7.6071 -
0.8113 58500 7.6214 7.4800
0.8127 58600 7.5914 -
0.8141 58700 7.6038 -
0.8155 58800 7.6206 -
0.8169 58900 7.6222 -
0.8182 59000 7.6128 7.4801
0.8196 59100 7.6109 -
0.8210 59200 7.5591 -
0.8224 59300 7.5794 -
0.8238 59400 7.6161 -
0.8252 59500 7.5689 7.4824
0.8266 59600 7.6009 -
0.8279 59700 7.6121 -
0.8293 59800 7.5872 -
0.8307 59900 7.6111 -
0.8321 60000 7.5339 7.4813
0.8335 60100 7.5739 -
0.8349 60200 7.5565 -
0.8363 60300 7.5637 -
0.8377 60400 7.5997 -
0.8390 60500 7.592 7.4829
0.8404 60600 7.6004 -
0.8418 60700 7.6007 -
0.8432 60800 7.602 -
0.8446 60900 7.5755 -
0.8460 61000 7.5771 7.4795
0.8474 61100 7.6143 -
0.8488 61200 7.6088 -
0.8501 61300 7.5555 -
0.8515 61400 7.5841 -
0.8529 61500 7.5979 7.4762
0.8543 61600 7.6403 -
0.8557 61700 7.5607 -
0.8571 61800 7.6151 -
0.8585 61900 7.6179 -
0.8598 62000 7.6152 7.4767
0.8612 62100 7.598 -
0.8626 62200 7.6013 -
0.8640 62300 7.5577 -
0.8654 62400 7.6108 -
0.8668 62500 7.5869 7.4716
0.8682 62600 7.559 -
0.8696 62700 7.5963 -
0.8709 62800 7.5884 -
0.8723 62900 7.5922 -
0.8737 63000 7.5915 7.4683
0.8751 63100 7.5473 -
0.8765 63200 7.5829 -
0.8779 63300 7.6122 -
0.8793 63400 7.5863 -
0.8806 63500 7.5764 7.4707
0.8820 63600 7.6258 -
0.8834 63700 7.5862 -
0.8848 63800 7.5977 -
0.8862 63900 7.5708 -
0.8876 64000 7.6024 7.4675
0.8890 64100 7.5625 -
0.8904 64200 7.5474 -
0.8917 64300 7.5978 -
0.8931 64400 7.5505 -
0.8945 64500 7.5741 7.4678
0.8959 64600 7.5763 -
0.8973 64700 7.5528 -
0.8987 64800 7.5787 -
0.9001 64900 7.5631 -
0.9015 65000 7.582 7.4724
0.9028 65100 7.5931 -
0.9042 65200 7.5977 -
0.9056 65300 7.572 -
0.9070 65400 7.6331 -
0.9084 65500 7.5503 7.4660
0.9098 65600 7.5987 -
0.9112 65700 7.611 -
0.9125 65800 7.563 -
0.9139 65900 7.5699 -
0.9153 66000 7.5942 7.4677
0.9167 66100 7.6119 -
0.9181 66200 7.5873 -
0.9195 66300 7.6036 -
0.9209 66400 7.5827 -
0.9223 66500 7.6103 7.4649
0.9236 66600 7.604 -
0.9250 66700 7.6129 -
0.9264 66800 7.5668 -
0.9278 66900 7.5699 -
0.9292 67000 7.6045 7.4626
0.9306 67100 7.5973 -
0.9320 67200 7.5951 -
0.9333 67300 7.5635 -
0.9347 67400 7.5915 -
0.9361 67500 7.5577 7.4619
0.9375 67600 7.5921 -
0.9389 67700 7.5888 -
0.9403 67800 7.5838 -
0.9417 67900 7.5648 -
0.9431 68000 7.5537 7.4616
0.9444 68100 7.5809 -
0.9458 68200 7.5882 -
0.9472 68300 7.5372 -
0.9486 68400 7.584 -
0.9500 68500 7.5821 7.4607
0.9514 68600 7.5663 -
0.9528 68700 7.5734 -
0.9542 68800 7.6026 -
0.9555 68900 7.5928 -
0.9569 69000 7.5415 7.4615
0.9583 69100 7.5785 -
0.9597 69200 7.5925 -
0.9611 69300 7.5922 -
0.9625 69400 7.5559 -
0.9639 69500 7.5759 7.4594
0.9652 69600 7.5753 -
0.9666 69700 7.6039 -
0.9680 69800 7.5791 -
0.9694 69900 7.5905 -
0.9708 70000 7.57 7.4592
0.9722 70100 7.5804 -
0.9736 70200 7.5709 -
0.9750 70300 7.582 -
0.9763 70400 7.6233 -
0.9777 70500 7.556 7.4582
0.9791 70600 7.6028 -
0.9805 70700 7.6149 -
0.9819 70800 7.5763 -
0.9833 70900 7.5904 -
0.9847 71000 7.5607 7.4590
0.9860 71100 7.5826 -
0.9874 71200 7.5704 -
0.9888 71300 7.5656 -
0.9902 71400 7.5879 -
0.9916 71500 7.5943 7.4583
0.9930 71600 7.5359 -
0.9944 71700 7.6152 -
0.9958 71800 7.5791 -
0.9971 71900 7.5845 -
0.9985 72000 7.5487 7.4580
0.9999 72100 7.6124 -

Framework Versions

  • Python: 3.12.3
  • Sentence Transformers: 5.1.0
  • Transformers: 4.55.4
  • PyTorch: 2.5.1+cu121
  • Accelerate: 1.10.1
  • Datasets: 4.0.0
  • Tokenizers: 0.21.4

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CoSENTLoss

@online{kexuefm-8847,
    title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
    author={Su Jianlin},
    year={2022},
    month={Jan},
    url={https://kexue.fm/archives/8847},
}
Model size: 22.7M parameters (F32, safetensors)

Model tree

  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • This model: KhaledReda/all-MiniLM-L6-v17-pair_score