all-MiniLM-L6-v9-pair_score

This is a sentence-transformers model fine-tuned from sentence-transformers/all-MiniLM-L6-v2 on the pairs_three_scores_v8 dataset. It maps sentences and paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Model Size: 22.7M parameters (F32)
  • Training Dataset: pairs_three_scores_v8 (9,164,180 training pairs)

Model Sources

  • Documentation: https://www.sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
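
In plain terms, the stack above is a BERT encoder truncated at 256 tokens, mean pooling over non-padding tokens, and L2 normalization, so dot products between embeddings equal cosine similarities. Below is a minimal sketch reproducing the pipeline by hand with plain transformers; the repo id is this model's Hub id, and a local checkpoint path works the same way.

import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

model_id = "KhaledReda/all-MiniLM-L6-v9-pair_score"
tokenizer = AutoTokenizer.from_pretrained(model_id)
encoder = AutoModel.from_pretrained(model_id)

sentences = ["thermos mug", "sports tracksuit"]
# (0) Transformer: tokenize and encode, truncating at max_seq_length=256
batch = tokenizer(sentences, padding=True, truncation=True,
                  max_length=256, return_tensors="pt")
with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state  # [batch, seq, 384]

# (1) Pooling: mean over tokens, ignoring padding via the attention mask
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

# (2) Normalize: unit-length vectors, so dot product == cosine similarity
embeddings = F.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)  # torch.Size([2, 384])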

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("KhaledReda/all-MiniLM-L6-v9-pair_score")
# Run inference
sentences = [
    'pasabah',
    'thermos mug',
    'sports tracksuit',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.8317, 0.6751],
#         [0.8317, 1.0000, 0.7012],
#         [0.6751, 0.7012, 1.0000]])
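
The description above lists semantic search among the intended uses; here is a minimal sketch building on the model loaded above. The corpus and query strings are illustrative, not taken from the training data.

import torch

corpus = ["thermos mug", "sports tracksuit", "pizza cutter", "running shoes"]
corpus_embeddings = model.encode(corpus)

# Embed the query and rank the corpus by cosine similarity
query_embeddings = model.encode(["travel coffee cup"])
scores = model.similarity(query_embeddings, corpus_embeddings)[0]

top = torch.topk(scores, k=2)
for score, idx in zip(top.values, top.indices):
    print(f"{corpus[int(idx)]}: {float(score):.4f}")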

Training Details

Training Dataset

pairs_three_scores_v8

  • Dataset: pairs_three_scores_v8 at 7e8a1e6
  • Size: 9,164,180 training samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1: string; min 3 tokens, mean 5.6 tokens, max 20 tokens
    • sentence2: string; min 3 tokens, mean 5.79 tokens, max 24 tokens
    • score: float; min 0.15, mean 0.43, max 1.0
  • Samples (sentence1 | sentence2 | score):
    • booster face cleanser | pizza cutter | 0.24
    • line sinker | accessories | 1.0
    • tricovel | cot bed mattress protector | 0.28
  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
    

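CoSENT optimizes a pairwise ranking objective: whenever one training pair has a higher gold score than another, the loss pushes the first pair's cosine similarity above the second's, with scale (here 20.0) acting as a temperature. Below is a minimal sketch, not the author's exact script, of how such a run is typically wired up with sentence-transformers; the column names sentence1, sentence2, and score match the dataset above, and the tiny inline dataset is illustrative.

from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import CoSENTLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
train_dataset = Dataset.from_dict({
    "sentence1": ["booster face cleanser", "line sinker"],
    "sentence2": ["pizza cutter", "accessories"],
    "score": [0.24, 1.0],
})
loss = CoSENTLoss(model, scale=20.0)  # similarity_fct defaults to pairwise cosine

trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()
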
Evaluation Dataset

pairs_three_scores_v8

  • Dataset: pairs_three_scores_v8 at 7e8a1e6
  • Size: 46,052 evaluation samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1: string; min 3 tokens, mean 5.66 tokens, max 21 tokens
    • sentence2: string; min 3 tokens, mean 5.6 tokens, max 41 tokens
    • score: float; min 0.15, mean 0.42, max 1.0
  • Samples (sentence1 | sentence2 | score):
    • printed set | crushed outfit | 1.0
    • value eva cosmetics | serum | 0.23
    • zino shakes | candy | 0.27
  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
    

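A hedged sketch of scoring the finished model on pairs like these with the built-in similarity evaluator, which reports the correlation between predicted cosine similarities and the gold scores; the two example pairs are illustrative.

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("KhaledReda/all-MiniLM-L6-v9-pair_score")
evaluator = EmbeddingSimilarityEvaluator(
    sentences1=["printed set", "booster face cleanser"],
    sentences2=["crushed outfit", "pizza cutter"],
    scores=[1.0, 0.24],
    name="pairs_three_scores_v8-eval",
)
print(evaluator(model))  # Pearson/Spearman correlations of cosine scores vs. gold scores
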
Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • fp16: True

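Expressed as code, the non-default settings above map onto the trainer's arguments roughly as follows; this is a sketch, and output_dir is a placeholder.

from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="all-MiniLM-L6-v9-pair_score",  # placeholder output directory
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=2e-5,
    num_train_epochs=1,
    warmup_ratio=0.1,
    fp16=True,
)
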
All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss
0.0014 100 12.0868
0.0028 200 11.6909
0.0042 300 11.6708
0.0056 400 11.0891
0.0070 500 10.6572
0.0084 600 10.3743
0.0098 700 10.1343
0.0112 800 9.8599
0.0126 900 9.6081
0.0140 1000 9.279
0.0154 1100 9.0853
0.0168 1200 8.8411
0.0182 1300 8.7139
0.0196 1400 8.6466
0.0210 1500 8.5972
0.0223 1600 8.5773
0.0237 1700 8.5516
0.0251 1800 8.5211
0.0265 1900 8.5185
0.0279 2000 8.5054
0.0293 2100 8.4975
0.0307 2200 8.4673
0.0321 2300 8.4779
0.0335 2400 8.4543
0.0349 2500 8.4363
0.0363 2600 8.4421
0.0377 2700 8.4267
0.0391 2800 8.4111
0.0405 2900 8.4245
0.0419 3000 8.3958
0.0433 3100 8.3894
0.0447 3200 8.3711
0.0461 3300 8.3785
0.0475 3400 8.3831
0.0489 3500 8.3727
0.0503 3600 8.3673
0.0517 3700 8.3408
0.0531 3800 8.3389
0.0545 3900 8.3482
0.0559 4000 8.3351
0.0573 4100 8.3255
0.0587 4200 8.3111
0.0601 4300 8.3122
0.0615 4400 8.3078
0.0629 4500 8.2959
0.0642 4600 8.2769
0.0656 4700 8.3087
0.0670 4800 8.2688
0.0684 4900 8.265
0.0698 5000 8.2562
0.0712 5100 8.2535
0.0726 5200 8.2526
0.0740 5300 8.2508
0.0754 5400 8.2453
0.0768 5500 8.2459
0.0782 5600 8.2298
0.0796 5700 8.2516
0.0810 5800 8.2284
0.0824 5900 8.2286
0.0838 6000 8.2158
0.0852 6100 8.2069
0.0866 6200 8.2051
0.0880 6300 8.1962
0.0894 6400 8.1916
0.0908 6500 8.2049
0.0922 6600 8.1833
0.0936 6700 8.173
0.0950 6800 8.1915
0.0964 6900 8.1757
0.0978 7000 8.163
0.0992 7100 8.1686
0.1006 7200 8.1532
0.1020 7300 8.1675
0.1034 7400 8.1389
0.1048 7500 8.131
0.1062 7600 8.1373
0.1075 7700 8.1451
0.1089 7800 8.1322
0.1103 7900 8.1343
0.1117 8000 8.121
0.1131 8100 8.1268
0.1145 8200 8.1238
0.1159 8300 8.1031
0.1173 8400 8.1116
0.1187 8500 8.1188
0.1201 8600 8.1112
0.1215 8700 8.087
0.1229 8800 8.0927
0.1243 8900 8.094
0.1257 9000 8.0809
0.1271 9100 8.0611
0.1285 9200 8.0846
0.1299 9300 8.0797
0.1313 9400 8.0445
0.1327 9500 8.073
0.1341 9600 8.0916
0.1355 9700 8.0523
0.1369 9800 8.0561
0.1383 9900 8.0524
0.1397 10000 8.0444
0.1411 10100 8.0722
0.1425 10200 8.0399
0.1439 10300 8.0502
0.1453 10400 8.0529
0.1467 10500 8.0426
0.1481 10600 8.0253
0.1494 10700 8.0427
0.1508 10800 8.0155
0.1522 10900 8.01
0.1536 11000 8.0176
0.1550 11100 8.0369
0.1564 11200 8.0222
0.1578 11300 8.0243
0.1592 11400 8.0045
0.1606 11500 7.9829
0.1620 11600 8.0131
0.1634 11700 8.0041
0.1648 11800 8.0254
0.1662 11900 8.0208
0.1676 12000 7.9967
0.1690 12100 7.9783
0.1704 12200 7.98
0.1718 12300 7.9696
0.1732 12400 7.9868
0.1746 12500 8.0012
0.1760 12600 7.9694
0.1774 12700 7.9781
0.1788 12800 7.9777
0.1802 12900 7.9891
0.1816 13000 7.9699
0.1830 13100 7.9706
0.1844 13200 7.9755
0.1858 13300 7.9648
0.1872 13400 7.9665
0.1886 13500 7.9624
0.1900 13600 7.9794
0.1914 13700 7.9621
0.1927 13800 7.948
0.1941 13900 7.9598
0.1955 14000 7.9402
0.1969 14100 7.9515
0.1983 14200 7.961
0.1997 14300 7.9214
0.2011 14400 7.9587
0.2025 14500 7.9519
0.2039 14600 7.9504
0.2053 14700 7.9676
0.2067 14800 7.9341
0.2081 14900 7.957
0.2095 15000 7.9331
0.2109 15100 7.9692
0.2123 15200 7.9489
0.2137 15300 7.9541
0.2151 15400 7.9416
0.2165 15500 7.9215
0.2179 15600 7.9275
0.2193 15700 7.9034
0.2207 15800 7.8911
0.2221 15900 7.9397
0.2235 16000 7.9283
0.2249 16100 7.9132
0.2263 16200 7.9
0.2277 16300 7.8765
0.2291 16400 7.9209
0.2305 16500 7.9016
0.2319 16600 7.894
0.2333 16700 7.9451
0.2346 16800 7.8843
0.2360 16900 7.8795
0.2374 17000 7.9126
0.2388 17100 7.9154
0.2402 17200 7.9338
0.2416 17300 7.9
0.2430 17400 7.8878
0.2444 17500 7.8784
0.2458 17600 7.8853
0.2472 17700 7.8855
0.2486 17800 7.9002
0.2500 17900 7.8711
0.2514 18000 7.8943
0.2528 18100 7.8591
0.2542 18200 7.8622
0.2556 18300 7.8705
0.2570 18400 7.8734
0.2584 18500 7.8603
0.2598 18600 7.8778
0.2612 18700 7.8524
0.2626 18800 7.8577
0.2640 18900 7.8777
0.2654 19000 7.8728
0.2668 19100 7.8567
0.2682 19200 7.8359
0.2696 19300 7.8707
0.2710 19400 7.8708
0.2724 19500 7.8435
0.2738 19600 7.8486
0.2752 19700 7.8577
0.2766 19800 7.8559
0.2779 19900 7.8602
0.2793 20000 7.8401
0.2807 20100 7.8515
0.2821 20200 7.8539
0.2835 20300 7.8699
0.2849 20400 7.8434
0.2863 20500 7.8412
0.2877 20600 7.8419
0.2891 20700 7.8397
0.2905 20800 7.8539
0.2919 20900 7.8482
0.2933 21000 7.8632
0.2947 21100 7.8482
0.2961 21200 7.8394
0.2975 21300 7.8587
0.2989 21400 7.8625
0.3003 21500 7.8516
0.3017 21600 7.8321
0.3031 21700 7.8306
0.3045 21800 7.8226
0.3059 21900 7.8306
0.3073 22000 7.8271
0.3087 22100 7.8257
0.3101 22200 7.8387
0.3115 22300 7.8185
0.3129 22400 7.8575
0.3143 22500 7.8248
0.3157 22600 7.8203
0.3171 22700 7.8059
0.3185 22800 7.8356
0.3199 22900 7.8619
0.3212 23000 7.8178
0.3226 23100 7.816
0.3240 23200 7.8296
0.3254 23300 7.8028
0.3268 23400 7.8112
0.3282 23500 7.8059
0.3296 23600 7.8179
0.3310 23700 7.7965
0.3324 23800 7.8181
0.3338 23900 7.8142
0.3352 24000 7.7948
0.3366 24100 7.7777
0.3380 24200 7.8166
0.3394 24300 7.8028
0.3408 24400 7.828
0.3422 24500 7.8129
0.3436 24600 7.7927
0.3450 24700 7.805
0.3464 24800 7.8153
0.3478 24900 7.7955
0.3492 25000 7.8043
0.3506 25100 7.7807
0.3520 25200 7.8078
0.3534 25300 7.7599
0.3548 25400 7.7659
0.3562 25500 7.818
0.3576 25600 7.7496
0.3590 25700 7.7957
0.3604 25800 7.7679
0.3618 25900 7.8055
0.3631 26000 7.7885
0.3645 26100 7.7795
0.3659 26200 7.7752
0.3673 26300 7.7882
0.3687 26400 7.7951
0.3701 26500 7.7913
0.3715 26600 7.8002
0.3729 26700 7.7393
0.3743 26800 7.7809
0.3757 26900 7.7339
0.3771 27000 7.778
0.3785 27100 7.7895
0.3799 27200 7.8127
0.3813 27300 7.7827
0.3827 27400 7.7493
0.3841 27500 7.7709
0.3855 27600 7.7423
0.3869 27700 7.8034
0.3883 27800 7.759
0.3897 27900 7.7835
0.3911 28000 7.7631
0.3925 28100 7.7889
0.3939 28200 7.7357
0.3953 28300 7.7858
0.3967 28400 7.8136
0.3981 28500 7.7568
0.3995 28600 7.753
0.4009 28700 7.7624
0.4023 28800 7.765
0.4037 28900 7.7551
0.4051 29000 7.7631
0.4064 29100 7.7445
0.4078 29200 7.7821
0.4092 29300 7.7269
0.4106 29400 7.7565
0.4120 29500 7.7447
0.4134 29600 7.7304
0.4148 29700 7.7389
0.4162 29800 7.7275
0.4176 29900 7.7383
0.4190 30000 7.7789
0.4204 30100 7.7663
0.4218 30200 7.7539
0.4232 30300 7.7569
0.4246 30400 7.7637
0.4260 30500 7.7446
0.4274 30600 7.7284
0.4288 30700 7.7419
0.4302 30800 7.7533
0.4316 30900 7.7624
0.4330 31000 7.7265
0.4344 31100 7.7562
0.4358 31200 7.7201
0.4372 31300 7.7427
0.4386 31400 7.7415
0.4400 31500 7.7672
0.4414 31600 7.7433
0.4428 31700 7.7678
0.4442 31800 7.7108
0.4456 31900 7.7638
0.4470 32000 7.7389
0.4483 32100 7.7439
0.4497 32200 7.7312
0.4511 32300 7.7271
0.4525 32400 7.7388
0.4539 32500 7.7487
0.4553 32600 7.7087
0.4567 32700 7.7529
0.4581 32800 7.7288
0.4595 32900 7.7476
0.4609 33000 7.717
0.4623 33100 7.762
0.4637 33200 7.7271
0.4651 33300 7.7478
0.4665 33400 7.7026
0.4679 33500 7.7697
0.4693 33600 7.6938
0.4707 33700 7.7347
0.4721 33800 7.7068
0.4735 33900 7.711
0.4749 34000 7.7346
0.4763 34100 7.7116
0.4777 34200 7.7029
0.4791 34300 7.7197
0.4805 34400 7.7398
0.4819 34500 7.7208
0.4833 34600 7.6952
0.4847 34700 7.6922
0.4861 34800 7.6887
0.4875 34900 7.7389
0.4889 35000 7.7247
0.4903 35100 7.7315
0.4916 35200 7.6813
0.4930 35300 7.7163
0.4944 35400 7.7209
0.4958 35500 7.6907
0.4972 35600 7.666
0.4986 35700 7.703
0.5000 35800 7.7209
0.5014 35900 7.6848
0.5028 36000 7.7093
0.5042 36100 7.7143
0.5056 36200 7.6982
0.5070 36300 7.701
0.5084 36400 7.6692
0.5098 36500 7.7381
0.5112 36600 7.7162
0.5126 36700 7.7257
0.5140 36800 7.6956
0.5154 36900 7.726
0.5168 37000 7.7005
0.5182 37100 7.6989
0.5196 37200 7.6922
0.5210 37300 7.6886
0.5224 37400 7.7063
0.5238 37500 7.7059
0.5252 37600 7.6855
0.5266 37700 7.7311
0.5280 37800 7.6952
0.5294 37900 7.7072
0.5308 38000 7.6702
0.5322 38100 7.6954
0.5335 38200 7.675
0.5349 38300 7.6638
0.5363 38400 7.7127
0.5377 38500 7.6803
0.5391 38600 7.7213
0.5405 38700 7.6807
0.5419 38800 7.6737
0.5433 38900 7.7128
0.5447 39000 7.6819
0.5461 39100 7.6947
0.5475 39200 7.6457
0.5489 39300 7.654
0.5503 39400 7.6518
0.5517 39500 7.7081
0.5531 39600 7.6922
0.5545 39700 7.6965
0.5559 39800 7.6782
0.5573 39900 7.6472
0.5587 40000 7.7049
0.5601 40100 7.6999
0.5615 40200 7.6978
0.5629 40300 7.6853
0.5643 40400 7.6937
0.5657 40500 7.6953
0.5671 40600 7.6697
0.5685 40700 7.6675
0.5699 40800 7.6498
0.5713 40900 7.703
0.5727 41000 7.6775
0.5741 41100 7.6664
0.5755 41200 7.6843
0.5768 41300 7.6783
0.5782 41400 7.6657
0.5796 41500 7.6741
0.5810 41600 7.6617
0.5824 41700 7.654
0.5838 41800 7.6979
0.5852 41900 7.6644
0.5866 42000 7.6663
0.5880 42100 7.7044
0.5894 42200 7.6678
0.5908 42300 7.6679
0.5922 42400 7.6679
0.5936 42500 7.6712
0.5950 42600 7.702
0.5964 42700 7.6401
0.5978 42800 7.6613
0.5992 42900 7.6868
0.6006 43000 7.6424
0.6020 43100 7.6654
0.6034 43200 7.6769
0.6048 43300 7.6645
0.6062 43400 7.6889
0.6076 43500 7.6552
0.6090 43600 7.67
0.6104 43700 7.6811
0.6118 43800 7.6618
0.6132 43900 7.6628
0.6146 44000 7.6413
0.6160 44100 7.6378
0.6174 44200 7.6836
0.6187 44300 7.6842
0.6201 44400 7.6445
0.6215 44500 7.6639
0.6229 44600 7.636
0.6243 44700 7.6571
0.6257 44800 7.662
0.6271 44900 7.6481
0.6285 45000 7.6761
0.6299 45100 7.6551
0.6313 45200 7.6471
0.6327 45300 7.65
0.6341 45400 7.6231
0.6355 45500 7.6314
0.6369 45600 7.6243
0.6383 45700 7.6414
0.6397 45800 7.6451
0.6411 45900 7.6482
0.6425 46000 7.6597
0.6439 46100 7.6402
0.6453 46200 7.6414
0.6467 46300 7.6425
0.6481 46400 7.6182
0.6495 46500 7.6251
0.6509 46600 7.6422
0.6523 46700 7.6152
0.6537 46800 7.6418
0.6551 46900 7.6781
0.6565 47000 7.6551
0.6579 47100 7.6306
0.6593 47200 7.6366
0.6607 47300 7.6708
0.6620 47400 7.6668
0.6634 47500 7.6296
0.6648 47600 7.6555
0.6662 47700 7.6232
0.6676 47800 7.6476
0.6690 47900 7.6504
0.6704 48000 7.6564
0.6718 48100 7.6696
0.6732 48200 7.6486
0.6746 48300 7.6278
0.6760 48400 7.6414
0.6774 48500 7.6713
0.6788 48600 7.6578
0.6802 48700 7.5976
0.6816 48800 7.6387
0.6830 48900 7.6214
0.6844 49000 7.6243
0.6858 49100 7.6437
0.6872 49200 7.6052
0.6886 49300 7.6088
0.6900 49400 7.6297
0.6914 49500 7.6474
0.6928 49600 7.6433
0.6942 49700 7.6289
0.6956 49800 7.6657
0.6970 49900 7.6593
0.6984 50000 7.6708
0.6998 50100 7.6413
0.7012 50200 7.6112
0.7026 50300 7.6106
0.7039 50400 7.6576
0.7053 50500 7.6476
0.7067 50600 7.6078
0.7081 50700 7.5909
0.7095 50800 7.6175
0.7109 50900 7.6501
0.7123 51000 7.623
0.7137 51100 7.6103
0.7151 51200 7.65
0.7165 51300 7.6565
0.7179 51400 7.6364
0.7193 51500 7.6286
0.7207 51600 7.6663
0.7221 51700 7.6359
0.7235 51800 7.6285
0.7249 51900 7.6467
0.7263 52000 7.6214
0.7277 52100 7.6249
0.7291 52200 7.5814
0.7305 52300 7.615
0.7319 52400 7.6581
0.7333 52500 7.6419
0.7347 52600 7.6426
0.7361 52700 7.6306
0.7375 52800 7.58
0.7389 52900 7.5902
0.7403 53000 7.6334
0.7417 53100 7.6168
0.7431 53200 7.6451
0.7445 53300 7.6189
0.7459 53400 7.6671
0.7472 53500 7.6401
0.7486 53600 7.6363
0.7500 53700 7.6032
0.7514 53800 7.6506
0.7528 53900 7.6238
0.7542 54000 7.5901
0.7556 54100 7.6239
0.7570 54200 7.6114
0.7584 54300 7.6005
0.7598 54400 7.6137
0.7612 54500 7.6069
0.7626 54600 7.6223
0.7640 54700 7.6301
0.7654 54800 7.628
0.7668 54900 7.6296
0.7682 55000 7.605
0.7696 55100 7.6194
0.7710 55200 7.6252
0.7724 55300 7.6269
0.7738 55400 7.5882
0.7752 55500 7.5711
0.7766 55600 7.6398
0.7780 55700 7.6142
0.7794 55800 7.5884
0.7808 55900 7.6556
0.7822 56000 7.6153
0.7836 56100 7.6223
0.7850 56200 7.6424
0.7864 56300 7.6269
0.7878 56400 7.612
0.7892 56500 7.5898
0.7905 56600 7.6565
0.7919 56700 7.5972
0.7933 56800 7.6461
0.7947 56900 7.6026
0.7961 57000 7.6218
0.7975 57100 7.6082
0.7989 57200 7.6322
0.8003 57300 7.6439
0.8017 57400 7.6399
0.8031 57500 7.5862
0.8045 57600 7.5764
0.8059 57700 7.5985
0.8073 57800 7.6166
0.8087 57900 7.6592
0.8101 58000 7.6213
0.8115 58100 7.5822
0.8129 58200 7.6558
0.8143 58300 7.6409
0.8157 58400 7.5881
0.8171 58500 7.6038
0.8185 58600 7.6504
0.8199 58700 7.5968
0.8213 58800 7.594
0.8227 58900 7.6114
0.8241 59000 7.6423
0.8255 59100 7.6116
0.8269 59200 7.6188
0.8283 59300 7.5876
0.8297 59400 7.5823
0.8311 59500 7.6267
0.8324 59600 7.653
0.8338 59700 7.6051
0.8352 59800 7.6293
0.8366 59900 7.6038
0.8380 60000 7.6025
0.8394 60100 7.6169
0.8408 60200 7.619
0.8422 60300 7.6105
0.8436 60400 7.5876
0.8450 60500 7.5915
0.8464 60600 7.6217
0.8478 60700 7.5864
0.8492 60800 7.6194
0.8506 60900 7.5685
0.8520 61000 7.6145
0.8534 61100 7.6252
0.8548 61200 7.6284
0.8562 61300 7.631
0.8576 61400 7.6127
0.8590 61500 7.6132
0.8604 61600 7.6115
0.8618 61700 7.6493
0.8632 61800 7.6306
0.8646 61900 7.5841
0.8660 62000 7.6069
0.8674 62100 7.5825
0.8688 62200 7.5968
0.8702 62300 7.6097
0.8716 62400 7.5914
0.8730 62500 7.6144
0.8744 62600 7.6123
0.8757 62700 7.5844
0.8771 62800 7.573
0.8785 62900 7.5754
0.8799 63000 7.59
0.8813 63100 7.5754
0.8827 63200 7.5935
0.8841 63300 7.6287
0.8855 63400 7.5785
0.8869 63500 7.6004
0.8883 63600 7.5884
0.8897 63700 7.5901
0.8911 63800 7.5923
0.8925 63900 7.5834
0.8939 64000 7.6133
0.8953 64100 7.5938
0.8967 64200 7.6232
0.8981 64300 7.5682
0.8995 64400 7.618
0.9009 64500 7.596
0.9023 64600 7.5645
0.9037 64700 7.6386
0.9051 64800 7.5777
0.9065 64900 7.5788
0.9079 65000 7.5889
0.9093 65100 7.5699
0.9107 65200 7.5495
0.9121 65300 7.5881
0.9135 65400 7.5911
0.9149 65500 7.6214
0.9163 65600 7.5886
0.9176 65700 7.5784
0.9190 65800 7.5975
0.9204 65900 7.591
0.9218 66000 7.5844
0.9232 66100 7.5797
0.9246 66200 7.5775
0.9260 66300 7.574
0.9274 66400 7.5608
0.9288 66500 7.6209
0.9302 66600 7.5795
0.9316 66700 7.6324
0.9330 66800 7.5482
0.9344 66900 7.5494
0.9358 67000 7.5864
0.9372 67100 7.6184
0.9386 67200 7.6167
0.9400 67300 7.5876
0.9414 67400 7.5567
0.9428 67500 7.5763
0.9442 67600 7.5748
0.9456 67700 7.572
0.9470 67800 7.617
0.9484 67900 7.571
0.9498 68000 7.5916
0.9512 68100 7.5612
0.9526 68200 7.6116
0.9540 68300 7.5878
0.9554 68400 7.6086
0.9568 68500 7.6044
0.9582 68600 7.6018
0.9596 68700 7.5985
0.9609 68800 7.6214
0.9623 68900 7.568
0.9637 69000 7.5951
0.9651 69100 7.572
0.9665 69200 7.5724
0.9679 69300 7.5617
0.9693 69400 7.6132
0.9707 69500 7.5599
0.9721 69600 7.583
0.9735 69700 7.5867
0.9749 69800 7.606
0.9763 69900 7.5613
0.9777 70000 7.5896
0.9791 70100 7.5707
0.9805 70200 7.575
0.9819 70300 7.61
0.9833 70400 7.5958
0.9847 70500 7.617
0.9861 70600 7.5813
0.9875 70700 7.5614
0.9889 70800 7.5873
0.9903 70900 7.577
0.9917 71000 7.5563
0.9931 71100 7.6124
0.9945 71200 7.5421
0.9959 71300 7.572
0.9973 71400 7.5634
0.9987 71500 7.5796

Framework Versions

  • Python: 3.12.3
  • Sentence Transformers: 5.1.0
  • Transformers: 4.55.4
  • PyTorch: 2.5.1+cu121
  • Accelerate: 1.10.1
  • Datasets: 4.0.0
  • Tokenizers: 0.21.4
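
To reproduce this environment, the versions above can be pinned at install time; this is a sketch, and the exact PyTorch build (CUDA vs. CPU) depends on your platform.

pip install "sentence-transformers==5.1.0" "transformers==4.55.4" "torch==2.5.1" \
    "accelerate==1.10.1" "datasets==4.0.0" "tokenizers==0.21.4"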

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CoSENTLoss

@online{kexuefm-8847,
    title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
    author={Su Jianlin},
    year={2022},
    month={Jan},
    url={https://kexue.fm/archives/8847},
}
