all-MiniLM-L6-v5-pair_score

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L6-v2 on the pairs_three_scores_v5 dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Number of Parameters: ~22.7M (F32, Safetensors)
  • Training Dataset: pairs_three_scores_v5

Model Sources

  • Documentation: Sentence Transformers Documentation (https://www.sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
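
If sentence-transformers is not available, the three modules above can be approximated with plain transformers. The sketch below assumes the checkpoint loads with AutoModel (the architecture lists BertModel); the mean_pooling helper and the L2 normalization step reimplement the Pooling and Normalize modules:

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

def mean_pooling(model_output, attention_mask):
    # Average token embeddings, ignoring padding via the attention mask
    token_embeddings = model_output[0]  # last_hidden_state
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return (token_embeddings * mask).sum(1) / mask.sum(1).clamp(min=1e-9)

tokenizer = AutoTokenizer.from_pretrained("KhaledReda/all-MiniLM-L6-v5-pair_score")
model = AutoModel.from_pretrained("KhaledReda/all-MiniLM-L6-v5-pair_score")

encoded = tokenizer(["decatoons sweatshirt", "golden body splash"],
                    padding=True, truncation=True, max_length=256, return_tensors="pt")
with torch.no_grad():
    output = model(**encoded)

# Mean pooling followed by L2 normalization, mirroring modules (1) and (2)
embeddings = F.normalize(mean_pooling(output, encoded["attention_mask"]), p=2, dim=1)
print(embeddings.shape)  # torch.Size([2, 384])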

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("KhaledReda/all-MiniLM-L6-v5-pair_score")
# Run inference
sentences = [
    'decatoons sweatshirt',
    'golden body splash',
    'speck',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.5229, 0.4606],
#         [0.5229, 1.0000, 0.9228],
#         [0.4606, 0.9228, 1.0000]])
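
The matrix above scores the inputs against each other; for semantic search you instead embed a query against a corpus and rank by similarity. A minimal sketch, with a made-up query and a corpus borrowed from the training samples below:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("KhaledReda/all-MiniLM-L6-v5-pair_score")

corpus = ["handmade plant pot", "polyester shirt", "medjool dates tray"]
query = "ceramic flower pot"

# Embeddings are already L2-normalized by the Normalize() module,
# so cosine similarity reduces to a dot product
corpus_embeddings = model.encode(corpus)
query_embedding = model.encode([query])

scores = model.similarity(query_embedding, corpus_embeddings)[0]  # shape [3]
best = int(scores.argmax())
print(corpus[best], float(scores[best]))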

Training Details

Training Dataset

pairs_three_scores_v5

  • Dataset: pairs_three_scores_v5 at e3b7dac
  • Size: 9,471,728 training samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1: string; min: 3 tokens, mean: 6.32 tokens, max: 115 tokens
    • sentence2: string; min: 3 tokens, mean: 5.88 tokens, max: 45 tokens
    • score: float; min: 0.16, mean: 0.46, max: 1.0
  • Samples (sentence1 | sentence2 | score):
    • handmade plant pot | modern pot hanger | 1.0
    • 27 cm 2 salt shakers 2 pepper shakers | 1 oval patterned skirt | 0.2482760548591613
    • polyester shirt | medjool dates tray | 0.2358989119529724
  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
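
CoSENTLoss is a ranking loss: for every two pairs in a batch, the pair with the higher gold score is pushed to have the higher cosine similarity, and scale sharpens that ranking term. A minimal sketch of constructing the same loss, assuming the current sentence-transformers API:

from sentence_transformers import SentenceTransformer, losses, util

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
# scale=20.0 and pairwise_cos_sim match the parameters listed above
loss = losses.CoSENTLoss(model, scale=20.0, similarity_fct=util.pairwise_cos_sim)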
    

Evaluation Dataset

pairs_three_scores_v5

  • Dataset: pairs_three_scores_v5 at e3b7dac
  • Size: 47,597 evaluation samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1: string; min: 3 tokens, mean: 6.83 tokens, max: 115 tokens
    • sentence2: string; min: 3 tokens, mean: 6.08 tokens, max: 115 tokens
    • score: float; min: 0.17, mean: 0.44, max: 1.0
  • Samples (sentence1 | sentence2 | score):
    • homeboy attire tee | non shrinkable tshirt | 1.0
    • full zip top | mini printer camera | 0.2172942459583282
    • seafood dinner set | sauteed vegetables chicken | 0.2605650424957275
  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • fp16: True
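
A hedged reconstruction of how these values plug into the sentence-transformers trainer API is sketched below; the dataset file names are placeholders, since this card does not show the actual pairs_three_scores_v5 loading code.

from datasets import load_dataset
from sentence_transformers import (SentenceTransformer, SentenceTransformerTrainer,
                                   SentenceTransformerTrainingArguments, losses, util)

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
loss = losses.CoSENTLoss(model, scale=20.0, similarity_fct=util.pairwise_cos_sim)

# Hypothetical file paths; columns must be sentence1, sentence2, score
dataset = load_dataset("csv", data_files={"train": "train.csv", "eval": "eval.csv"})

args = SentenceTransformerTrainingArguments(
    output_dir="all-MiniLM-L6-v5-pair_score",
    num_train_epochs=1,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    fp16=True,
    eval_strategy="steps",
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["eval"],
    loss=loss,
)
trainer.train()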

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss
0.0014 100 11.8378
0.0027 200 11.7624
0.0041 300 11.3427
0.0054 400 11.0256
0.0068 500 10.6613
0.0081 600 10.3128
0.0095 700 10.1329
0.0108 800 9.7423
0.0122 900 9.5806
0.0135 1000 9.4322
0.0149 1100 9.1702
0.0162 1200 8.94
0.0176 1300 8.7885
0.0189 1400 8.6533
0.0203 1500 8.5641
0.0216 1600 8.5028
0.0230 1700 8.4757
0.0243 1800 8.447
0.0257 1900 8.4225
0.0270 2000 8.3852
0.0284 2100 8.373
0.0297 2200 8.3521
0.0311 2300 8.3145
0.0324 2400 8.2952
0.0338 2500 8.2909
0.0351 2600 8.2791
0.0365 2700 8.2536
0.0378 2800 8.2541
0.0392 2900 8.2286
0.0405 3000 8.209
0.0419 3100 8.2127
0.0432 3200 8.207
0.0446 3300 8.1622
0.0459 3400 8.1857
0.0473 3500 8.1763
0.0486 3600 8.1709
0.0500 3700 8.1427
0.0514 3800 8.1257
0.0527 3900 8.1322
0.0541 4000 8.0962
0.0554 4100 8.0962
0.0568 4200 8.0966
0.0581 4300 8.0912
0.0595 4400 8.0651
0.0608 4500 8.0821
0.0622 4600 8.067
0.0635 4700 8.0657
0.0649 4800 8.0684
0.0662 4900 8.0492
0.0676 5000 8.043
0.0689 5100 8.0479
0.0703 5200 8.0375
0.0716 5300 8.0029
0.0730 5400 8.007
0.0743 5500 7.9869
0.0757 5600 8.0095
0.0770 5700 7.9892
0.0784 5800 7.9696
0.0797 5900 7.9933
0.0811 6000 7.9824
0.0824 6100 7.9981
0.0838 6200 7.9346
0.0851 6300 7.9903
0.0865 6400 7.9311
0.0878 6500 7.9421
0.0892 6600 7.9337
0.0905 6700 7.9645
0.0919 6800 7.9563
0.0932 6900 7.9159
0.0946 7000 7.9508
0.0959 7100 7.9474
0.0973 7200 7.8983
0.0987 7300 7.9098
0.1000 7400 7.9041
0.1014 7500 7.8927
0.1027 7600 7.8949
0.1041 7700 7.8995
0.1054 7800 7.8777
0.1068 7900 7.8628
0.1081 8000 7.9323
0.1095 8100 7.8753
0.1108 8200 7.8704
0.1122 8300 7.8771
0.1135 8400 7.8576
0.1149 8500 7.8735
0.1162 8600 7.8419
0.1176 8700 7.8649
0.1189 8800 7.8427
0.1203 8900 7.9074
0.1216 9000 7.8505
0.1230 9100 7.8501
0.1243 9200 7.8634
0.1257 9300 7.8423
0.1270 9400 7.8212
0.1284 9500 7.8107
0.1297 9600 7.8412
0.1311 9700 7.8311
0.1324 9800 7.7941
0.1338 9900 7.83
0.1351 10000 7.8489
0.1365 10100 7.7812
0.1378 10200 7.796
0.1392 10300 7.8704
0.1405 10400 7.8019
0.1419 10500 7.7884
0.1432 10600 7.7781
0.1446 10700 7.7941
0.1459 10800 7.8013
0.1473 10900 7.7522
0.1487 11000 7.7881
0.1500 11100 7.7688
0.1514 11200 7.7866
0.1527 11300 7.7803
0.1541 11400 7.7807
0.1554 11500 7.8013
0.1568 11600 7.8306
0.1581 11700 7.7936
0.1595 11800 7.7505
0.1608 11900 7.7511
0.1622 12000 7.7628
0.1635 12100 7.7551
0.1649 12200 7.7326
0.1662 12300 7.7594
0.1676 12400 7.7461
0.1689 12500 7.773
0.1703 12600 7.7329
0.1716 12700 7.7626
0.1730 12800 7.7062
0.1743 12900 7.7343
0.1757 13000 7.7395
0.1770 13100 7.7609
0.1784 13200 7.7316
0.1797 13300 7.7167
0.1811 13400 7.7267
0.1824 13500 7.6934
0.1838 13600 7.7793
0.1851 13700 7.7006
0.1865 13800 7.6876
0.1878 13900 7.7137
0.1892 14000 7.7632
0.1905 14100 7.708
0.1919 14200 7.7091
0.1932 14300 7.7134
0.1946 14400 7.7181
0.1960 14500 7.7334
0.1973 14600 7.6954
0.1987 14700 7.7124
0.2000 14800 7.7166
0.2014 14900 7.683
0.2027 15000 7.6902
0.2041 15100 7.7494
0.2054 15200 7.6941
0.2068 15300 7.677
0.2081 15400 7.6881
0.2095 15500 7.6612
0.2108 15600 7.6961
0.2122 15700 7.6918
0.2135 15800 7.6793
0.2149 15900 7.6905
0.2162 16000 7.6369
0.2176 16100 7.6916
0.2189 16200 7.7363
0.2203 16300 7.6575
0.2216 16400 7.6888
0.2230 16500 7.6307
0.2243 16600 7.6252
0.2257 16700 7.721
0.2270 16800 7.645
0.2284 16900 7.6836
0.2297 17000 7.6744
0.2311 17100 7.6473
0.2324 17200 7.6733
0.2338 17300 7.6536
0.2351 17400 7.6584
0.2365 17500 7.6492
0.2378 17600 7.6447
0.2392 17700 7.6577
0.2405 17800 7.6363
0.2419 17900 7.6476
0.2432 18000 7.6177
0.2446 18100 7.6366
0.2460 18200 7.6942
0.2473 18300 7.6242
0.2487 18400 7.6242
0.2500 18500 7.6456
0.2514 18600 7.6021
0.2527 18700 7.6465
0.2541 18800 7.6205
0.2554 18900 7.6511
0.2568 19000 7.6179
0.2581 19100 7.6627
0.2595 19200 7.6663
0.2608 19300 7.639
0.2622 19400 7.5946
0.2635 19500 7.6612
0.2649 19600 7.5797
0.2662 19700 7.6068
0.2676 19800 7.581
0.2689 19900 7.6087
0.2703 20000 7.6393
0.2716 20100 7.6605
0.2730 20200 7.6378
0.2743 20300 7.5947
0.2757 20400 7.6598
0.2770 20500 7.6269
0.2784 20600 7.6326
0.2797 20700 7.5752
0.2811 20800 7.6066
0.2824 20900 7.5861
0.2838 21000 7.599
0.2851 21100 7.5799
0.2865 21200 7.6194
0.2878 21300 7.6307
0.2892 21400 7.5502
0.2905 21500 7.6016
0.2919 21600 7.6155
0.2933 21700 7.5933
0.2946 21800 7.6043
0.2960 21900 7.5941
0.2973 22000 7.5605
0.2987 22100 7.5597
0.3000 22200 7.557
0.3014 22300 7.6361
0.3027 22400 7.6308
0.3041 22500 7.526
0.3054 22600 7.59
0.3068 22700 7.5781
0.3081 22800 7.5667
0.3095 22900 7.572
0.3108 23000 7.6571
0.3122 23100 7.5984
0.3135 23200 7.5498
0.3149 23300 7.5993
0.3162 23400 7.5897
0.3176 23500 7.5863
0.3189 23600 7.5691
0.3203 23700 7.5601
0.3216 23800 7.5377
0.3230 23900 7.5506
0.3243 24000 7.5455
0.3257 24100 7.6028
0.3270 24200 7.5328
0.3284 24300 7.5707
0.3297 24400 7.5704
0.3311 24500 7.5486
0.3324 24600 7.567
0.3338 24700 7.5216
0.3351 24800 7.5155
0.3365 24900 7.5873
0.3378 25000 7.5496
0.3392 25100 7.5382
0.3405 25200 7.5796
0.3419 25300 7.5426
0.3433 25400 7.5227
0.3446 25500 7.5458
0.3460 25600 7.5463
0.3473 25700 7.5455
0.3487 25800 7.6036
0.3500 25900 7.5756
0.3514 26000 7.5844
0.3527 26100 7.5282
0.3541 26200 7.5571
0.3554 26300 7.5062
0.3568 26400 7.4905
0.3581 26500 7.5028
0.3595 26600 7.5987
0.3608 26700 7.5066
0.3622 26800 7.5459
0.3635 26900 7.5377
0.3649 27000 7.6023
0.3662 27100 7.5651
0.3676 27200 7.5029
0.3689 27300 7.5435
0.3703 27400 7.5412
0.3716 27500 7.511
0.3730 27600 7.5383
0.3743 27700 7.5588
0.3757 27800 7.5981
0.3770 27900 7.5242
0.3784 28000 7.517
0.3797 28100 7.5661
0.3811 28200 7.5237
0.3824 28300 7.5481
0.3838 28400 7.5301
0.3851 28500 7.537
0.3865 28600 7.5413
0.3878 28700 7.5748
0.3892 28800 7.5182
0.3906 28900 7.5154
0.3919 29000 7.6161
0.3933 29100 7.5144
0.3946 29200 7.5333
0.3960 29300 7.4949
0.3973 29400 7.524
0.3987 29500 7.5728
0.4000 29600 7.48
0.4014 29700 7.5132
0.4027 29800 7.5018
0.4041 29900 7.4884
0.4054 30000 7.4754
0.4068 30100 7.5079
0.4081 30200 7.4708
0.4095 30300 7.5257
0.4108 30400 7.5593
0.4122 30500 7.4834
0.4135 30600 7.5115
0.4149 30700 7.5214
0.4162 30800 7.4921
0.4176 30900 7.4963
0.4189 31000 7.5094
0.4203 31100 7.5182
0.4216 31200 7.4958
0.4230 31300 7.5268
0.4243 31400 7.485
0.4257 31500 7.5422
0.4270 31600 7.5098
0.4284 31700 7.4629
0.4297 31800 7.5025
0.4311 31900 7.4856
0.4324 32000 7.4503
0.4338 32100 7.5018
0.4351 32200 7.5354
0.4365 32300 7.4489
0.4378 32400 7.4415
0.4392 32500 7.4926
0.4406 32600 7.4681
0.4419 32700 7.5475
0.4433 32800 7.484
0.4446 32900 7.4802
0.4460 33000 7.5081
0.4473 33100 7.4974
0.4487 33200 7.4699
0.4500 33300 7.4747
0.4514 33400 7.5482
0.4527 33500 7.4865
0.4541 33600 7.5148
0.4554 33700 7.4545
0.4568 33800 7.4723
0.4581 33900 7.4998
0.4595 34000 7.4819
0.4608 34100 7.4755
0.4622 34200 7.4971
0.4635 34300 7.4708
0.4649 34400 7.488
0.4662 34500 7.4738
0.4676 34600 7.4476
0.4689 34700 7.4738
0.4703 34800 7.4539
0.4716 34900 7.5857
0.4730 35000 7.4753
0.4743 35100 7.5097
0.4757 35200 7.4847
0.4770 35300 7.4636
0.4784 35400 7.5168
0.4797 35500 7.5223
0.4811 35600 7.4795
0.4824 35700 7.4736
0.4838 35800 7.4432
0.4851 35900 7.4796
0.4865 36000 7.4921
0.4879 36100 7.5042
0.4892 36200 7.4443
0.4906 36300 7.5261
0.4919 36400 7.5132
0.4933 36500 7.481
0.4946 36600 7.4937
0.4960 36700 7.4902
0.4973 36800 7.4887
0.4987 36900 7.4463
0.5000 37000 7.5135
0.5014 37100 7.4418
0.5027 37200 7.4454
0.5041 37300 7.4725
0.5054 37400 7.453
0.5068 37500 7.4825
0.5081 37600 7.5135
0.5095 37700 7.4328
0.5108 37800 7.4844
0.5122 37900 7.496
0.5135 38000 7.4473
0.5149 38100 7.4416
0.5162 38200 7.4417
0.5176 38300 7.5149
0.5189 38400 7.4074
0.5203 38500 7.423
0.5216 38600 7.4541
0.5230 38700 7.4594
0.5243 38800 7.4227
0.5257 38900 7.4527
0.5270 39000 7.4359
0.5284 39100 7.4886
0.5297 39200 7.4625
0.5311 39300 7.4325
0.5324 39400 7.4447
0.5338 39500 7.4866
0.5351 39600 7.4217
0.5365 39700 7.4961
0.5379 39800 7.461
0.5392 39900 7.4495
0.5406 40000 7.5045
0.5419 40100 7.4224
0.5433 40200 7.4593
0.5446 40300 7.5332
0.5460 40400 7.4479
0.5473 40500 7.4379
0.5487 40600 7.4665
0.5500 40700 7.4144
0.5514 40800 7.3977
0.5527 40900 7.4426
0.5541 41000 7.432
0.5554 41100 7.3984
0.5568 41200 7.4694
0.5581 41300 7.4601
0.5595 41400 7.4182
0.5608 41500 7.4924
0.5622 41600 7.4906
0.5635 41700 7.4558
0.5649 41800 7.485
0.5662 41900 7.4333
0.5676 42000 7.4057
0.5689 42100 7.4764
0.5703 42200 7.4279
0.5716 42300 7.4602
0.5730 42400 7.4212
0.5743 42500 7.4861
0.5757 42600 7.4304
0.5770 42700 7.4188
0.5784 42800 7.4317
0.5797 42900 7.4588
0.5811 43000 7.4209
0.5824 43100 7.4087
0.5838 43200 7.4721
0.5852 43300 7.4536
0.5865 43400 7.4003
0.5879 43500 7.4272
0.5892 43600 7.4602
0.5906 43700 7.419
0.5919 43800 7.4269
0.5933 43900 7.4247
0.5946 44000 7.4258
0.5960 44100 7.4299
0.5973 44200 7.4158
0.5987 44300 7.403
0.6000 44400 7.4191
0.6014 44500 7.4709
0.6027 44600 7.486
0.6041 44700 7.4291
0.6054 44800 7.4877
0.6068 44900 7.45
0.6081 45000 7.3647
0.6095 45100 7.426
0.6108 45200 7.3944
0.6122 45300 7.4202
0.6135 45400 7.4041
0.6149 45500 7.4327
0.6162 45600 7.376
0.6176 45700 7.4014
0.6189 45800 7.3803
0.6203 45900 7.4183
0.6216 46000 7.5413
0.6230 46100 7.4593
0.6243 46200 7.4388
0.6257 46300 7.4631
0.6270 46400 7.4415
0.6284 46500 7.4411
0.6297 46600 7.5313
0.6311 46700 7.4263
0.6324 46800 7.3888
0.6338 46900 7.3737
0.6352 47000 7.3947
0.6365 47100 7.3765
0.6379 47200 7.4102
0.6392 47300 7.4342
0.6406 47400 7.3927
0.6419 47500 7.3957
0.6433 47600 7.433
0.6446 47700 7.4502
0.6460 47800 7.3924
0.6473 47900 7.3984
0.6487 48000 7.4204
0.6500 48100 7.4052
0.6514 48200 7.3883
0.6527 48300 7.4162
0.6541 48400 7.4037
0.6554 48500 7.4046
0.6568 48600 7.4245
0.6581 48700 7.4183
0.6595 48800 7.3896
0.6608 48900 7.4323
0.6622 49000 7.42
0.6635 49100 7.3829
0.6649 49200 7.3866
0.6662 49300 7.3983
0.6676 49400 7.4331
0.6689 49500 7.4524
0.6703 49600 7.3913
0.6716 49700 7.3683
0.6730 49800 7.5123
0.6743 49900 7.464
0.6757 50000 7.4098
0.6770 50100 7.4148
0.6784 50200 7.4082
0.6797 50300 7.3967
0.6811 50400 7.3874
0.6825 50500 7.4043
0.6838 50600 7.4128
0.6852 50700 7.4315
0.6865 50800 7.3707
0.6879 50900 7.3922
0.6892 51000 7.3879
0.6906 51100 7.3663
0.6919 51200 7.351
0.6933 51300 7.4023
0.6946 51400 7.4772
0.6960 51500 7.4031
0.6973 51600 7.4171
0.6987 51700 7.4815
0.7000 51800 7.4803
0.7014 51900 7.4689
0.7027 52000 7.4723
0.7041 52100 7.378
0.7054 52200 7.3806
0.7068 52300 7.4055
0.7081 52400 7.3862
0.7095 52500 7.3901
0.7108 52600 7.3625
0.7122 52700 7.4261
0.7135 52800 7.4476
0.7149 52900 7.4049
0.7162 53000 7.5797
0.7176 53100 7.4061
0.7189 53200 7.3428
0.7203 53300 7.3677
0.7216 53400 7.3938
0.7230 53500 7.4147
0.7243 53600 7.3581
0.7257 53700 7.3999
0.7270 53800 7.3785
0.7284 53900 7.3314
0.7297 54000 7.4211
0.7311 54100 7.4208
0.7325 54200 7.4104
0.7338 54300 7.4082
0.7352 54400 7.4121
0.7365 54500 7.4129
0.7379 54600 7.4161
0.7392 54700 7.3773
0.7406 54800 7.4232
0.7419 54900 7.3988
0.7433 55000 7.4255
0.7446 55100 7.4344
0.7460 55200 7.4205
0.7473 55300 7.3482
0.7487 55400 7.4065
0.7500 55500 7.3958
0.7514 55600 7.4531
0.7527 55700 7.3885
0.7541 55800 7.3703
0.7554 55900 7.3876
0.7568 56000 7.3834
0.7581 56100 7.4329
0.7595 56200 7.4173
0.7608 56300 7.4136
0.7622 56400 7.3266
0.7635 56500 7.4004
0.7649 56600 7.4083
0.7662 56700 7.4044
0.7676 56800 7.5311
0.7689 56900 7.4179
0.7703 57000 7.3554
0.7716 57100 7.3405
0.7730 57200 7.3842
0.7743 57300 7.4066
0.7757 57400 7.3378
0.7770 57500 7.3974
0.7784 57600 7.4396
0.7798 57700 7.4185
0.7811 57800 7.4287
0.7825 57900 7.3825
0.7838 58000 7.4411
0.7852 58100 7.3709
0.7865 58200 7.4686
0.7879 58300 7.4221
0.7892 58400 7.3956
0.7906 58500 7.4353
0.7919 58600 7.4169
0.7933 58700 7.4381
0.7946 58800 7.3544
0.7960 58900 7.422
0.7973 59000 7.3677
0.7987 59100 7.3846
0.8000 59200 7.4081
0.8014 59300 7.3805
0.8027 59400 7.3444
0.8041 59500 7.3599
0.8054 59600 7.4212
0.8068 59700 7.3832
0.8081 59800 7.4389
0.8095 59900 7.3765
0.8108 60000 7.3486
0.8122 60100 7.4354
0.8135 60200 7.3653
0.8149 60300 7.3945
0.8162 60400 7.3774
0.8176 60500 7.4478
0.8189 60600 7.3861
0.8203 60700 7.4354
0.8216 60800 7.4178
0.8230 60900 7.4308
0.8243 61000 7.3851
0.8257 61100 7.3675
0.8270 61200 7.4111
0.8284 61300 7.3623
0.8298 61400 7.4197
0.8311 61500 7.3637
0.8325 61600 7.3418
0.8338 61700 7.3717
0.8352 61800 7.324
0.8365 61900 7.4535
0.8379 62000 7.3742
0.8392 62100 7.4178
0.8406 62200 7.3333
0.8419 62300 7.3667
0.8433 62400 7.3958
0.8446 62500 7.3854
0.8460 62600 7.4761
0.8473 62700 7.3554
0.8487 62800 7.4181
0.8500 62900 7.4087
0.8514 63000 7.4377
0.8527 63100 7.3917
0.8541 63200 7.3472
0.8554 63300 7.4259
0.8568 63400 7.4258
0.8581 63500 7.408
0.8595 63600 7.3158
0.8608 63700 7.4015
0.8622 63800 7.3501
0.8635 63900 7.4688
0.8649 64000 7.4317
0.8662 64100 7.3871
0.8676 64200 7.4343
0.8689 64300 7.3576
0.8703 64400 7.3791
0.8716 64500 7.414
0.8730 64600 7.4799
0.8743 64700 7.3808
0.8757 64800 7.3938
0.8771 64900 7.4081
0.8784 65000 7.3366
0.8798 65100 7.4195
0.8811 65200 7.4037
0.8825 65300 7.4083
0.8838 65400 7.3851
0.8852 65500 7.3742
0.8865 65600 7.3499
0.8879 65700 7.4107
0.8892 65800 7.3621
0.8906 65900 7.4014
0.8919 66000 7.3203
0.8933 66100 7.3508
0.8946 66200 7.4091
0.8960 66300 7.334
0.8973 66400 7.4821
0.8987 66500 7.3774
0.9000 66600 7.3702
0.9014 66700 7.4023
0.9027 66800 7.3818
0.9041 66900 7.3627
0.9054 67000 7.4259
0.9068 67100 7.4325
0.9081 67200 7.4175
0.9095 67300 7.3796
0.9108 67400 7.3441
0.9122 67500 7.4127
0.9135 67600 7.3657
0.9149 67700 7.4521
0.9162 67800 7.334
0.9176 67900 7.3521
0.9189 68000 7.5717
0.9203 68100 7.3565
0.9216 68200 7.3432
0.9230 68300 7.3946
0.9243 68400 7.4314
0.9257 68500 7.4264
0.9271 68600 7.5982
0.9284 68700 7.3387
0.9298 68800 7.3901
0.9311 68900 7.4055
0.9325 69000 7.3495
0.9338 69100 7.353
0.9352 69200 7.3658
0.9365 69300 7.4457
0.9379 69400 7.3064
0.9392 69500 7.436
0.9406 69600 7.3606
0.9419 69700 7.4151
0.9433 69800 7.3388
0.9446 69900 7.3324
0.9460 70000 7.37
0.9473 70100 7.3394
0.9487 70200 7.3932
0.9500 70300 7.38
0.9514 70400 7.5234
0.9527 70500 7.3643
0.9541 70600 7.3679
0.9554 70700 7.3575
0.9568 70800 7.3386
0.9581 70900 7.4374
0.9595 71000 7.4561
0.9608 71100 7.4278
0.9622 71200 7.4141
0.9635 71300 7.3704
0.9649 71400 7.3655
0.9662 71500 7.3852
0.9676 71600 7.3723
0.9689 71700 7.3585
0.9703 71800 7.3853
0.9716 71900 7.3615
0.9730 72000 7.3206
0.9744 72100 7.4085
0.9757 72200 7.3448
0.9771 72300 7.3215
0.9784 72400 7.3472
0.9798 72500 7.35
0.9811 72600 7.3183
0.9825 72700 7.3448
0.9838 72800 7.3554
0.9852 72900 7.3596
0.9865 73000 7.4219
0.9879 73100 7.3713
0.9892 73200 7.3795
0.9906 73300 7.3415
0.9919 73400 7.3097
0.9933 73500 7.3453
0.9946 73600 7.4293
0.9960 73700 7.4194
0.9973 73800 7.4188
0.9987 73900 7.3359

Framework Versions

  • Python: 3.12.3
  • Sentence Transformers: 5.1.0
  • Transformers: 4.55.4
  • PyTorch: 2.5.1+cu121
  • Accelerate: 1.10.1
  • Datasets: 4.0.0
  • Tokenizers: 0.21.4
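
To pin an environment matching the versions above (a hedged one-liner; the +cu121 PyTorch build may additionally need the matching extra index URL):

pip install sentence-transformers==5.1.0 transformers==4.55.4 torch==2.5.1 accelerate==1.10.1 datasets==4.0.0 tokenizers==0.21.4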

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CoSENTLoss

@online{kexuefm-8847,
    title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
    author={Su Jianlin},
    year={2022},
    month={Jan},
    url={https://kexue.fm/archives/8847},
}