all-MiniLM-L6-v45-pair_score

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L6-v2 on the pairs_with_scores_v38 dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset: pairs_with_scores_v38
  • Model Size: 22.7M parameters (F32)

Model Sources

  • Documentation: Sentence Transformers Documentation (https://www.sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformer models on the Hugging Face Hub

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
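
The Pooling module above is configured for mean pooling (pooling_mode_mean_tokens: True): token embeddings are averaged across the sequence, with padding positions masked out. A minimal NumPy sketch of that operation (the array shapes and the 1/0 mask convention are illustrative assumptions, not the library internals):

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings, ignoring padding positions.

    token_embeddings: (batch, seq_len, dim)
    attention_mask:   (batch, seq_len), 1 for real tokens, 0 for padding
    """
    mask = attention_mask[..., None].astype(token_embeddings.dtype)  # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=1)                   # (batch, dim)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)                   # avoid division by zero
    return summed / counts

# Two toy "sentences" of two tokens each; the second sentence has one padding slot.
tokens = np.array([
    [[1.0, 2.0], [3.0, 4.0]],
    [[2.0, 2.0], [9.0, 9.0]],   # second token is padding and must not count
])
mask = np.array([[1, 1], [1, 0]])
pooled = mean_pool(tokens, mask)
print(pooled)  # rows: [2, 3] (mean of both tokens) and [2, 2] (first token only)
```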

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("KhaledReda/all-MiniLM-L6-v45-pair_score")
# Run inference
sentences = [
    'outdoor sports smart watch',
    'wigglers category tags wigglers organic compost fertilizers wigglers keywords wigglers attrs brand growpro generic name wigglers weight 250 gr color description revolutionize your egyptian garden with the power of wigglers these mighty earthworms are nature s ultimate soil warriors tirelessly working to aerate fertilize and enrich your soil. specially selected for egypt s climate our wigglers are the perfect companions for organic gardening enthusiasts. whether you re a seasoned gardener or just starting out these hardworking worms will enhance soil fertility promote healthier plants and increase yields. elevate your gardening experience and harness the incredible benefits of wigglers for a greener more vibrant garden in egypt. get your wigglers today and watch your garden thrive',
    'turkish category tags turkish turkish turkish turkish turkish keywords turkish turkish turkish turkish turkish',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[ 1.0000, -0.1168, -0.1112],
#         [-0.1168,  1.0000,  0.1334],
#         [-0.1112,  0.1334,  1.0000]])
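
Because the final Normalize() module scales every embedding to unit L2 norm, cosine similarity between embeddings reduces to a plain dot product. A small NumPy sketch of this property (random unit vectors stand in for real model.encode output):

```python
import numpy as np

def normalize(v: np.ndarray) -> np.ndarray:
    # Mirrors the model's final Normalize() module: scale each row to unit L2 norm.
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

rng = np.random.default_rng(0)
emb = normalize(rng.normal(size=(3, 384)))  # stand-in for model.encode(sentences)

# Plain dot product between rows...
dot = emb @ emb.T
# ...versus the full cosine-similarity formula.
norms = np.linalg.norm(emb, axis=1)
cos = (emb @ emb.T) / (norms[:, None] * norms[None, :])

print(np.allclose(dot, cos))              # True: on unit vectors they coincide
print(np.allclose(np.diag(dot), 1.0))     # True: each vector matches itself
```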

Training Details

Training Dataset

pairs_with_scores_v38

  • Dataset: pairs_with_scores_v38 at 5e4a543
  • Size: 10,827,162 training samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1 (string): min 3, mean 4.84, max 18 tokens
    • sentence2 (string): min 6, mean 56.82, max 256 tokens
    • score (float): min 0.0, mean 0.05, max 1.0
  • Samples:
    • sentence1: creamy sauce chicken
      sentence2: 2 solution category hair loss hair loss tags hair regrowth spray spray keywords hair regrowth spray spray
      score: 0.0
    • sentence1: calamyl moisturizing crem
      sentence2: round gold chain category tags metal chain round keywords chain round attrs gender women brand trio generic name features round chain types of styles casual material metal color gold
      score: 0.0
    • sentence1: shank
      sentence2: coconut scent category tags customizable personalized coconut scent peony keywords coconut scent peony description 8.5 diameter color and scent customized upon request.
      score: 0.0
  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
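
CoSENTLoss is a ranking loss over pairs of training examples: whenever one example has a higher gold score than another, the loss penalizes the model if the lower-scored example gets the higher cosine similarity, with the scale factor (20.0 here) sharpening that penalty. A self-contained NumPy sketch of the formula (an illustration of the math, not the library implementation):

```python
import numpy as np

def cosent_loss(cos_sims: np.ndarray, labels: np.ndarray, scale: float = 20.0) -> float:
    """CoSENT sketch: loss = log(1 + sum over ordered pairs (i, j) with
    labels[i] < labels[j] of exp(scale * (cos_sims[i] - cos_sims[j])))."""
    diffs = []
    n = len(cos_sims)
    for i in range(n):
        for j in range(n):
            if labels[i] < labels[j]:
                # The lower-scored example i should NOT out-rank example j.
                diffs.append(scale * (cos_sims[i] - cos_sims[j]))
    return float(np.log1p(np.sum(np.exp(diffs)))) if diffs else 0.0

# Cosines ordered consistently with the gold scores -> near-zero loss.
good = cosent_loss(np.array([0.9, 0.1]), np.array([1.0, 0.0]))
# Inverted ordering -> large loss.
bad = cosent_loss(np.array([0.1, 0.9]), np.array([1.0, 0.0]))
print(good < bad)  # True
```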
    

Evaluation Dataset

pairs_with_scores_v38

  • Dataset: pairs_with_scores_v38 at 5e4a543
  • Size: 54,408 evaluation samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1 (string): min 3, mean 4.92, max 19 tokens
    • sentence2 (string): min 6, mean 58.2, max 235 tokens
    • score (float): min 0.0, mean 0.04, max 1.0
  • Samples:
    • sentence1: multicolor
      sentence2: leather in black category tags keywords attrs gender women brand stache generic name size s material leather color black
      score: 0.0
    • sentence1: alkapress
      sentence2: valley - 26 cm category tags valley keywords valley description valley s unique use of geometric patterns pays homage to tunisia s rich heritage. tu-va 6100126
      score: 0.0
    • sentence1: phone accessory
      sentence2: savannah category tags savannah keywords savannah attrs gender women brand minimal generic name product name savannah size one size types of styles casual bohemian everyday wear neckline v-neck sleeve style short sleeve fit loose fit season spring summer
      score: 0.0
  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • fp16: True
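
These non-default values map directly onto SentenceTransformerTrainingArguments in the Sentence Transformers v3+ training API. A configuration sketch (the output_dir is a hypothetical placeholder, not the path used for this model):

```python
# Sketch only: reproduces the non-default hyperparameters listed above.
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="all-MiniLM-L6-v45-pair_score",  # hypothetical output path
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=2e-5,
    num_train_epochs=1,
    warmup_ratio=0.1,
    fp16=True,
)
```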

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss
0.0012 100 3.0011
0.0024 200 3.1188
0.0035 300 2.702
0.0047 400 2.1866
0.0059 500 1.8666
0.0071 600 2.0708
0.0083 700 1.912
0.0095 800 1.8157
0.0106 900 1.456
0.0118 1000 1.1928
0.0130 1100 1.5215
0.0142 1200 1.4088
0.0154 1300 1.2589
0.0166 1400 1.2721
0.0177 1500 1.2735
0.0189 1600 1.2677
0.0201 1700 1.0044
0.0213 1800 1.1296
0.0225 1900 1.0031
0.0236 2000 0.8479
0.0248 2100 0.9297
0.0260 2200 0.9659
0.0272 2300 0.8802
0.0284 2400 0.7423
0.0296 2500 0.7238
0.0307 2600 0.7042
0.0319 2700 0.8845
0.0331 2800 0.7968
0.0343 2900 0.7393
0.0355 3000 0.6384
0.0366 3100 0.9247
0.0378 3200 0.6522
0.0390 3300 0.6111
0.0402 3400 0.7083
0.0414 3500 0.6053
0.0426 3600 0.7644
0.0437 3700 0.6411
0.0449 3800 0.5584
0.0461 3900 0.5369
0.0473 4000 0.6116
0.0485 4100 0.463
0.0497 4200 0.5766
0.0508 4300 0.547
0.0520 4400 0.5447
0.0532 4500 0.5169
0.0544 4600 0.4696
0.0556 4700 0.6726
0.0567 4800 0.554
0.0579 4900 0.5802
0.0591 5000 0.6521
0.0603 5100 0.54
0.0615 5200 0.6098
0.0627 5300 0.5981
0.0638 5400 0.7111
0.0650 5500 0.5905
0.0662 5600 0.5065
0.0674 5700 0.6835
0.0686 5800 0.7086
0.0697 5900 0.5259
0.0709 6000 0.6127
0.0721 6100 0.507
0.0733 6200 0.6596
0.0745 6300 0.4578
0.0757 6400 0.5081
0.0768 6500 0.5813
0.0780 6600 0.5689
0.0792 6700 0.5149
0.0804 6800 0.8857
0.0816 6900 0.6218
0.0828 7000 0.3192
0.0839 7100 0.5446
0.0851 7200 0.4598
0.0863 7300 0.5047
0.0875 7400 0.3884
0.0887 7500 0.447
0.0898 7600 0.4975
0.0910 7700 0.4983
0.0922 7800 0.6884
0.0934 7900 0.7074
0.0946 8000 0.6115
0.0958 8100 0.4229
0.0969 8200 0.5483
0.0981 8300 0.4274
0.0993 8400 0.2982
0.1005 8500 0.4978
0.1017 8600 0.4267
0.1029 8700 0.5728
0.1040 8800 0.524
0.1052 8900 0.4712
0.1064 9000 0.523
0.1076 9100 0.4475
0.1088 9200 0.46
0.1099 9300 0.5011
0.1111 9400 0.4889
0.1123 9500 0.3932
0.1135 9600 0.5747
0.1147 9700 0.3885
0.1159 9800 0.5188
0.1170 9900 0.4613
0.1182 10000 0.417
0.1194 10100 0.5523
0.1206 10200 0.4751
0.1218 10300 0.4184
0.1229 10400 0.4386
0.1241 10500 0.4688
0.1253 10600 0.4059
0.1265 10700 0.5529
0.1277 10800 0.6302
0.1289 10900 0.566
0.1300 11000 0.4066
0.1312 11100 0.3974
0.1324 11200 0.4391
0.1336 11300 0.2897
0.1348 11400 0.5067
0.1360 11500 0.5282
0.1371 11600 0.4356
0.1383 11700 0.396
0.1395 11800 0.4778
0.1407 11900 0.1936
0.1419 12000 0.3093
0.1430 12100 0.5292
0.1442 12200 0.3002
0.1454 12300 0.5085
0.1466 12400 0.4501
0.1478 12500 0.2488
0.1490 12600 0.3276
0.1501 12700 0.4537
0.1513 12800 0.5538
0.1525 12900 0.3487
0.1537 13000 0.3601
0.1549 13100 0.2983
0.1561 13200 0.4569
0.1572 13300 0.2914
0.1584 13400 0.5063
0.1596 13500 0.4396
0.1608 13600 0.3497
0.1620 13700 0.4536
0.1631 13800 0.5392
0.1643 13900 0.3739
0.1655 14000 0.3998
0.1667 14100 0.4318
0.1679 14200 0.3659
0.1691 14300 0.4774
0.1702 14400 0.3459
0.1714 14500 0.3431
0.1726 14600 0.4075
0.1738 14700 0.4312
0.1750 14800 0.4297
0.1761 14900 0.4807
0.1773 15000 0.3426
0.1785 15100 0.3523
0.1797 15200 0.472
0.1809 15300 0.434
0.1821 15400 0.528
0.1832 15500 0.4272
0.1844 15600 0.4504
0.1856 15700 0.4886
0.1868 15800 0.3516
0.1880 15900 0.3879
0.1892 16000 0.2654
0.1903 16100 0.3661
0.1915 16200 0.3489
0.1927 16300 0.3325
0.1939 16400 0.3594
0.1951 16500 0.3784
0.1962 16600 0.5795
0.1974 16700 0.2197
0.1986 16800 0.3513
0.1998 16900 0.4946
0.2010 17000 0.2278
0.2022 17100 0.3707
0.2033 17200 0.5117
0.2045 17300 0.5504
0.2057 17400 0.4665
0.2069 17500 0.4916
0.2081 17600 0.4502
0.2092 17700 0.4014
0.2104 17800 0.2362
0.2116 17900 0.3995
0.2128 18000 0.372
0.2140 18100 0.3913
0.2152 18200 0.2522
0.2163 18300 0.2747
0.2175 18400 0.3137
0.2187 18500 0.2479
0.2199 18600 0.3771
0.2211 18700 0.2902
0.2223 18800 0.2752
0.2234 18900 0.3635
0.2246 19000 0.4402
0.2258 19100 0.3342
0.2270 19200 0.3303
0.2282 19300 0.3801
0.2293 19400 0.4835
0.2305 19500 0.4257
0.2317 19600 0.2798
0.2329 19700 0.321
0.2341 19800 0.2779
0.2353 19900 0.3611
0.2364 20000 0.3353
0.2376 20100 0.2631
0.2388 20200 0.2941
0.2400 20300 0.2123
0.2412 20400 0.2888
0.2424 20500 0.2814
0.2435 20600 0.4901
0.2447 20700 0.2923
0.2459 20800 0.3386
0.2471 20900 0.2815
0.2483 21000 0.3529
0.2494 21100 0.2674
0.2506 21200 0.3017
0.2518 21300 0.2603
0.2530 21400 0.2724
0.2542 21500 0.2933
0.2554 21600 0.3194
0.2565 21700 0.3594
0.2577 21800 0.2353
0.2589 21900 0.1686
0.2601 22000 0.5729
0.2613 22100 0.2403
0.2624 22200 0.477
0.2636 22300 0.2741
0.2648 22400 0.3584
0.2660 22500 0.34
0.2672 22600 0.142
0.2684 22700 0.5215
0.2695 22800 0.3599
0.2707 22900 0.392
0.2719 23000 0.21
0.2731 23100 0.3272
0.2743 23200 0.1939
0.2755 23300 0.2996
0.2766 23400 0.4124
0.2778 23500 0.2191
0.2790 23600 0.4189
0.2802 23700 0.3127
0.2814 23800 0.2836
0.2825 23900 0.4377
0.2837 24000 0.2477
0.2849 24100 0.1823
0.2861 24200 0.3345
0.2873 24300 0.1592
0.2885 24400 0.2826
0.2896 24500 0.2746
0.2908 24600 0.3893
0.2920 24700 0.2831
0.2932 24800 0.2058
0.2944 24900 0.3126
0.2956 25000 0.4332
0.2967 25100 0.3453
0.2979 25200 0.2358
0.2991 25300 0.3359
0.3003 25400 0.2262
0.3015 25500 0.2517
0.3026 25600 0.3285
0.3038 25700 0.2966
0.3050 25800 0.2261
0.3062 25900 0.2621
0.3074 26000 0.447
0.3086 26100 0.1679
0.3097 26200 0.2305
0.3109 26300 0.3093
0.3121 26400 0.3461
0.3133 26500 0.1874
0.3145 26600 0.2953
0.3156 26700 0.2159
0.3168 26800 0.3724
0.3180 26900 0.3694
0.3192 27000 0.2765
0.3204 27100 0.3718
0.3216 27200 0.2275
0.3227 27300 0.3712
0.3239 27400 0.3034
0.3251 27500 0.2761
0.3263 27600 0.3387
0.3275 27700 0.2386
0.3287 27800 0.2951
0.3298 27900 0.2373
0.3310 28000 0.3166
0.3322 28100 0.1861
0.3334 28200 0.3138
0.3346 28300 0.2721
0.3357 28400 0.2558
0.3369 28500 0.4647
0.3381 28600 0.3267
0.3393 28700 0.186
0.3405 28800 0.2337
0.3417 28900 0.2898
0.3428 29000 0.3932
0.3440 29100 0.3455
0.3452 29200 0.3812
0.3464 29300 0.2193
0.3476 29400 0.4008
0.3487 29500 0.5107
0.3499 29600 0.2428
0.3511 29700 0.183
0.3523 29800 0.2731
0.3535 29900 0.4061
0.3547 30000 0.2645
0.3558 30100 0.1469
0.3570 30200 0.1441
0.3582 30300 0.4598
0.3594 30400 0.3069
0.3606 30500 0.2
0.3618 30600 0.318
0.3629 30700 0.483
0.3641 30800 0.3724
0.3653 30900 0.4162
0.3665 31000 0.2893
0.3677 31100 0.1603
0.3688 31200 0.2551
0.3700 31300 0.2852
0.3712 31400 0.2831
0.3724 31500 0.241
0.3736 31600 0.4407
0.3748 31700 0.3294
0.3759 31800 0.2951
0.3771 31900 0.4059
0.3783 32000 0.2387
0.3795 32100 0.2159
0.3807 32200 0.196
0.3819 32300 0.3167
0.3830 32400 0.31
0.3842 32500 0.2916
0.3854 32600 0.4266
0.3866 32700 0.3467
0.3878 32800 0.264
0.3889 32900 0.3363
0.3901 33000 0.2973
0.3913 33100 0.2986
0.3925 33200 0.1551
0.3937 33300 0.3737
0.3949 33400 0.2519
0.3960 33500 0.1205
0.3972 33600 0.3429
0.3984 33700 0.3489
0.3996 33800 0.267
0.4008 33900 0.2295
0.4019 34000 0.498
0.4031 34100 0.2181
0.4043 34200 0.3637
0.4055 34300 0.2737
0.4067 34400 0.237
0.4079 34500 0.4576
0.4090 34600 0.2861
0.4102 34700 0.2962
0.4114 34800 0.3223
0.4126 34900 0.1787
0.4138 35000 0.1651
0.4150 35100 0.218
0.4161 35200 0.3494
0.4173 35300 0.254
0.4185 35400 0.2484
0.4197 35500 0.2111
0.4209 35600 0.2182
0.4220 35700 0.2826
0.4232 35800 0.3862
0.4244 35900 0.1778
0.4256 36000 0.2026
0.4268 36100 0.4062
0.4280 36200 0.3377
0.4291 36300 0.2501
0.4303 36400 0.1971
0.4315 36500 0.1965
0.4327 36600 0.2112
0.4339 36700 0.2779
0.4350 36800 0.2446
0.4362 36900 0.1647
0.4374 37000 0.372
0.4386 37100 0.2256
0.4398 37200 0.4294
0.4410 37300 0.2305
0.4421 37400 0.2338
0.4433 37500 0.1425
0.4445 37600 0.27
0.4457 37700 0.3507
0.4469 37800 0.3602
0.4481 37900 0.208
0.4492 38000 0.5506
0.4504 38100 0.3372
0.4516 38200 0.2552
0.4528 38300 0.2641
0.4540 38400 0.2489
0.4551 38500 0.2349
0.4563 38600 0.4253
0.4575 38700 0.1115
0.4587 38800 0.2437
0.4599 38900 0.2427
0.4611 39000 0.2807
0.4622 39100 0.1583
0.4634 39200 0.3696
0.4646 39300 0.2216
0.4658 39400 0.2269
0.4670 39500 0.2348
0.4682 39600 0.1744
0.4693 39700 0.4657
0.4705 39800 0.2744
0.4717 39900 0.2182
0.4729 40000 0.3068
0.4741 40100 0.2823
0.4752 40200 0.2098
0.4764 40300 0.14
0.4776 40400 0.2795
0.4788 40500 0.2503
0.4800 40600 0.2208
0.4812 40700 0.281
0.4823 40800 0.322
0.4835 40900 0.1852
0.4847 41000 0.2546
0.4859 41100 0.2212
0.4871 41200 0.2098
0.4882 41300 0.1822
0.4894 41400 0.1093
0.4906 41500 0.2268
0.4918 41600 0.2133
0.4930 41700 0.254
0.4942 41800 0.184
0.4953 41900 0.1988
0.4965 42000 0.1867
0.4977 42100 0.1983
0.4989 42200 0.4042
0.5001 42300 0.4507
0.5013 42400 0.2423
0.5024 42500 0.1334
0.5036 42600 0.2268
0.5048 42700 0.3753
0.5060 42800 0.2982
0.5072 42900 0.2205
0.5083 43000 0.2951
0.5095 43100 0.2851
0.5107 43200 0.1809
0.5119 43300 0.3159
0.5131 43400 0.382
0.5143 43500 0.2359
0.5154 43600 0.1513
0.5166 43700 0.1668
0.5178 43800 0.2322
0.5190 43900 0.2437
0.5202 44000 0.2542
0.5214 44100 0.2593
0.5225 44200 0.1744
0.5237 44300 0.2984
0.5249 44400 0.1511
0.5261 44500 0.2455
0.5273 44600 0.1389
0.5284 44700 0.3415
0.5296 44800 0.3254
0.5308 44900 0.2966
0.5320 45000 0.2549
0.5332 45100 0.2043
0.5344 45200 0.3562
0.5355 45300 0.325
0.5367 45400 0.3129
0.5379 45500 0.2292
0.5391 45600 0.4046
0.5403 45700 0.2194
0.5414 45800 0.2
0.5426 45900 0.148
0.5438 46000 0.2381
0.5450 46100 0.2145
0.5462 46200 0.2321
0.5474 46300 0.3046
0.5485 46400 0.2048
0.5497 46500 0.1872
0.5509 46600 0.1826
0.5521 46700 0.2892
0.5533 46800 0.2199
0.5545 46900 0.1905
0.5556 47000 0.1966
0.5568 47100 0.281
0.5580 47200 0.3232
0.5592 47300 0.302
0.5604 47400 0.1269
0.5615 47500 0.0806
0.5627 47600 0.3116
0.5639 47700 0.1909
0.5651 47800 0.11
0.5663 47900 0.301
0.5675 48000 0.1455
0.5686 48100 0.2442
0.5698 48200 0.1955
0.5710 48300 0.3293
0.5722 48400 0.1447
0.5734 48500 0.3026
0.5745 48600 0.2611
0.5757 48700 0.1316
0.5769 48800 0.3313
0.5781 48900 0.2435
0.5793 49000 0.2051
0.5805 49100 0.281
0.5816 49200 0.4651
0.5828 49300 0.2014
0.5840 49400 0.2743
0.5852 49500 0.1366
0.5864 49600 0.1223
0.5876 49700 0.218
0.5887 49800 0.2868
0.5899 49900 0.1618
0.5911 50000 0.2116
0.5923 50100 0.2716
0.5935 50200 0.1612
0.5946 50300 0.1549
0.5958 50400 0.218
0.5970 50500 0.106
0.5982 50600 0.3294
0.5994 50700 0.1903
0.6006 50800 0.3658
0.6017 50900 0.2213
0.6029 51000 0.1229
0.6041 51100 0.289
0.6053 51200 0.3822
0.6065 51300 0.2053
0.6077 51400 0.1599
0.6088 51500 0.1261
0.6100 51600 0.1338
0.6112 51700 0.1417
0.6124 51800 0.1983
0.6136 51900 0.3362
0.6147 52000 0.1495
0.6159 52100 0.1899
0.6171 52200 0.246
0.6183 52300 0.319
0.6195 52400 0.2951
0.6207 52500 0.1243
0.6218 52600 0.2563
0.6230 52700 0.191
0.6242 52800 0.3106
0.6254 52900 0.1707
0.6266 53000 0.2719
0.6277 53100 0.1056
0.6289 53200 0.2603
0.6301 53300 0.2132
0.6313 53400 0.2362
0.6325 53500 0.1517
0.6337 53600 0.1552
0.6348 53700 0.1852
0.6360 53800 0.2365
0.6372 53900 0.1801
0.6384 54000 0.2784
0.6396 54100 0.3607
0.6408 54200 0.2457
0.6419 54300 0.2772
0.6431 54400 0.2288
0.6443 54500 0.3408
0.6455 54600 0.2904
0.6467 54700 0.1665
0.6478 54800 0.1763
0.6490 54900 0.1696
0.6502 55000 0.2586
0.6514 55100 0.1744
0.6526 55200 0.185
0.6538 55300 0.2609
0.6549 55400 0.3522
0.6561 55500 0.1749
0.6573 55600 0.1988
0.6585 55700 0.1897
0.6597 55800 0.2397
0.6609 55900 0.1677
0.6620 56000 0.4117
0.6632 56100 0.2489
0.6644 56200 0.1876
0.6656 56300 0.1634
0.6668 56400 0.2486
0.6679 56500 0.3521
0.6691 56600 0.1474
0.6703 56700 0.1088
0.6715 56800 0.3521
0.6727 56900 0.2327
0.6739 57000 0.1259
0.6750 57100 0.2464
0.6762 57200 0.1698
0.6774 57300 0.292
0.6786 57400 0.1067
0.6798 57500 0.1909
0.6809 57600 0.1196
0.6821 57700 0.3523
0.6833 57800 0.1899
0.6845 57900 0.3315
0.6857 58000 0.36
0.6869 58100 0.1868
0.6880 58200 0.0783
0.6892 58300 0.203
0.6904 58400 0.2928
0.6916 58500 0.1506
0.6928 58600 0.244
0.6940 58700 0.2145
0.6951 58800 0.1517
0.6963 58900 0.1806
0.6975 59000 0.349
0.6987 59100 0.1427
0.6999 59200 0.1945
0.7010 59300 0.1912
0.7022 59400 0.1737
0.7034 59500 0.1589
0.7046 59600 0.1207
0.7058 59700 0.219
0.7070 59800 0.1421
0.7081 59900 0.1325
0.7093 60000 0.2163
0.7105 60100 0.1793
0.7117 60200 0.1418
0.7129 60300 0.1328
0.7140 60400 0.1868
0.7152 60500 0.2235
0.7164 60600 0.1998
0.7176 60700 0.3268
0.7188 60800 0.2645
0.7200 60900 0.251
0.7211 61000 0.2678
0.7223 61100 0.2007
0.7235 61200 0.2702
0.7247 61300 0.2013
0.7259 61400 0.2137
0.7271 61500 0.1684
0.7282 61600 0.2005
0.7294 61700 0.1853
0.7306 61800 0.1019
0.7318 61900 0.2857
0.7330 62000 0.2217
0.7341 62100 0.229
0.7353 62200 0.1762
0.7365 62300 0.147
0.7377 62400 0.2027
0.7389 62500 0.1486
0.7401 62600 0.0796
0.7412 62700 0.2477
0.7424 62800 0.169
0.7436 62900 0.2577
0.7448 63000 0.2607
0.7460 63100 0.0863
0.7472 63200 0.1393
0.7483 63300 0.1486
0.7495 63400 0.1343
0.7507 63500 0.1787
0.7519 63600 0.1812
0.7531 63700 0.2089
0.7542 63800 0.2912
0.7554 63900 0.1511
0.7566 64000 0.2044
0.7578 64100 0.3588
0.7590 64200 0.1315
0.7602 64300 0.1951
0.7613 64400 0.1738
0.7625 64500 0.3529
0.7637 64600 0.3025
0.7649 64700 0.1893
0.7661 64800 0.3588
0.7672 64900 0.2323
0.7684 65000 0.2435
0.7696 65100 0.1503
0.7708 65200 0.2285
0.7720 65300 0.1856
0.7732 65400 0.1878
0.7743 65500 0.2569
0.7755 65600 0.1553
0.7767 65700 0.1378
0.7779 65800 0.1398
0.7791 65900 0.2696
0.7803 66000 0.1928
0.7814 66100 0.1289
0.7826 66200 0.1539
0.7838 66300 0.3213
0.7850 66400 0.1173
0.7862 66500 0.1969
0.7873 66600 0.1642
0.7885 66700 0.2291
0.7897 66800 0.2609
0.7909 66900 0.2279
0.7921 67000 0.1902
0.7933 67100 0.1036
0.7944 67200 0.1954
0.7956 67300 0.1375
0.7968 67400 0.2393
0.7980 67500 0.2051
0.7992 67600 0.186
0.8003 67700 0.1368
0.8015 67800 0.1505
0.8027 67900 0.113
0.8039 68000 0.1533
0.8051 68100 0.1741
0.8063 68200 0.1154
0.8074 68300 0.2375
0.8086 68400 0.2681
0.8098 68500 0.2329
0.8110 68600 0.1549
0.8122 68700 0.1407
0.8134 68800 0.1875
0.8145 68900 0.1684
0.8157 69000 0.327
0.8169 69100 0.161
0.8181 69200 0.2245
0.8193 69300 0.0945
0.8204 69400 0.1376
0.8216 69500 0.1955
0.8228 69600 0.1514
0.8240 69700 0.2525
0.8252 69800 0.1432
0.8264 69900 0.1585
0.8275 70000 0.2364
0.8287 70100 0.1341
0.8299 70200 0.0829
0.8311 70300 0.0944
0.8323 70400 0.2952
0.8335 70500 0.3315
0.8346 70600 0.2225
0.8358 70700 0.1873
0.8370 70800 0.164
0.8382 70900 0.1399
0.8394 71000 0.2327
0.8405 71100 0.1962
0.8417 71200 0.2469
0.8429 71300 0.2247
0.8441 71400 0.1333
0.8453 71500 0.1753
0.8465 71600 0.1417
0.8476 71700 0.0978
0.8488 71800 0.1675
0.8500 71900 0.2479
0.8512 72000 0.2256
0.8524 72100 0.1176
0.8535 72200 0.2608
0.8547 72300 0.1199
0.8559 72400 0.2638
0.8571 72500 0.2186
0.8583 72600 0.1652
0.8595 72700 0.1627
0.8606 72800 0.2129
0.8618 72900 0.1206
0.8630 73000 0.1989
0.8642 73100 0.1513
0.8654 73200 0.1579
0.8666 73300 0.2043
0.8677 73400 0.1531
0.8689 73500 0.1164
0.8701 73600 0.1754
0.8713 73700 0.3191
0.8725 73800 0.2397
0.8736 73900 0.0874
0.8748 74000 0.2364
0.8760 74100 0.145
0.8772 74200 0.1775
0.8784 74300 0.1724
0.8796 74400 0.1303
0.8807 74500 0.1858
0.8819 74600 0.2444
0.8831 74700 0.0803
0.8843 74800 0.1852
0.8855 74900 0.1755
0.8867 75000 0.1407
0.8878 75100 0.1773
0.8890 75200 0.2912
0.8902 75300 0.1203
0.8914 75400 0.2338
0.8926 75500 0.1733
0.8937 75600 0.1118
0.8949 75700 0.1831
0.8961 75800 0.4156
0.8973 75900 0.1494
0.8985 76000 0.072
0.8997 76100 0.1744
0.9008 76200 0.1091
0.9020 76300 0.1464
0.9032 76400 0.2111
0.9044 76500 0.0763
0.9056 76600 0.1468
0.9067 76700 0.1483
0.9079 76800 0.1197
0.9091 76900 0.2614
0.9103 77000 0.1826
0.9115 77100 0.2556
0.9127 77200 0.2355
0.9138 77300 0.1621
0.9150 77400 0.1812
0.9162 77500 0.08
0.9174 77600 0.0916
0.9186 77700 0.2734
0.9198 77800 0.1764
0.9209 77900 0.1261
0.9221 78000 0.1266
0.9233 78100 0.2578
0.9245 78200 0.147
0.9257 78300 0.0892
0.9268 78400 0.2436
0.9280 78500 0.3233
0.9292 78600 0.2142
0.9304 78700 0.2231
0.9316 78800 0.1546
0.9328 78900 0.1716
0.9339 79000 0.1514
0.9351 79100 0.064
0.9363 79200 0.1428
0.9375 79300 0.2453
0.9387 79400 0.348
0.9398 79500 0.2739
0.9410 79600 0.2446
0.9422 79700 0.1081
0.9434 79800 0.3078
0.9446 79900 0.1948
0.9458 80000 0.1562
0.9469 80100 0.2414
0.9481 80200 0.1235
0.9493 80300 0.164
0.9505 80400 0.2325
0.9517 80500 0.126
0.9529 80600 0.1135
0.9540 80700 0.1889
0.9552 80800 0.2255
0.9564 80900 0.1105
0.9576 81000 0.1364
0.9588 81100 0.2261
0.9599 81200 0.2825
0.9611 81300 0.1119
0.9623 81400 0.1864
0.9635 81500 0.2259
0.9647 81600 0.1556
0.9659 81700 0.1305
0.9670 81800 0.2178
0.9682 81900 0.1686
0.9694 82000 0.2118
0.9706 82100 0.197
0.9718 82200 0.174
0.9730 82300 0.1995
0.9741 82400 0.1058
0.9753 82500 0.2344
0.9765 82600 0.1284
0.9777 82700 0.0896
0.9789 82800 0.2184
0.9800 82900 0.2321
0.9812 83000 0.1135
0.9824 83100 0.1824
0.9836 83200 0.0695
0.9848 83300 0.1236
0.9860 83400 0.1259
0.9871 83500 0.0869
0.9883 83600 0.1957
0.9895 83700 0.2565
0.9907 83800 0.1731
0.9919 83900 0.1427
0.9930 84000 0.1762
0.9942 84100 0.1108
0.9954 84200 0.1748
0.9966 84300 0.0978
0.9978 84400 0.1419
0.9990 84500 0.1937

Framework Versions

  • Python: 3.12.3
  • Sentence Transformers: 5.1.0
  • Transformers: 4.55.4
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.10.1
  • Datasets: 4.0.0
  • Tokenizers: 0.21.4

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CoSENTLoss

@online{kexuefm-8847,
    title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
    author={Su Jianlin},
    year={2022},
    month={Jan},
    url={https://kexue.fm/archives/8847},
}
Model size: 22.7M parameters (F32, Safetensors)