all-MiniLM-L6-v7-pair_score

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L6-v2 on the pairs_three_scores_v7_balanced_calibrated dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset: pairs_three_scores_v7_balanced_calibrated
  • Model Size: 22.7M parameters (F32)

Model Sources

  • Documentation: Sentence Transformers Documentation (https://www.sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: https://huggingface.co/KhaledReda/all-MiniLM-L6-v7-pair_score

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
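
The three modules correspond to a short sequence of plain transformers calls: the BertModel yields per-token embeddings, the Pooling module averages them over non-padding tokens, and Normalize scales each vector to unit length so dot products equal cosine similarities. The sketch below reproduces that pipeline by hand; it assumes the finetuned checkpoint loads directly through AutoModel, as Sentence Transformers Hub repos normally do.

import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

repo = "KhaledReda/all-MiniLM-L6-v7-pair_score"
tokenizer = AutoTokenizer.from_pretrained(repo)
encoder = AutoModel.from_pretrained(repo)  # the BertModel from module (0)

batch = tokenizer(
    ["chocolate cookie", "sports wristbands"],
    padding=True, truncation=True, max_length=256, return_tensors="pt",
)
with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state  # (batch, seq_len, 384)

# Module (1): mean-pool over real tokens only, using the attention mask
mask = batch["attention_mask"].unsqueeze(-1).float()
sentence_embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)

# Module (2): L2-normalize so dot products equal cosine similarities
sentence_embeddings = F.normalize(sentence_embeddings, p=2, dim=1)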

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("KhaledReda/all-MiniLM-L6-v7-pair_score")
# Run inference
sentences = [
    'chocolate cookie',
    'women bathing cover',
    'sports wristbands',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.4979, 0.4782],
#         [0.4979, 1.0000, 0.7929],
#         [0.4782, 0.7929, 1.0000]])
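
Since the card lists semantic search among the intended uses, the following sketch ranks a small corpus against a query; the corpus and query strings are invented purely for illustration.

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("KhaledReda/all-MiniLM-L6-v7-pair_score")

# Invented corpus and query, for illustration only
corpus = ["frizz control shampoo", "chocolate cookie", "sports wristbands"]
query = "anti-frizz hair care"

corpus_embeddings = model.encode(corpus)
query_embedding = model.encode([query])

# model.similarity computes cosine similarity (the embeddings are unit-normalized)
scores = model.similarity(query_embedding, corpus_embeddings)[0]
for idx in scores.argsort(descending=True):
    print(f"{scores[idx]:.3f}  {corpus[idx]}")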

Training Details

Training Dataset

pairs_three_scores_v7_balanced_calibrated

  • Dataset: pairs_three_scores_v7_balanced_calibrated at 9c7e882
  • Size: 8,826,496 training samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1: string; min 3 tokens, mean 5.71 tokens, max 45 tokens
    • sentence2: string; min 3 tokens, mean 5.78 tokens, max 45 tokens
    • score: float; min 0.0, mean 0.52, max 1.0
  • Samples:
    • sentence1: "lamb dog food", sentence2: "frizz control shampoo", score: 0.0
    • sentence1: "high fiber muesli", sentence2: "cloth clutch", score: 0.04
    • sentence1: "smoked sausage", sentence2: "ready to cook kofta", score: 0.43
  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
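
CoSENTLoss optimizes the ranking of cosine similarities rather than their absolute values: for every two training pairs whose gold scores are ordered, the loss penalizes the model when the predicted similarities contradict that order. Below is a minimal sketch of the computation on a batch of precomputed pair similarities, as an illustration of the formula rather than the library's internal implementation.

import torch

def cosent_loss(sims: torch.Tensor, gold: torch.Tensor, scale: float = 20.0) -> torch.Tensor:
    """sims: predicted cosine similarity per pair; gold: gold score per pair."""
    # d[i, j] = scale * (sims[j] - sims[i]); this term is penalized whenever
    # the gold scores say pair i should be more similar than pair j
    d = scale * (sims[None, :] - sims[:, None])
    mask = gold[:, None] > gold[None, :]
    logits = torch.where(mask, d, torch.full_like(d, float("-inf")))
    # log(1 + sum(exp(d))) == logsumexp over the masked terms plus one exp(0) term
    logits = torch.cat([logits.flatten(), logits.new_zeros(1)])
    return torch.logsumexp(logits, dim=0)

# e.g. three pairs with gold scores matching the samples above:
print(cosent_loss(torch.tensor([0.48, 0.50, 0.79]), torch.tensor([0.0, 0.04, 0.43])))

The scale of 20.0 matches the parameter above; larger scales sharpen the ranking penalty.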
    

Evaluation Dataset

pairs_three_scores_v7_balanced_calibrated

  • Dataset: pairs_three_scores_v7_balanced_calibrated at 9c7e882
  • Size: 44,355 evaluation samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1: string; min 3 tokens, mean 5.59 tokens, max 44 tokens
    • sentence2: string; min 3 tokens, mean 5.7 tokens, max 31 tokens
    • score: float; min 0.0, mean 0.51, max 1.0
  • Samples:
    • sentence1: "sandwich fries meal", sentence2: "grilled chicken souvlaki wrap", score: 1.0
    • sentence1: "deli", sentence2: "juice glass", score: 0.05
    • sentence1: "classic coffee shake", sentence2: "basic dress", score: 0.05
  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
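
A natural way to track this held-out split during training is a similarity evaluator that reports how well the model's cosine similarities correlate with the gold scores. A sketch follows, with the evaluation columns stubbed out as short illustrative lists.

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("KhaledReda/all-MiniLM-L6-v7-pair_score")

# Stand-ins for the evaluation split's columns (illustrative values only)
sentences1 = ["sandwich fries meal", "classic coffee shake"]
sentences2 = ["grilled chicken souvlaki wrap", "basic dress"]
scores = [1.0, 0.05]

evaluator = EmbeddingSimilarityEvaluator(sentences1, sentences2, scores, name="pairs-dev")
# Reports Pearson/Spearman correlation between cosine similarities and gold scores
print(evaluator(model))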
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • fp16: True
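
These settings map directly onto the Sentence Transformers v3+ trainer API. A minimal sketch of the training setup follows; the dataset path and split names are assumptions, since the card names the dataset but not a loadable Hub id.

from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CoSENTLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# Hypothetical Hub path and split names; columns: sentence1, sentence2, score
dataset = load_dataset("pairs_three_scores_v7_balanced_calibrated")

# pairwise_cos_sim is CoSENTLoss's default similarity_fct
loss = CoSENTLoss(model, scale=20.0)

args = SentenceTransformerTrainingArguments(
    output_dir="all-MiniLM-L6-v7-pair_score",
    num_train_epochs=1,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    fp16=True,
    eval_strategy="steps",
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    loss=loss,
)
trainer.train()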

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss
0.0015 100 13.3048
0.0029 200 13.1862
0.0044 300 12.8397
0.0058 400 12.0152
0.0073 500 11.4185
0.0087 600 10.7677
0.0102 700 9.8943
0.0116 800 9.4991
0.0131 900 9.1097
0.0145 1000 8.7566
0.0160 1100 8.5687
0.0174 1200 8.4785
0.0189 1300 8.4378
0.0203 1400 8.3864
0.0218 1500 8.3545
0.0232 1600 8.3495
0.0247 1700 8.3226
0.0261 1800 8.3071
0.0276 1900 8.2657
0.0290 2000 8.2359
0.0305 2100 8.2483
0.0319 2200 8.2237
0.0334 2300 8.1927
0.0348 2400 8.1974
0.0363 2500 8.1804
0.0377 2600 8.1663
0.0392 2700 8.1552
0.0406 2800 8.1536
0.0421 2900 8.132
0.0435 3000 8.1287
0.0450 3100 8.1068
0.0464 3200 8.0878
0.0479 3300 8.0974
0.0493 3400 8.085
0.0508 3500 8.0707
0.0522 3600 8.0684
0.0537 3700 8.0549
0.0551 3800 8.0311
0.0566 3900 8.0207
0.0580 4000 8.0163
0.0595 4100 8.0261
0.0609 4200 8.0119
0.0624 4300 7.9935
0.0638 4400 7.9862
0.0653 4500 7.9841
0.0667 4600 7.973
0.0682 4700 7.9396
0.0696 4800 7.9148
0.0711 4900 7.9454
0.0725 5000 7.9606
0.0740 5100 7.9254
0.0754 5200 7.898
0.0769 5300 7.8979
0.0783 5400 7.8754
0.0798 5500 7.9112
0.0812 5600 7.8664
0.0827 5700 7.8963
0.0841 5800 7.8563
0.0856 5900 7.8445
0.0870 6000 7.857
0.0885 6100 7.8432
0.0899 6200 7.8388
0.0914 6300 7.7983
0.0928 6400 7.8176
0.0943 6500 7.8265
0.0957 6600 7.8054
0.0972 6700 7.8091
0.0986 6800 7.7831
0.1001 6900 7.7593
0.1015 7000 7.8059
0.1030 7100 7.7663
0.1044 7200 7.7572
0.1059 7300 7.7801
0.1073 7400 7.7551
0.1088 7500 7.7223
0.1102 7600 7.758
0.1117 7700 7.7588
0.1131 7800 7.7418
0.1146 7900 7.7174
0.1160 8000 7.6791
0.1175 8100 7.6968
0.1189 8200 7.7013
0.1204 8300 7.6958
0.1218 8400 7.6967
0.1233 8500 7.6606
0.1247 8600 7.6977
0.1262 8700 7.6956
0.1276 8800 7.6707
0.1291 8900 7.6763
0.1305 9000 7.6413
0.1320 9100 7.6582
0.1334 9200 7.641
0.1349 9300 7.6999
0.1363 9400 7.6453
0.1378 9500 7.6351
0.1392 9600 7.64
0.1407 9700 7.6578
0.1421 9800 7.6345
0.1436 9900 7.6143
0.1450 10000 7.6331
0.1465 10100 7.5877
0.1479 10200 7.6273
0.1494 10300 7.5957
0.1508 10400 7.6315
0.1523 10500 7.5612
0.1537 10600 7.6246
0.1552 10700 7.5974
0.1566 10800 7.6075
0.1581 10900 7.5713
0.1595 11000 7.5874
0.1610 11100 7.5549
0.1624 11200 7.5743
0.1639 11300 7.5223
0.1653 11400 7.5496
0.1668 11500 7.5846
0.1682 11600 7.5876
0.1697 11700 7.5735
0.1711 11800 7.5434
0.1726 11900 7.5673
0.1740 12000 7.5181
0.1755 12100 7.5778
0.1769 12200 7.5557
0.1784 12300 7.5137
0.1798 12400 7.546
0.1813 12500 7.5375
0.1827 12600 7.5576
0.1842 12700 7.5075
0.1856 12800 7.5178
0.1871 12900 7.5254
0.1885 13000 7.5349
0.1900 13100 7.5193
0.1914 13200 7.5352
0.1929 13300 7.5146
0.1943 13400 7.5305
0.1958 13500 7.5202
0.1972 13600 7.4845
0.1987 13700 7.4707
0.2001 13800 7.5193
0.2016 13900 7.4866
0.2030 14000 7.4782
0.2045 14100 7.507
0.2059 14200 7.47
0.2074 14300 7.5295
0.2088 14400 7.5049
0.2103 14500 7.4738
0.2117 14600 7.4808
0.2132 14700 7.5044
0.2146 14800 7.4933
0.2161 14900 7.4481
0.2175 15000 7.4323
0.2190 15100 7.4591
0.2204 15200 7.4819
0.2219 15300 7.4255
0.2233 15400 7.4383
0.2248 15500 7.4858
0.2262 15600 7.4464
0.2277 15700 7.4639
0.2291 15800 7.4269
0.2306 15900 7.4219
0.2320 16000 7.4719
0.2335 16100 7.4446
0.2349 16200 7.4658
0.2364 16300 7.4489
0.2378 16400 7.4082
0.2393 16500 7.456
0.2407 16600 7.4113
0.2422 16700 7.4144
0.2436 16800 7.4304
0.2451 16900 7.4192
0.2465 17000 7.4019
0.2480 17100 7.4118
0.2494 17200 7.4348
0.2509 17300 7.4117
0.2523 17400 7.3867
0.2538 17500 7.4006
0.2552 17600 7.4335
0.2567 17700 7.404
0.2581 17800 7.3861
0.2596 17900 7.4109
0.2610 18000 7.417
0.2625 18100 7.3861
0.2639 18200 7.4313
0.2654 18300 7.4198
0.2668 18400 7.4177
0.2683 18500 7.4113
0.2697 18600 7.4214
0.2712 18700 7.3925
0.2726 18800 7.4241
0.2741 18900 7.3627
0.2755 19000 7.3991
0.2770 19100 7.3401
0.2784 19200 7.3592
0.2799 19300 7.3619
0.2813 19400 7.3968
0.2828 19500 7.3659
0.2842 19600 7.3553
0.2857 19700 7.4121
0.2871 19800 7.3858
0.2886 19900 7.4223
0.2900 20000 7.3699
0.2915 20100 7.3452
0.2929 20200 7.3673
0.2944 20300 7.3568
0.2958 20400 7.3753
0.2973 20500 7.375
0.2987 20600 7.3699
0.3002 20700 7.3537
0.3016 20800 7.3739
0.3031 20900 7.3218
0.3045 21000 7.3739
0.3060 21100 7.3781
0.3074 21200 7.3724
0.3089 21300 7.3396
0.3103 21400 7.3074
0.3118 21500 7.3135
0.3132 21600 7.4059
0.3147 21700 7.3887
0.3161 21800 7.2713
0.3176 21900 7.3444
0.3190 22000 7.3375
0.3205 22100 7.3426
0.3219 22200 7.3319
0.3234 22300 7.3323
0.3248 22400 7.3345
0.3263 22500 7.2884
0.3277 22600 7.285
0.3292 22700 7.3242
0.3306 22800 7.3198
0.3321 22900 7.2734
0.3335 23000 7.3444
0.3350 23100 7.3301
0.3364 23200 7.3302
0.3379 23300 7.3194
0.3393 23400 7.3118
0.3408 23500 7.3143
0.3422 23600 7.3177
0.3437 23700 7.3174
0.3451 23800 7.3054
0.3466 23900 7.3186
0.3480 24000 7.3204
0.3495 24100 7.3106
0.3509 24200 7.2729
0.3524 24300 7.2944
0.3538 24400 7.3143
0.3553 24500 7.3126
0.3567 24600 7.3428
0.3582 24700 7.3465
0.3596 24800 7.2775
0.3611 24900 7.3395
0.3625 25000 7.2664
0.3640 25100 7.3029
0.3654 25200 7.2996
0.3669 25300 7.297
0.3683 25400 7.2545
0.3698 25500 7.3155
0.3712 25600 7.3197
0.3727 25700 7.2667
0.3741 25800 7.2578
0.3756 25900 7.2993
0.3770 26000 7.2528
0.3785 26100 7.3429
0.3799 26200 7.3111
0.3814 26300 7.3344
0.3828 26400 7.2869
0.3843 26500 7.2636
0.3857 26600 7.3047
0.3872 26700 7.2681
0.3886 26800 7.297
0.3901 26900 7.2075
0.3915 27000 7.2792
0.3930 27100 7.2952
0.3944 27200 7.3172
0.3959 27300 7.3063
0.3973 27400 7.2786
0.3988 27500 7.2072
0.4002 27600 7.2589
0.4017 27700 7.2975
0.4031 27800 7.2228
0.4046 27900 7.2548
0.4061 28000 7.2572
0.4075 28100 7.2664
0.4090 28200 7.2913
0.4104 28300 7.3064
0.4119 28400 7.2726
0.4133 28500 7.2588
0.4148 28600 7.2715
0.4162 28700 7.275
0.4177 28800 7.2364
0.4191 28900 7.2308
0.4206 29000 7.3067
0.4220 29100 7.2369
0.4235 29200 7.2519
0.4249 29300 7.2358
0.4264 29400 7.244
0.4278 29500 7.2531
0.4293 29600 7.2268
0.4307 29700 7.2785
0.4322 29800 7.2429
0.4336 29900 7.2478
0.4351 30000 7.22
0.4365 30100 7.2125
0.4380 30200 7.2839
0.4394 30300 7.323
0.4409 30400 7.269
0.4423 30500 7.2373
0.4438 30600 7.3115
0.4452 30700 7.2345
0.4467 30800 7.2022
0.4481 30900 7.2235
0.4496 31000 7.207
0.4510 31100 7.2684
0.4525 31200 7.2309
0.4539 31300 7.2201
0.4554 31400 7.2645
0.4568 31500 7.2171
0.4583 31600 7.1958
0.4597 31700 7.2728
0.4612 31800 7.2175
0.4626 31900 7.1995
0.4641 32000 7.1677
0.4655 32100 7.2124
0.4670 32200 7.2163
0.4684 32300 7.1728
0.4699 32400 7.1941
0.4713 32500 7.1826
0.4728 32600 7.2542
0.4742 32700 7.1866
0.4757 32800 7.2511
0.4771 32900 7.2587
0.4786 33000 7.1963
0.4800 33100 7.213
0.4815 33200 7.2072
0.4829 33300 7.3
0.4844 33400 7.1509
0.4858 33500 7.1924
0.4873 33600 7.2083
0.4887 33700 7.2112
0.4902 33800 7.2109
0.4916 33900 7.1624
0.4931 34000 7.1807
0.4945 34100 7.1922
0.4960 34200 7.2215
0.4974 34300 7.2036
0.4989 34400 7.204
0.5003 34500 7.1716
0.5018 34600 7.2463
0.5032 34700 7.2446
0.5047 34800 7.2203
0.5061 34900 7.2153
0.5076 35000 7.153
0.5090 35100 7.2332
0.5105 35200 7.1656
0.5119 35300 7.142
0.5134 35400 7.2018
0.5148 35500 7.177
0.5163 35600 7.1974
0.5177 35700 7.2767
0.5192 35800 7.2503
0.5206 35900 7.1468
0.5221 36000 7.1862
0.5235 36100 7.2403
0.5250 36200 7.2375
0.5264 36300 7.1832
0.5279 36400 7.1365
0.5293 36500 7.2009
0.5308 36600 7.2533
0.5322 36700 7.2221
0.5337 36800 7.2076
0.5351 36900 7.185
0.5366 37000 7.2257
0.5380 37100 7.1446
0.5395 37200 7.2148
0.5409 37300 7.1769
0.5424 37400 7.1636
0.5438 37500 7.2075
0.5453 37600 7.2295
0.5467 37700 7.1181
0.5482 37800 7.1449
0.5496 37900 7.1355
0.5511 38000 7.1549
0.5525 38100 7.1616
0.5540 38200 7.1747
0.5554 38300 7.1871
0.5569 38400 7.1817
0.5583 38500 7.1495
0.5598 38600 7.2552
0.5612 38700 7.1916
0.5627 38800 7.226
0.5641 38900 7.1136
0.5656 39000 7.1555
0.5670 39100 7.1272
0.5685 39200 7.1998
0.5699 39300 7.1242
0.5714 39400 7.2167
0.5728 39500 7.1848
0.5743 39600 7.178
0.5757 39700 7.1347
0.5772 39800 7.1059
0.5786 39900 7.1004
0.5801 40000 7.1096
0.5815 40100 7.1272
0.5830 40200 7.2099
0.5844 40300 7.158
0.5859 40400 7.0955
0.5873 40500 7.1366
0.5888 40600 7.12
0.5902 40700 7.1694
0.5917 40800 7.1576
0.5931 40900 7.2108
0.5946 41000 7.112
0.5960 41100 7.184
0.5975 41200 7.1528
0.5989 41300 7.1737
0.6004 41400 7.2168
0.6018 41500 7.1676
0.6033 41600 7.17
0.6047 41700 7.077
0.6062 41800 7.1684
0.6076 41900 7.1425
0.6091 42000 7.1449
0.6105 42100 7.1325
0.6120 42200 7.195
0.6134 42300 7.159
0.6149 42400 7.179
0.6163 42500 7.1657
0.6178 42600 7.19
0.6192 42700 7.0956
0.6207 42800 7.0952
0.6221 42900 7.1905
0.6236 43000 7.1203
0.6250 43100 7.1293
0.6265 43200 7.1273
0.6279 43300 7.1374
0.6294 43400 7.1003
0.6308 43500 7.1644
0.6323 43600 7.1432
0.6337 43700 7.1237
0.6352 43800 7.1663
0.6366 43900 7.1404
0.6381 44000 7.1664
0.6395 44100 7.1906
0.6410 44200 7.1707
0.6424 44300 7.0727
0.6439 44400 7.1768
0.6453 44500 7.117
0.6468 44600 7.0683
0.6482 44700 7.0952
0.6497 44800 7.1768
0.6511 44900 7.1327
0.6526 45000 7.0887
0.6540 45100 7.1047
0.6555 45200 7.1983
0.6569 45300 7.0902
0.6584 45400 7.1647
0.6598 45500 7.1453
0.6613 45600 7.1043
0.6627 45700 7.0729
0.6642 45800 7.1508
0.6656 45900 7.1688
0.6671 46000 7.1259
0.6685 46100 7.1323
0.6700 46200 7.0916
0.6714 46300 7.1297
0.6729 46400 7.1138
0.6743 46500 7.129
0.6758 46600 7.1228
0.6772 46700 7.0995
0.6787 46800 7.1034
0.6801 46900 7.1085
0.6816 47000 7.1223
0.6830 47100 7.0934
0.6845 47200 7.0865
0.6859 47300 7.076
0.6874 47400 7.1252
0.6888 47500 7.1788
0.6903 47600 7.0807
0.6917 47700 7.1163
0.6932 47800 7.0631
0.6946 47900 7.1842
0.6961 48000 7.1084
0.6975 48100 7.0994
0.6990 48200 7.1163
0.7004 48300 7.0563
0.7019 48400 7.1086
0.7033 48500 7.1233
0.7048 48600 7.0724
0.7062 48700 7.095
0.7077 48800 7.1039
0.7091 48900 7.1138
0.7106 49000 7.1184
0.7120 49100 7.0557
0.7135 49200 7.1065
0.7149 49300 7.1455
0.7164 49400 7.0808
0.7178 49500 7.1734
0.7193 49600 7.1396
0.7207 49700 7.1184
0.7222 49800 7.0714
0.7236 49900 7.1617
0.7251 50000 7.1244
0.7265 50100 7.0367
0.7280 50200 7.1068
0.7294 50300 7.1152
0.7309 50400 7.0782
0.7323 50500 7.0937
0.7338 50600 7.0899
0.7352 50700 7.0212
0.7367 50800 7.0308
0.7381 50900 7.0298
0.7396 51000 7.094
0.7410 51100 7.1962
0.7425 51200 7.0881
0.7439 51300 7.0484
0.7454 51400 7.0536
0.7468 51500 7.1169
0.7483 51600 7.0975
0.7497 51700 7.1155
0.7512 51800 7.0659
0.7526 51900 7.0567
0.7541 52000 7.0942
0.7555 52100 7.1224
0.7570 52200 7.1349
0.7584 52300 7.0874
0.7599 52400 7.0636
0.7613 52500 7.1171
0.7628 52600 7.1359
0.7642 52700 7.1599
0.7657 52800 7.0568
0.7671 52900 7.0948
0.7686 53000 7.0982
0.7700 53100 7.0856
0.7715 53200 7.071
0.7729 53300 7.0562
0.7744 53400 7.066
0.7758 53500 7.119
0.7773 53600 7.1492
0.7787 53700 7.0461
0.7802 53800 7.1003
0.7816 53900 7.0984
0.7831 54000 7.0875
0.7845 54100 7.1238
0.7860 54200 7.0896
0.7874 54300 7.0569
0.7889 54400 7.0994
0.7903 54500 7.146
0.7918 54600 7.0819
0.7932 54700 7.1003
0.7947 54800 7.0744
0.7961 54900 7.1047
0.7976 55000 7.1147
0.7990 55100 7.0318
0.8005 55200 7.0548
0.8019 55300 7.1215
0.8034 55400 7.1124
0.8048 55500 7.1096
0.8063 55600 7.0467
0.8077 55700 7.0454
0.8092 55800 7.0824
0.8107 55900 7.075
0.8121 56000 7.0316
0.8136 56100 7.1231
0.8150 56200 7.0449
0.8165 56300 7.1556
0.8179 56400 7.1036
0.8194 56500 6.9778
0.8208 56600 7.0649
0.8223 56700 7.1124
0.8237 56800 7.0294
0.8252 56900 7.1424
0.8266 57000 7.0891
0.8281 57100 7.0899
0.8295 57200 7.1276
0.8310 57300 7.0621
0.8324 57400 7.0253
0.8339 57500 7.0532
0.8353 57600 7.0171
0.8368 57700 7.0731
0.8382 57800 7.0837
0.8397 57900 7.0778
0.8411 58000 7.0478
0.8426 58100 7.0946
0.8440 58200 7.0772
0.8455 58300 7.0556
0.8469 58400 7.0451
0.8484 58500 7.0708
0.8498 58600 7.0881
0.8513 58700 7.0783
0.8527 58800 7.0632
0.8542 58900 7.0175
0.8556 59000 7.0248
0.8571 59100 7.0781
0.8585 59200 7.0403
0.8600 59300 7.1009
0.8614 59400 7.0367
0.8629 59500 7.0728
0.8643 59600 7.0292
0.8658 59700 7.027
0.8672 59800 7.0737
0.8687 59900 7.0514
0.8701 60000 7.033
0.8716 60100 6.9728
0.8730 60200 7.0672
0.8745 60300 7.1248
0.8759 60400 7.1127
0.8774 60500 7.0061
0.8788 60600 7.0643
0.8803 60700 7.0996
0.8817 60800 7.0991
0.8832 60900 7.05
0.8846 61000 7.0198
0.8861 61100 7.0489
0.8875 61200 7.0778
0.8890 61300 7.1017
0.8904 61400 7.0815
0.8919 61500 7.0565
0.8933 61600 7.0187
0.8948 61700 7.0809
0.8962 61800 7.0219
0.8977 61900 7.0343
0.8991 62000 7.1052
0.9006 62100 7.0448
0.9020 62200 7.0598
0.9035 62300 7.0498
0.9049 62400 7.1148
0.9064 62500 7.048
0.9078 62600 7.1137
0.9093 62700 7.0538
0.9107 62800 7.089
0.9122 62900 7.1247
0.9136 63000 6.9895
0.9151 63100 7.0364
0.9165 63200 7.1219
0.9180 63300 7.0399
0.9194 63400 7.0669
0.9209 63500 7.1429
0.9223 63600 7.1196
0.9238 63700 7.0336
0.9252 63800 7.0792
0.9267 63900 7.0935
0.9281 64000 7.1163
0.9296 64100 7.0483
0.9310 64200 7.0632
0.9325 64300 6.9877
0.9339 64400 6.9545
0.9354 64500 7.047
0.9368 64600 7.0755
0.9383 64700 6.9931
0.9397 64800 7.0
0.9412 64900 7.0293
0.9426 65000 7.0471
0.9441 65100 7.0567
0.9455 65200 7.0853
0.9470 65300 7.0801
0.9484 65400 7.0202
0.9499 65500 7.1004
0.9513 65600 7.1051
0.9528 65700 7.0403
0.9542 65800 7.003
0.9557 65900 7.0495
0.9571 66000 6.9828
0.9586 66100 7.0453
0.9600 66200 7.0196
0.9615 66300 7.1254
0.9629 66400 7.0849
0.9644 66500 7.0223
0.9658 66600 7.1217
0.9673 66700 6.9922
0.9687 66800 7.025
0.9702 66900 7.0705
0.9716 67000 7.0794
0.9731 67100 7.0162
0.9745 67200 7.0855
0.9760 67300 7.0078
0.9774 67400 7.0908
0.9789 67500 7.0663
0.9803 67600 7.0175
0.9818 67700 7.0059
0.9832 67800 7.0673
0.9847 67900 7.0495
0.9861 68000 7.005
0.9876 68100 7.0198
0.9890 68200 7.0515
0.9905 68300 7.0274
0.9919 68400 7.1286
0.9934 68500 7.0913
0.9948 68600 7.0087
0.9963 68700 7.0341
0.9977 68800 7.046
0.9992 68900 7.057

Framework Versions

  • Python: 3.12.3
  • Sentence Transformers: 5.1.0
  • Transformers: 4.55.4
  • PyTorch: 2.5.1+cu121
  • Accelerate: 1.10.1
  • Datasets: 4.0.0
  • Tokenizers: 0.21.4

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CoSENTLoss

@online{kexuefm-8847,
    title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
    author={Su Jianlin},
    year={2022},
    month={Jan},
    url={https://kexue.fm/archives/8847},
}