all-MiniLM-L6-v38-pair_score

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L6-v2 on the pairs_with_scores_v32 dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Model Size: 22.7M parameters (F32)
  • Training Dataset: pairs_with_scores_v32

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
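
These three modules form the standard all-MiniLM-L6-v2 pipeline: a BERT encoder, mean pooling over non-padding tokens, and L2 normalization (so a dot product between embeddings equals their cosine similarity). As a minimal sketch of what the modules compute, here is the same pipeline written against the raw transformers API; it assumes the checkpoint is available on the Hub under the repository name used in the Usage section below.

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

model_id = "KhaledReda/all-MiniLM-L6-v38-pair_score"
tokenizer = AutoTokenizer.from_pretrained(model_id)
encoder = AutoModel.from_pretrained(model_id)

texts = ["classic shoes", "polynomial equations calculator"]
# (0) Transformer: tokenize with max_seq_length=256 and encode
batch = tokenizer(texts, padding=True, truncation=True, max_length=256, return_tensors="pt")
with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state  # [batch, seq_len, 384]

# (1) Pooling: mean over tokens, ignoring padding via the attention mask
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

# (2) Normalize: unit-length vectors, so dot product == cosine similarity
embeddings = F.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)  # torch.Size([2, 384])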

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("KhaledReda/all-MiniLM-L6-v38-pair_score")
# Run inference
sentences = [
    'classic shoes',
    'forever skin cleansing device, silicone ultrasonic facial cleanser facial electric cleanser ultrasonic face wash brush mini sonic face brush electric face cleanser facial cleansing tool forever skin device skin cleansing device, electric face cleanser facial cleansing tool forever skin device skin cleansing device, forever silicone ultrasonic facial cleanser face wash brush facial electric cleanser all skin type - forever offers all the benefits of deep cleansing in one compact palm-sized device. the t-sonic pulsations deliver the unique ability to remove 99.5 of dirt and oil as well as makeup residue and dead skin cells and exfoliate without irritating the skin. just 1 minute of use twice daily cleanses and transforms the skin by removing blemish-causing impurities. the mini sonic face brush is made from highly durable body-safe hypoallergenic silicone and is non-porous to resist bacteria build-up making it 35 x more hygienic than nylon-bristled brushes and never requiring any replacement brush heads. lightweight completely waterproof for use in the bath or shower and with 2 speed settings the mini is designed around your life with each full charge lasting up to 300 uses. specification type skin cleansing exfoliation. system power source battery. brand forever. package 1 x forever silicone ultrasonic facial cleanser.',
    'polynomial equations calculator',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.4272, 0.5532],
#         [0.4272, 1.0000, 0.5415],
#         [0.5532, 0.5415, 1.0000]])
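
Because the Normalize() module makes every embedding unit-length, the model also works directly for semantic search over a larger corpus. A minimal sketch using util.semantic_search, with a hypothetical product-title corpus standing in for real data:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("KhaledReda/all-MiniLM-L6-v38-pair_score")

# Hypothetical corpus of product titles
corpus = [
    "classic leather shoes for men",
    "silicone ultrasonic facial cleanser",
    "scientific calculator with equation solver",
]
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode("classic shoes", convert_to_tensor=True)

# Retrieve the top-k corpus entries by cosine similarity
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)
for hit in hits[0]:
    print(f"{hit['score']:.4f}  {corpus[hit['corpus_id']]}")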

Training Details

Training Dataset

pairs_with_scores_v32

  • Dataset: pairs_with_scores_v32 at d05ef20
  • Size: 43,059,870 training samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1: string; min: 3 tokens, mean: 6.09 tokens, max: 35 tokens
    • sentence2: string; min: 3 tokens, mean: 41.16 tokens, max: 256 tokens
    • score: float; min: 0.0, mean: 0.28, max: 1.0
  • Samples:
    • sentence1: ovenware
      sentence2: linen shirt with button down and stand up collar, men shirt long sleeves shirt men s tops button down shirt linen shirt shirt stand up collar shirt, button down shirt linen shirt shirt stand up collar shirt, gender men mix and match generic shirt s types of fashion styles casual neckline stand up collar closure style button down sleeve style long sleeves fit regular fit linen white solid occasion casual season spring summer, linen shirt with button down long sleeves and stand up collar
      score: 0.0
    • sentence1: fries antipastoes
      sentence2: tealight candle holder, home and garden home decor home decor accessory home decor accessory, rings organizer coins organizer ceramic powder holder paper holder sand holder candle holder holder home decor tealight candle holder, candle holder holder home decor tealight candle holder, create a cozy atmosphere with this tealight candle holder. not just for candles this compact holder doubles as a convenient organizer for small items like rings coins or office supplies. all our products are made of our own mixture of ceramic powder paper sand and other sustainable materials to ensure its strength and sustainability. weight 120 gm
      score: 0.0
    • sentence1: adults bikes hybrid
      sentence2: sea salt body exfoliate and polish
      score: 0.0
  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
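
CoSENTLoss consumes (sentence1, sentence2) pairs with a float similarity score and optimizes the ranking of pairwise cosine similarities to match the ranking of the gold scores. A minimal sketch of how such a dataset is paired with the loss (the toy rows are taken from the samples in this card; the actual run used the full 43M-row dataset):

from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import CoSENTLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
train_dataset = Dataset.from_dict({
    "sentence1": ["camel bag", "scrunchie"],
    "sentence2": ["camel tank top", "spoiled babe set"],
    "score": [0.25, 0.75],
})
# scale=20.0 and pairwise cosine similarity match the parameters listed above
loss = CoSENTLoss(model, scale=20.0)

trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()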
    

Evaluation Dataset

pairs_with_scores_v32

  • Dataset: pairs_with_scores_v32 at d05ef20
  • Size: 216,382 evaluation samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1: string; min: 3 tokens, mean: 6.12 tokens, max: 31 tokens
    • sentence2: string; min: 3 tokens, mean: 41.67 tokens, max: 256 tokens
    • score: float; min: 0.0, mean: 0.27, max: 1.0
  • Samples:
    • sentence1: cheese sauce fries
      sentence2: printed cotton scarf with fabric tassels, women scarf voile scarf shawls fabric scarf printed scarf scarf tassels scarf, fabric scarf printed scarf scarf tassels scarf, gender women mix and match generic scarf cotton black printed, printed voile scarf with fabric tassels
      score: 0.0
    • sentence1: camel bag
      sentence2: camel tank top
      score: 0.25
    • sentence1: scrunchie
      sentence2: spoiled babe set
      score: 0.75
  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
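
Held-out (sentence1, sentence2, score) rows like these can be scored with EmbeddingSimilarityEvaluator, which reports Pearson and Spearman correlations between the model's cosine similarities and the gold scores. A minimal sketch reusing the toy sample rows above:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("KhaledReda/all-MiniLM-L6-v38-pair_score")
evaluator = EmbeddingSimilarityEvaluator(
    sentences1=["camel bag", "scrunchie"],
    sentences2=["camel tank top", "spoiled babe set"],
    scores=[0.25, 0.75],
    name="pairs_with_scores_v32-dev",  # hypothetical evaluator name
)
print(evaluator(model))  # Pearson/Spearman correlations keyed by similarity function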
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • fp16: True
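
As a sketch, these non-default values map one-to-one onto SentenceTransformerTrainingArguments (the output_dir below is hypothetical):

from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="all-MiniLM-L6-v38-pair_score",  # hypothetical output path
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=2e-5,
    num_train_epochs=1,
    warmup_ratio=0.1,
    fp16=True,
)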

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss
0.8505 286100 6.328
0.8508 286200 6.4337
0.8511 286300 6.3625
0.8514 286400 6.3524
0.8516 286500 6.324
0.8519 286600 6.3453
0.8522 286700 6.4266
0.8525 286800 6.3666
0.8528 286900 6.376
0.8531 287000 6.396
0.8534 287100 6.3725
0.8537 287200 6.3696
0.8540 287300 6.4024
0.8543 287400 6.3841
0.8546 287500 6.3344
0.8549 287600 6.4528
0.8552 287700 6.4161
0.8555 287800 6.3852
0.8558 287900 6.3908
0.8561 288000 6.3747
0.8564 288100 6.3385
0.8567 288200 6.3625
0.8570 288300 6.4054
0.8573 288400 6.3758
0.8576 288500 6.3604
0.8579 288600 6.3866
0.8582 288700 6.4301
0.8585 288800 6.4232
0.8588 288900 6.3781
0.8591 289000 6.4106
0.8594 289100 6.3579
0.8597 289200 6.3691
0.8600 289300 6.4222
0.8603 289400 6.3994
0.8606 289500 6.3615
0.8609 289600 6.406
0.8612 289700 6.3942
0.8615 289800 6.3811
0.8618 289900 6.3702
0.8621 290000 6.3925
0.8624 290100 6.4173
0.8626 290200 6.4267
0.8629 290300 6.3989
0.8632 290400 6.3715
0.8635 290500 6.3582
0.8638 290600 6.3659
0.8641 290700 6.3671
0.8644 290800 6.3837
0.8647 290900 6.4486
0.8650 291000 6.3993
0.8653 291100 6.3985
0.8656 291200 6.3982
0.8659 291300 6.3297
0.8662 291400 6.3726
0.8665 291500 6.3452
0.8668 291600 6.3704
0.8671 291700 6.3002
0.8674 291800 6.4093
0.8677 291900 6.4129
0.8680 292000 6.4081
0.8683 292100 6.4361
0.8686 292200 6.4205
0.8689 292300 6.4255
0.8692 292400 6.4122
0.8695 292500 6.4621
0.8698 292600 6.364
0.8701 292700 6.4073
0.8704 292800 6.3409
0.8707 292900 6.3107
0.8710 293000 6.3727
0.8713 293100 6.3447
0.8716 293200 6.4191
0.8719 293300 6.3492
0.8722 293400 6.3553
0.8725 293500 6.3768
0.8728 293600 6.3378
0.8731 293700 6.3998
0.8733 293800 6.438
0.8736 293900 6.34
0.8739 294000 6.4061
0.8742 294100 6.4552
0.8745 294200 6.2997
0.8748 294300 6.4018
0.8751 294400 6.412
0.8754 294500 6.3756
0.8757 294600 6.3983
0.8760 294700 6.3758
0.8763 294800 6.3707
0.8766 294900 6.3802
0.8769 295000 6.3767
0.8772 295100 6.4037
0.8775 295200 6.3425
0.8778 295300 6.3655
0.8781 295400 6.4575
0.8784 295500 6.4242
0.8787 295600 6.365
0.8790 295700 6.373
0.8793 295800 6.3766
0.8796 295900 6.3835
0.8799 296000 6.4327
0.8802 296100 6.3799
0.8805 296200 6.41
0.8808 296300 6.3092
0.8811 296400 6.4133
0.8814 296500 6.3952
0.8817 296600 6.3937
0.8820 296700 6.3204
0.8823 296800 6.4072
0.8826 296900 6.3577
0.8829 297000 6.3966
0.8832 297100 6.3906
0.8835 297200 6.3871
0.8838 297300 6.3546
0.8841 297400 6.3874
0.8843 297500 6.4042
0.8846 297600 6.3963
0.8849 297700 6.3708
0.8852 297800 6.3269
0.8855 297900 6.3554
0.8858 298000 6.3884
0.8861 298100 6.3645
0.8864 298200 6.4203
0.8867 298300 6.3827
0.8870 298400 6.3947
0.8873 298500 6.3989
0.8876 298600 6.3454
0.8879 298700 6.4956
0.8882 298800 6.3975
0.8885 298900 6.3643
0.8888 299000 6.3606
0.8891 299100 6.4184
0.8894 299200 6.3975
0.8897 299300 6.3836
0.8900 299400 6.3696
0.8903 299500 6.3567
0.8906 299600 6.3142
0.8909 299700 6.3703
0.8912 299800 6.3126
0.8915 299900 6.3847
0.8918 300000 6.3761
0.8921 300100 6.3673
0.8924 300200 6.3426
0.8927 300300 6.4366
0.8930 300400 6.3626
0.8933 300500 6.3549
0.8936 300600 6.3696
0.8939 300700 6.4061
0.8942 300800 6.4622
0.8945 300900 6.3447
0.8948 301000 6.386
0.8950 301100 6.3719
0.8953 301200 6.4033
0.8956 301300 6.3635
0.8959 301400 6.3179
0.8962 301500 6.3273
0.8965 301600 6.4156
0.8968 301700 6.3601
0.8971 301800 6.3754
0.8974 301900 6.4151
0.8977 302000 6.3435
0.8980 302100 6.3745
0.8983 302200 6.3563
0.8986 302300 6.3999
0.8989 302400 6.349
0.8992 302500 6.3886
0.8995 302600 6.387
0.8998 302700 6.3786
0.9001 302800 6.4126
0.9004 302900 6.3439
0.9007 303000 6.3376
0.9010 303100 6.3512
0.9013 303200 6.4281
0.9016 303300 6.3999
0.9019 303400 6.3757
0.9022 303500 6.3297
0.9025 303600 6.4042
0.9028 303700 6.3001
0.9031 303800 6.3028
0.9034 303900 6.3969
0.9037 304000 6.2983
0.9040 304100 6.3043
0.9043 304200 6.4063
0.9046 304300 6.3829
0.9049 304400 6.3786
0.9052 304500 6.4584
0.9055 304600 6.4324
0.9058 304700 6.4425
0.9060 304800 6.3995
0.9063 304900 6.3952
0.9066 305000 6.4232
0.9069 305100 6.3573
0.9072 305200 6.3585
0.9075 305300 6.4424
0.9078 305400 6.2995
0.9081 305500 6.3571
0.9084 305600 6.3175
0.9087 305700 6.3624
0.9090 305800 6.3954
0.9093 305900 6.4152
0.9096 306000 6.4059
0.9099 306100 6.4016
0.9102 306200 6.3976
0.9105 306300 6.3498
0.9108 306400 6.3638
0.9111 306500 6.4264
0.9114 306600 6.3982
0.9117 306700 6.3428
0.9120 306800 6.3601
0.9123 306900 6.3875
0.9126 307000 6.4401
0.9129 307100 6.3931
0.9132 307200 6.3875
0.9135 307300 6.3293
0.9138 307400 6.3539
0.9141 307500 6.3619
0.9144 307600 6.364
0.9147 307700 6.4567
0.9150 307800 6.393
0.9153 307900 6.4153
0.9156 308000 6.3644
0.9159 308100 6.3899
0.9162 308200 6.3986
0.9165 308300 6.3766
0.9167 308400 6.4279
0.9170 308500 6.3578
0.9173 308600 6.3891
0.9176 308700 6.3029
0.9179 308800 6.3688
0.9182 308900 6.3787
0.9185 309000 6.3935
0.9188 309100 6.4319
0.9191 309200 6.2945
0.9194 309300 6.3871
0.9197 309400 6.3338
0.9200 309500 6.3654
0.9203 309600 6.4207
0.9206 309700 6.3809
0.9209 309800 6.3798
0.9212 309900 6.3974
0.9215 310000 6.334
0.9218 310100 6.376
0.9221 310200 6.3939
0.9224 310300 6.4144
0.9227 310400 6.4375
0.9230 310500 6.316
0.9233 310600 6.3346
0.9236 310700 6.3766
0.9239 310800 6.3564
0.9242 310900 6.3643
0.9245 311000 6.3627
0.9248 311100 6.4283
0.9251 311200 6.3179
0.9254 311300 6.4113
0.9257 311400 6.3703
0.9260 311500 6.3388
0.9263 311600 6.3997
0.9266 311700 6.3813
0.9269 311800 6.3723
0.9272 311900 6.3556
0.9275 312000 6.3522
0.9277 312100 6.3661
0.9280 312200 6.405
0.9283 312300 6.4031
0.9286 312400 6.4125
0.9289 312500 6.3225
0.9292 312600 6.3887
0.9295 312700 6.3368
0.9298 312800 6.3323
0.9301 312900 6.4433
0.9304 313000 6.4155
0.9307 313100 6.3448
0.9310 313200 6.3775
0.9313 313300 6.3736
0.9316 313400 6.3611
0.9319 313500 6.3988
0.9322 313600 6.3243
0.9325 313700 6.4137
0.9328 313800 6.3663
0.9331 313900 6.3742
0.9334 314000 6.4021
0.9337 314100 6.4171
0.9340 314200 6.3948
0.9343 314300 6.3916
0.9346 314400 6.365
0.9349 314500 6.3479
0.9352 314600 6.3588
0.9355 314700 6.3247
0.9358 314800 6.3584
0.9361 314900 6.3436
0.9364 315000 6.3958
0.9367 315100 6.3424
0.9370 315200 6.3814
0.9373 315300 6.3612
0.9376 315400 6.3889
0.9379 315500 6.3591
0.9382 315600 6.3856
0.9384 315700 6.3594
0.9387 315800 6.3737
0.9390 315900 6.4489
0.9393 316000 6.2902
0.9396 316100 6.3517
0.9399 316200 6.4662
0.9402 316300 6.3684
0.9405 316400 6.362
0.9408 316500 6.3492
0.9411 316600 6.4018
0.9414 316700 6.3709
0.9417 316800 6.4048
0.9420 316900 6.3547
0.9423 317000 6.2638
0.9426 317100 6.435
0.9429 317200 6.4028
0.9432 317300 6.39
0.9435 317400 6.3688
0.9438 317500 6.3801
0.9441 317600 6.3609
0.9444 317700 6.3583
0.9447 317800 6.3339
0.9450 317900 6.3804
0.9453 318000 6.3718
0.9456 318100 6.3434
0.9459 318200 6.3765
0.9462 318300 6.3468
0.9465 318400 6.3253
0.9468 318500 6.3868
0.9471 318600 6.3906
0.9474 318700 6.4371
0.9477 318800 6.3737
0.9480 318900 6.3332
0.9483 319000 6.3698
0.9486 319100 6.3748
0.9489 319200 6.4309
0.9492 319300 6.3757
0.9494 319400 6.3615
0.9497 319500 6.366
0.9500 319600 6.3574
0.9503 319700 6.3742
0.9506 319800 6.3461
0.9509 319900 6.3063
0.9512 320000 6.3504
0.9515 320100 6.4292
0.9518 320200 6.3603
0.9521 320300 6.3664
0.9524 320400 6.4065
0.9527 320500 6.3696
0.9530 320600 6.4512
0.9533 320700 6.3765
0.9536 320800 6.319
0.9539 320900 6.3873
0.9542 321000 6.4429
0.9545 321100 6.4334
0.9548 321200 6.3168
0.9551 321300 6.4112
0.9554 321400 6.4135
0.9557 321500 6.3718
0.9560 321600 6.393
0.9563 321700 6.331
0.9566 321800 6.3811
0.9569 321900 6.3748
0.9572 322000 6.4013
0.9575 322100 6.3281
0.9578 322200 6.3634
0.9581 322300 6.3473
0.9584 322400 6.3429
0.9587 322500 6.3837
0.9590 322600 6.3855
0.9593 322700 6.3825
0.9596 322800 6.4182
0.9599 322900 6.3611
0.9601 323000 6.4276
0.9604 323100 6.3329
0.9607 323200 6.3764
0.9610 323300 6.3382
0.9613 323400 6.3084
0.9616 323500 6.3884
0.9619 323600 6.3733
0.9622 323700 6.3145
0.9625 323800 6.4082
0.9628 323900 6.2616
0.9631 324000 6.3564
0.9634 324100 6.4159
0.9637 324200 6.3898
0.9640 324300 6.3522
0.9643 324400 6.3905
0.9646 324500 6.3628
0.9649 324600 6.3219
0.9652 324700 6.4094
0.9655 324800 6.4043
0.9658 324900 6.405
0.9661 325000 6.3272
0.9664 325100 6.3852
0.9667 325200 6.4279
0.9670 325300 6.385
0.9673 325400 6.432
0.9676 325500 6.4317
0.9679 325600 6.3754
0.9682 325700 6.4305
0.9685 325800 6.313
0.9688 325900 6.3338
0.9691 326000 6.4271
0.9694 326100 6.4092
0.9697 326200 6.3071
0.9700 326300 6.3712
0.9703 326400 6.3486
0.9706 326500 6.3041
0.9709 326600 6.3464
0.9711 326700 6.3351
0.9714 326800 6.3166
0.9717 326900 6.3343
0.9720 327000 6.403
0.9723 327100 6.3923
0.9726 327200 6.4203
0.9729 327300 6.3716
0.9732 327400 6.3341
0.9735 327500 6.3253
0.9738 327600 6.3648
0.9741 327700 6.4148
0.9744 327800 6.3431
0.9747 327900 6.3149
0.9750 328000 6.3697
0.9753 328100 6.3777
0.9756 328200 6.3446
0.9759 328300 6.3484
0.9762 328400 6.3118
0.9765 328500 6.3657
0.9768 328600 6.4045
0.9771 328700 6.3776
0.9774 328800 6.3609
0.9777 328900 6.3024
0.9780 329000 6.4298
0.9783 329100 6.3598
0.9786 329200 6.3555
0.9789 329300 6.3915
0.9792 329400 6.3807
0.9795 329500 6.2983
0.9798 329600 6.371
0.9801 329700 6.3647
0.9804 329800 6.3892
0.9807 329900 6.3543
0.9810 330000 6.4178
0.9813 330100 6.3228
0.9816 330200 6.3684
0.9818 330300 6.3711
0.9821 330400 6.3717
0.9824 330500 6.3976
0.9827 330600 6.3483
0.9830 330700 6.335
0.9833 330800 6.385
0.9836 330900 6.3772
0.9839 331000 6.3027
0.9842 331100 6.3634
0.9845 331200 6.3261
0.9848 331300 6.3708
0.9851 331400 6.3993
0.9854 331500 6.3759
0.9857 331600 6.3485
0.9860 331700 6.3717
0.9863 331800 6.3776
0.9866 331900 6.4366
0.9869 332000 6.4023
0.9872 332100 6.3978
0.9875 332200 6.3382
0.9878 332300 6.3474
0.9881 332400 6.4122
0.9884 332500 6.3809
0.9887 332600 6.322
0.9890 332700 6.344
0.9893 332800 6.2637
0.9896 332900 6.4016
0.9899 333000 6.3826
0.9902 333100 6.4467
0.9905 333200 6.4596
0.9908 333300 6.3065
0.9911 333400 6.4057
0.9914 333500 6.435
0.9917 333600 6.3398
0.9920 333700 6.3741
0.9923 333800 6.3069
0.9926 333900 6.3457
0.9928 334000 6.3884
0.9931 334100 6.4078
0.9934 334200 6.3242
0.9937 334300 6.3621
0.9940 334400 6.3515
0.9943 334500 6.4017
0.9946 334600 6.4629
0.9949 334700 6.3686
0.9952 334800 6.3224
0.9955 334900 6.386
0.9958 335000 6.3899
0.9961 335100 6.3488
0.9964 335200 6.4117
0.9967 335300 6.3988
0.9970 335400 6.3536
0.9973 335500 6.3861
0.9976 335600 6.3383
0.9979 335700 6.3848
0.9982 335800 6.4582
0.9985 335900 6.3452
0.9988 336000 6.3651
0.9991 336100 6.3704
0.9994 336200 6.3801
0.9997 336300 6.3701
1.0000 336400 6.4452

Framework Versions

  • Python: 3.12.3
  • Sentence Transformers: 5.1.0
  • Transformers: 4.55.4
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.10.1
  • Datasets: 4.0.0
  • Tokenizers: 0.21.4

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CoSENTLoss

@online{kexuefm-8847,
    title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
    author={Su Jianlin},
    year={2022},
    month={Jan},
    url={https://kexue.fm/archives/8847},
}