all-MiniLM-L6-v36-pair_score

This is a sentence-transformers model fine-tuned from KhaledReda/all-MiniLM-L6-v36-pair_score on the pairs_three_ways_noise_reduction_v1 dataset. It maps sentences and paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: KhaledReda/all-MiniLM-L6-v36-pair_score
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset: pairs_three_ways_noise_reduction_v1

Model Sources

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
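
Because the final Normalize() module scales every embedding to unit length, cosine similarity between two stored embeddings reduces to a plain dot product. A minimal numpy sketch with random stand-in vectors (not real model output) illustrates the equivalence:

```python
import numpy as np

# Cosine similarity from raw (unnormalized) vectors.
def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
a, b = rng.normal(size=384), rng.normal(size=384)   # stand-ins for embeddings
a_unit, b_unit = a / np.linalg.norm(a), b / np.linalg.norm(b)

# After L2-normalization, the dot product equals the cosine similarity
# up to floating-point error.
print(abs(cosine(a, b) - float(np.dot(a_unit, b_unit))) < 1e-9)   # True
```

This is why downstream consumers of this model can score pairs with a fast matrix multiply instead of a full cosine computation.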

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")  # replace with this model's Hub repository ID
# Run inference
sentences = [
    'spr 2 ex.new',
    'midi length dress - navy blue color navy blue size midi 65 cotton 35 polyester.',
    'cupcakes christmas per piece.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[ 1.0000, -0.2703, -0.1048],
#         [-0.2703,  1.0000, -0.0432],
#         [-0.1048, -0.0432,  1.0000]])
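
Since the embeddings are already L2-normalized, semantic search over a corpus is a matrix product followed by a sort. A small numpy sketch of ranking a corpus against a query; the vectors below are toy stand-ins for `model.encode(...)` output, not real embeddings:

```python
import numpy as np

def top_k(query_emb, corpus_embs, k=2):
    # With unit-length embeddings, the dot product is the cosine similarity,
    # so sorting by score ranks documents by semantic similarity.
    scores = corpus_embs @ query_emb
    order = np.argsort(-scores)[:k]
    return [(int(i), float(scores[i])) for i in order]

# Toy stand-ins for encoded documents and a query (hypothetical data).
corpus = np.eye(4)[:3]                  # three orthogonal unit "documents"
query = np.array([0.8, 0.6, 0.0, 0.0])  # already unit length

print(top_k(query, corpus))             # document 0 ranks first, then document 1
```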

Training Details

Training Dataset

pairs_three_ways_noise_reduction_v1

  • Dataset: pairs_three_ways_noise_reduction_v1 at 363acad
  • Size: 19,764,079 training samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1 (string): min 3 tokens, mean 5.32 tokens, max 17 tokens
    • sentence2 (string): min 3 tokens, mean 64.26 tokens, max 225 tokens
    • score (float): min 0.0, mean 0.05, max 1.0
  • Samples:
    sentence1 sentence2 score
    qahwa dahlia ruffle 2 piece swimsuit gender women brand rafeya generic name swimsuit product name dahlia size xs types of fashion styles beachwear cut two pieces ruffled units 2 piece 0.0
    20/tab ex.new woof fetch dog food turkey - small size small spoil your dog with the delectable taste of woof fetch turkey. packed with quality ingredients it s a savory and nutritious meal for your cherished companion. 0.0
    bath linen run support women s running shoes green gender women brand decathlon generic name shoes size 3 36 features cushioning support color green activity running target group women sport running our design teams developed these supportive women s shoes with cushioning for running up to 10 km per week. with cushioning and special laces for support these women s running shoes are safe and comfortable on the road and trails. 0.0
  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
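
CoSENTLoss penalizes every pair of training examples whose cosine similarities are ordered differently from their gold scores, amplified by the scale=20.0 factor above. A minimal numpy sketch of that formula follows; it is an illustration of the loss, not the library's implementation:

```python
import numpy as np

def cosent_loss(cos_sims, labels, scale=20.0):
    # Sum exp(scale * (sim_j - sim_i)) over every pair (i, j) whose gold
    # scores satisfy label_i > label_j: the loss grows whenever a pair that
    # should score higher is ranked below one that should score lower.
    terms = [
        np.exp(scale * (cos_sims[j] - cos_sims[i]))
        for i, si in enumerate(labels)
        for j, sj in enumerate(labels)
        if si > sj
    ]
    return float(np.log1p(np.sum(terms)))

# Correctly ordered predictions give a much smaller loss than reversed ones.
good = cosent_loss([0.9, 0.1], [1.0, 0.0])
bad = cosent_loss([0.1, 0.9], [1.0, 0.0])
print(good < bad)   # True
```

Because only relative ordering matters, this loss suits the graded 0.0 to 1.0 scores in this dataset better than a pure regression objective.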
    

Evaluation Dataset

pairs_three_ways_noise_reduction_v1

  • Dataset: pairs_three_ways_noise_reduction_v1 at 363acad
  • Size: 99,317 evaluation samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
    • sentence1 (string): min 3 tokens, mean 5.45 tokens, max 16 tokens
    • sentence2 (string): min 3 tokens, mean 66.96 tokens, max 239 tokens
    • score (float): min 0.0, mean 0.04, max 1.0
  • Samples:
    sentence1 sentence2 score
    lines pot wild legging extension gender women age we didn t want you to have to purchase multiple leggings to go with your different rashguards when you don t need that many. so we designed these legging-extensions that you can wear over your leggings to add coherence in your outfit and color into your basic black leggings it s a simple and creative way to create so many outfits easily. you must size down the legging extensions not like the size of the leggings. 0.0
    blended nan matcha pink and baby blue bracelet color pink blue target group baby bracelet inside material copper silver plated. 0.0
    bunnzy outfit nourish hair therapy energizing hydrating hair treatment with peppermint rosemary and lavender essential oils. the botanical plant oil blend nourishes and conditions hair and scalp while helping to maintain the proper oil balance to promote softer more manageable hair. the essential oil blend has many benefits for the hair and scalp peppermint oil gently stimulates the scalp and helps hair receive proper nourishment to grow. lavender oil is soothing and calming for hair and helps to balance oil production to replenish hair and scalp. rosemary oil works well to remedy dry hair and itchy scalp. for weekly intense hydrating treatment massage into scalp and hair from roots to ends. wrap hair in a thin tow 0.0
  • Loss: CoSENTLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
  • fp16: True
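
With warmup_ratio: 0.1 and the default linear scheduler, the learning rate climbs linearly from 0 to 2e-05 over the first 10% of steps, then decays linearly back to 0. A pure-Python sketch of that schedule; the function name and signature are illustrative, not the trainer's API:

```python
def lr_at(step, total_steps, peak_lr=2e-5, warmup_ratio=0.1):
    # Linear warmup over the first warmup_ratio of training, then
    # linear decay to zero, mirroring lr_scheduler_type=linear.
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (total_steps - step) / (total_steps - warmup_steps)

print(lr_at(50, 1000))    # halfway through warmup
print(lr_at(100, 1000))   # peak learning rate
print(lr_at(1000, 1000))  # end of training
```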

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss
0.0006 100 0.259
0.0013 200 0.279
0.0019 300 0.2344
0.0026 400 0.3457
0.0032 500 0.2702
0.0039 600 0.3346
0.0045 700 0.223
0.0052 800 0.1661
0.0058 900 0.296
0.0065 1000 0.344
0.0071 1100 0.2829
0.0078 1200 0.4639
0.0084 1300 0.3604
0.0091 1400 0.1457
0.0097 1500 0.0742
0.0104 1600 0.3248
0.0110 1700 0.097
0.0117 1800 0.1162
0.0123 1900 0.1608
0.0130 2000 0.1002
0.0136 2100 0.1361
0.0142 2200 0.1319
0.0149 2300 0.1192
0.0155 2400 0.1991
0.0162 2500 0.1832
0.0168 2600 0.0877
0.0175 2700 0.1192
0.0181 2800 0.1245
0.0188 2900 0.0773
0.0194 3000 0.2334
0.0201 3100 0.1719
0.0207 3200 0.1367
0.0214 3300 0.0215
0.0220 3400 0.053
0.0227 3500 0.0998
0.0233 3600 0.0643
0.0240 3700 0.2129
0.0246 3800 0.0867
0.0253 3900 0.0877
0.0259 4000 0.0463
0.0266 4100 0.1647
0.0272 4200 0.0719
0.0278 4300 0.1512
0.0285 4400 0.1003
0.0291 4500 0.123
0.0298 4600 0.1753
0.0304 4700 0.0695
0.0311 4800 0.1102
0.0317 4900 0.1685
0.0324 5000 0.0635
0.0330 5100 0.1746
0.0337 5200 0.2026
0.0343 5300 0.11
0.0350 5400 0.0866
0.0356 5500 0.0698
0.0363 5600 0.1037
0.0369 5700 0.1343
0.0376 5800 0.0314
0.0382 5900 0.0822
0.0389 6000 0.0986
0.0395 6100 0.0795
0.0402 6200 0.1384
0.0408 6300 0.0839
0.0414 6400 0.1448
0.0421 6500 0.0511
0.0427 6600 0.0399
0.0434 6700 0.0311
0.0440 6800 0.0832
0.0447 6900 0.0896
0.0453 7000 0.1081
0.0460 7100 0.1334
0.0466 7200 0.0465
0.0473 7300 0.1098
0.0479 7400 0.0598
0.0486 7500 0.0837
0.0492 7600 0.1034
0.0499 7700 0.029
0.0505 7800 0.0595
0.0512 7900 0.066
0.0518 8000 0.0667
0.0525 8100 0.1442
0.0531 8200 0.1957
0.0538 8300 0.0941
0.0544 8400 0.0958
0.0550 8500 0.1317
0.0557 8600 0.132
0.0563 8700 0.0534
0.0570 8800 0.0724
0.0576 8900 0.1029
0.0583 9000 0.105
0.0589 9100 0.0802
0.0596 9200 0.0364
0.0602 9300 0.1144
0.0609 9400 0.1106
0.0615 9500 0.0947
0.0622 9600 0.1798
0.0628 9700 0.0487
0.0635 9800 0.1271
0.0641 9900 0.1112
0.0648 10000 0.0567
0.0654 10100 0.0344
0.0661 10200 0.0921
0.0667 10300 0.0216
0.0674 10400 0.0702
0.0680 10500 0.142
0.0686 10600 0.1187
0.0693 10700 0.1104
0.0699 10800 0.0881
0.0706 10900 0.1065
0.0712 11000 0.0243
0.0719 11100 0.046
0.0725 11200 0.0479
0.0732 11300 0.0516
0.0738 11400 0.0718
0.0745 11500 0.0481
0.0751 11600 0.029
0.0758 11700 0.0752
0.0764 11800 0.0844
0.0771 11900 0.0382
0.0777 12000 0.0878
0.0784 12100 0.1511
0.0790 12200 0.0596
0.0797 12300 0.0922
0.0803 12400 0.0133
0.0810 12500 0.0775
0.0816 12600 0.0999
0.0823 12700 0.0512
0.0829 12800 0.1226
0.0835 12900 0.1409
0.0842 13000 0.1595
0.0848 13100 0.0787
0.0855 13200 0.1223
0.0861 13300 0.2356
0.0868 13400 0.0216
0.0874 13500 0.0535
0.0881 13600 0.0961
0.0887 13700 0.0831
0.0894 13800 0.0357
0.0900 13900 0.0839
0.0907 14000 0.1621
0.0913 14100 0.1452
0.0920 14200 0.0597
0.0926 14300 0.068
0.0933 14400 0.0291
0.0939 14500 0.1022
0.0946 14600 0.0928
0.0952 14700 0.1007
0.0959 14800 0.1816
0.0965 14900 0.1176
0.0971 15000 0.1454
0.0978 15100 0.0783
0.0984 15200 0.0951
0.0991 15300 0.0217
0.0997 15400 0.0345
0.1004 15500 0.1047
0.1010 15600 0.0966
0.1017 15700 0.1663
0.1023 15800 0.1359
0.1030 15900 0.0866
0.1036 16000 0.0914
0.1043 16100 0.0869
0.1049 16200 0.0288
0.1056 16300 0.0512
0.1062 16400 0.0459
0.1069 16500 0.0844
0.1075 16600 0.0574
0.1082 16700 0.09
0.1088 16800 0.0717
0.1095 16900 0.0587
0.1101 17000 0.0255
0.1107 17100 0.048
0.1114 17200 0.1286
0.1120 17300 0.1196
0.1127 17400 0.1476
0.1133 17500 0.0417
0.1140 17600 0.0925
0.1146 17700 0.1106
0.1153 17800 0.0487
0.1159 17900 0.0595
0.1166 18000 0.1187
0.1172 18100 0.045
0.1179 18200 0.1279
0.1185 18300 0.122
0.1192 18400 0.0638
0.1198 18500 0.0798
0.1205 18600 0.1502
0.1211 18700 0.0489
0.1218 18800 0.0747
0.1224 18900 0.1417
0.1231 19000 0.0671
0.1237 19100 0.04
0.1243 19200 0.031
0.1250 19300 0.0868
0.1256 19400 0.076
0.1263 19500 0.058
0.1269 19600 0.0709
0.1276 19700 0.1214
0.1282 19800 0.0211
0.1289 19900 0.0663
0.1295 20000 0.0677
0.1302 20100 0.1445
0.1308 20200 0.0949
0.1315 20300 0.0315
0.1321 20400 0.1157
0.1328 20500 0.1015
0.1334 20600 0.0795
0.1341 20700 0.1039
0.1347 20800 0.0573
0.1354 20900 0.1311
0.1360 21000 0.0647
0.1367 21100 0.0591
0.1373 21200 0.0775
0.1379 21300 0.121
0.1386 21400 0.086
0.1392 21500 0.0775
0.1399 21600 0.0746
0.1405 21700 0.0715
0.1412 21800 0.1398
0.1418 21900 0.0901
0.1425 22000 0.1747
0.1431 22100 0.0627
0.1438 22200 0.0415
0.1444 22300 0.0249
0.1451 22400 0.0863
0.1457 22500 0.0861
0.1464 22600 0.0496
0.1470 22700 0.0513
0.1477 22800 0.21
0.1483 22900 0.0543
0.1490 23000 0.0907
0.1496 23100 0.0455
0.1503 23200 0.1405
0.1509 23300 0.0384
0.1515 23400 0.1444
0.1522 23500 0.1216
0.1528 23600 0.0123
0.1535 23700 0.1084
0.1541 23800 0.0385
0.1548 23900 0.0856
0.1554 24000 0.026
0.1561 24100 0.069
0.1567 24200 0.0309
0.1574 24300 0.1088
0.1580 24400 0.0187
0.1587 24500 0.0809
0.1593 24600 0.068
0.1600 24700 0.0628
0.1606 24800 0.1488
0.1613 24900 0.0617
0.1619 25000 0.1207
0.1626 25100 0.0311
0.1632 25200 0.0964
0.1639 25300 0.0561
0.1645 25400 0.0735
0.1651 25500 0.0182
0.1658 25600 0.1115
0.1664 25700 0.0628
0.1671 25800 0.144
0.1677 25900 0.1596
0.1684 26000 0.0612
0.1690 26100 0.1275
0.1697 26200 0.1007
0.1703 26300 0.1534
0.1710 26400 0.0757
0.1716 26500 0.1128
0.1723 26600 0.0503
0.1729 26700 0.1042
0.1736 26800 0.015
0.1742 26900 0.0747
0.1749 27000 0.0365
0.1755 27100 0.0794
0.1762 27200 0.0535
0.1768 27300 0.0256
0.1775 27400 0.0758
0.1781 27500 0.0444
0.1787 27600 0.039
0.1794 27700 0.0864
0.1800 27800 0.0653
0.1807 27900 0.1293
0.1813 28000 0.0382
0.1820 28100 0.1008
0.1826 28200 0.0748
0.1833 28300 0.0499
0.1839 28400 0.0839
0.1846 28500 0.0435
0.1852 28600 0.0616
0.1859 28700 0.0678
0.1865 28800 0.0274
0.1872 28900 0.0404
0.1878 29000 0.1013
0.1885 29100 0.1028
0.1891 29200 0.0855
0.1898 29300 0.0437
0.1904 29400 0.1489
0.1911 29500 0.0632
0.1917 29600 0.0125
0.1923 29700 0.0489
0.1930 29800 0.022
0.1936 29900 0.128
0.1943 30000 0.0465
0.1949 30100 0.0726
0.1956 30200 0.0417
0.1962 30300 0.0488
0.1969 30400 0.0175
0.1975 30500 0.0609
0.1982 30600 0.0353
0.1988 30700 0.0239
0.1995 30800 0.0255
0.2001 30900 0.0591
0.2008 31000 0.0369
0.2014 31100 0.0964
0.2021 31200 0.0823
0.2027 31300 0.0117
0.2034 31400 0.0172
0.2040 31500 0.0396
0.2047 31600 0.043
0.2053 31700 0.05
0.2059 31800 0.0408
0.2066 31900 0.0568
0.2072 32000 0.0435
0.2079 32100 0.0644
0.2085 32200 0.0541
0.2092 32300 0.0922
0.2098 32400 0.0269
0.2105 32500 0.0481
0.2111 32600 0.09
0.2118 32700 0.0448
0.2124 32800 0.1076
0.2131 32900 0.0924
0.2137 33000 0.0138
0.2144 33100 0.1283
0.2150 33200 0.0489
0.2157 33300 0.0318
0.2163 33400 0.0577
0.2170 33500 0.0733
0.2176 33600 0.0521
0.2183 33700 0.0091
0.2189 33800 0.0492
0.2195 33900 0.1607
0.2202 34000 0.1402
0.2208 34100 0.0162
0.2215 34200 0.0367
0.2221 34300 0.0719
0.2228 34400 0.1095
0.2234 34500 0.0305
0.2241 34600 0.0403
0.2247 34700 0.0615
0.2254 34800 0.0616
0.2260 34900 0.0195
0.2267 35000 0.1135
0.2273 35100 0.0832
0.2280 35200 0.0162
0.2286 35300 0.072
0.2293 35400 0.0632
0.2299 35500 0.047
0.2306 35600 0.0198
0.2312 35700 0.0898
0.2319 35800 0.0951
0.2325 35900 0.0779
0.2332 36000 0.1065
0.2338 36100 0.027
0.2344 36200 0.0812
0.2351 36300 0.005
0.2357 36400 0.0334
0.2364 36500 0.0706
0.2370 36600 0.0285
0.2377 36700 0.0731
0.2383 36800 0.094
0.2390 36900 0.1747
0.2396 37000 0.1123
0.2403 37100 0.2034
0.2409 37200 0.0117
0.2416 37300 0.0551
0.2422 37400 0.0031
0.2429 37500 0.0282
0.2435 37600 0.0376
0.2442 37700 0.0961
0.2448 37800 0.0395
0.2455 37900 0.083
0.2461 38000 0.0548
0.2468 38100 0.0042
0.2474 38200 0.1096
0.2480 38300 0.0396
0.2487 38400 0.1214
0.2493 38500 0.0414
0.2500 38600 0.0277
0.2506 38700 0.1565
0.2513 38800 0.0458
0.2519 38900 0.1661
0.2526 39000 0.0313
0.2532 39100 0.0489
0.2539 39200 0.0742
0.2545 39300 0.0415
0.2552 39400 0.0101
0.2558 39500 0.0898
0.2565 39600 0.0701
0.2571 39700 0.0397
0.2578 39800 0.0427
0.2584 39900 0.1023
0.2591 40000 0.0218
0.2597 40100 0.0971
0.2604 40200 0.0427
0.2610 40300 0.0476
0.2616 40400 0.0193
0.2623 40500 0.0082
0.2629 40600 0.0446
0.2636 40700 0.0665
0.2642 40800 0.1209
0.2649 40900 0.0788
0.2655 41000 0.0798
0.2662 41100 0.0786
0.2668 41200 0.1807
0.2675 41300 0.0228
0.2681 41400 0.0345
0.2688 41500 0.041
0.2694 41600 0.1004
0.2701 41700 0.0508
0.2707 41800 0.0407
0.2714 41900 0.0545
0.2720 42000 0.0574
0.2727 42100 0.039
0.2733 42200 0.0633
0.2740 42300 0.0455
0.2746 42400 0.0237
0.2752 42500 0.0406
0.2759 42600 0.0484
0.2765 42700 0.0804
0.2772 42800 0.0587
0.2778 42900 0.0388
0.2785 43000 0.1189
0.2791 43100 0.0787
0.2798 43200 0.0508
0.2804 43300 0.0652
0.2811 43400 0.036
0.2817 43500 0.0908
0.2824 43600 0.0439
0.2830 43700 0.0152
0.2837 43800 0.0228
0.2843 43900 0.1288
0.2850 44000 0.068
0.2856 44100 0.0287
0.2863 44200 0.0325
0.2869 44300 0.0388
0.2876 44400 0.0166
0.2882 44500 0.0264
0.2888 44600 0.0281
0.2895 44700 0.0219
0.2901 44800 0.1157
0.2908 44900 0.0637
0.2914 45000 0.0696
0.2921 45100 0.0379
0.2927 45200 0.0562
0.2934 45300 0.0438
0.2940 45400 0.0754
0.2947 45500 0.0379
0.2953 45600 0.0192
0.2960 45700 0.0823
0.2966 45800 0.0778
0.2973 45900 0.0769
0.2979 46000 0.0164
0.2986 46100 0.052
0.2992 46200 0.0229
0.2999 46300 0.0883
0.3005 46400 0.0263
0.3012 46500 0.1259
0.3018 46600 0.0282
0.3024 46700 0.064
0.3031 46800 0.0285
0.3037 46900 0.1078
0.3044 47000 0.0551
0.3050 47100 0.0191
0.3057 47200 0.0286
0.3063 47300 0.0249
0.3070 47400 0.0129
0.3076 47500 0.0061
0.3083 47600 0.0863
0.3089 47700 0.0452
0.3096 47800 0.036
0.3102 47900 0.1282
0.3109 48000 0.0795
0.3115 48100 0.0566
0.3122 48200 0.0167
0.3128 48300 0.0257
0.3135 48400 0.1144
0.3141 48500 0.0813
0.3148 48600 0.0567
0.3154 48700 0.0664
0.3160 48800 0.1012
0.3167 48900 0.0737
0.3173 49000 0.1064
0.3180 49100 0.0753
0.3186 49200 0.0811
0.3193 49300 0.0913
0.3199 49400 0.1252
0.3206 49500 0.1153
0.3212 49600 0.0405
0.3219 49700 0.0381
0.3225 49800 0.0155
0.3232 49900 0.0981
0.3238 50000 0.0691
0.3245 50100 0.0216
0.3251 50200 0.0322
0.3258 50300 0.0208
0.3264 50400 0.1002
0.3271 50500 0.0504
0.3277 50600 0.0149
0.3284 50700 0.075
0.3290 50800 0.0302
0.3296 50900 0.0874
0.3303 51000 0.0477
0.3309 51100 0.081
0.3316 51200 0.0336
0.3322 51300 0.0609
0.3329 51400 0.0263
0.3335 51500 0.0218
0.3342 51600 0.0925
0.3348 51700 0.0389
0.3355 51800 0.057
0.3361 51900 0.1215
0.3368 52000 0.0183
0.3374 52100 0.0189
0.3381 52200 0.036
0.3387 52300 0.0414
0.3394 52400 0.0472
0.3400 52500 0.058
0.3407 52600 0.0529
0.3413 52700 0.0394
0.3420 52800 0.0952
0.3426 52900 0.0393
0.3432 53000 0.0225
0.3439 53100 0.0565
0.3445 53200 0.034
0.3452 53300 0.1416
0.3458 53400 0.0161
0.3465 53500 0.0378
0.3471 53600 0.0774
0.3478 53700 0.0665
0.3484 53800 0.0368
0.3491 53900 0.1426
0.3497 54000 0.0086
0.3504 54100 0.0759
0.3510 54200 0.0186
0.3517 54300 0.0284
0.3523 54400 0.0501
0.3530 54500 0.1038
0.3536 54600 0.1045
0.3543 54700 0.0088
0.3549 54800 0.0596
0.3556 54900 0.0388
0.3562 55000 0.0878
0.3568 55100 0.0403
0.3575 55200 0.1564
0.3581 55300 0.0863
0.3588 55400 0.0232
0.3594 55500 0.0057
0.3601 55600 0.019
0.3607 55700 0.0104
0.3614 55800 0.1776
0.3620 55900 0.0481
0.3627 56000 0.0536
0.3633 56100 0.1083
0.3640 56200 0.0176
0.3646 56300 0.0633
0.3653 56400 0.026
0.3659 56500 0.0944
0.3666 56600 0.0384
0.3672 56700 0.0706
0.3679 56800 0.014
0.3685 56900 0.0664
0.3692 57000 0.0423
0.3698 57100 0.169
0.3704 57200 0.0429
0.3711 57300 0.0146
0.3717 57400 0.1533
0.3724 57500 0.0249
0.3730 57600 0.0803
0.3737 57700 0.0299
0.3743 57800 0.0354
0.3750 57900 0.0321
0.3756 58000 0.1391
0.3763 58100 0.0234
0.3769 58200 0.0319
0.3776 58300 0.0383
0.3782 58400 0.023
0.3789 58500 0.0743
0.3795 58600 0.0823
0.3802 58700 0.0831
0.3808 58800 0.1142
0.3815 58900 0.0752
0.3821 59000 0.0553
0.3828 59100 0.0264
0.3834 59200 0.0601
0.3840 59300 0.0359
0.3847 59400 0.1115
0.3853 59500 0.0186
0.3860 59600 0.0132
0.3866 59700 0.0614
0.3873 59800 0.079
0.3879 59900 0.0477
0.3886 60000 0.0144
0.3892 60100 0.0238
0.3899 60200 0.0689
0.3905 60300 0.0435
0.3912 60400 0.0652
0.3918 60500 0.1035
0.3925 60600 0.0321
0.3931 60700 0.046
0.3938 60800 0.0156
0.3944 60900 0.0258
0.3951 61000 0.0179
0.3957 61100 0.0379
0.3964 61200 0.0097
0.3970 61300 0.0576
0.3977 61400 0.0321
0.3983 61500 0.0713
0.3989 61600 0.0128
0.3996 61700 0.0255
0.4002 61800 0.0643
0.4009 61900 0.0049
0.4015 62000 0.058
0.4022 62100 0.0149
0.4028 62200 0.0228
0.4035 62300 0.1486
0.4041 62400 0.0367
0.4048 62500 0.0079
0.4054 62600 0.0288
0.4061 62700 0.0055
0.4067 62800 0.0601
0.4074 62900 0.033
0.4080 63000 0.0346
0.4087 63100 0.0345
0.4093 63200 0.1134
0.4100 63300 0.0713
0.4106 63400 0.0627
0.4113 63500 0.0657
0.4119 63600 0.0196
0.4125 63700 0.059
0.4132 63800 0.1049
0.4138 63900 0.0093
0.4145 64000 0.0271
0.4151 64100 0.0658
0.4158 64200 0.0237
0.4164 64300 0.1293
0.4171 64400 0.0337
0.4177 64500 0.0435
0.4184 64600 0.0235
0.4190 64700 0.0176
0.4197 64800 0.0589
0.4203 64900 0.0063
0.4210 65000 0.0849
0.4216 65100 0.0132
0.4223 65200 0.036
0.4229 65300 0.0191
0.4236 65400 0.0512
0.4242 65500 0.1231
0.4249 65600 0.0317
0.4255 65700 0.0664
0.4261 65800 0.0083
0.4268 65900 0.0816
0.4274 66000 0.0613
0.4281 66100 0.0527
0.4287 66200 0.0433
0.4294 66300 0.0449
0.4300 66400 0.0331
0.4307 66500 0.0829
0.4313 66600 0.0647
0.4320 66700 0.0343
0.4326 66800 0.0274
0.4333 66900 0.0587
0.4339 67000 0.0081
0.4346 67100 0.0503
0.4352 67200 0.1261
0.4359 67300 0.1291
0.4365 67400 0.0563
0.4372 67500 0.0208
0.4378 67600 0.0418
0.4385 67700 0.0253
0.4391 67800 0.0374
0.4397 67900 0.1436
0.4404 68000 0.1126
0.4410 68100 0.0341
0.4417 68200 0.0806
0.4423 68300 0.0551
0.4430 68400 0.1494
0.4436 68500 0.07
0.4443 68600 0.0172
0.4449 68700 0.0335
0.4456 68800 0.0318
0.4462 68900 0.045
0.4469 69000 0.0252
0.4475 69100 0.0421
0.4482 69200 0.0359
0.4488 69300 0.0156
0.4495 69400 0.0305
0.4501 69500 0.012
0.4508 69600 0.0429
0.4514 69700 0.0652
0.4521 69800 0.0891
0.4527 69900 0.1409
0.4533 70000 0.0529
0.4540 70100 0.0473
0.4546 70200 0.0546
0.4553 70300 0.069
0.4559 70400 0.072
0.4566 70500 0.0391
0.4572 70600 0.0967
0.4579 70700 0.043
0.4585 70800 0.0429
0.4592 70900 0.0812
0.4598 71000 0.1186
0.4605 71100 0.0095
0.4611 71200 0.0043
0.4618 71300 0.2469
0.4624 71400 0.0122
0.4631 71500 0.0247
0.4637 71600 0.1117
0.4644 71700 0.0703
0.4650 71800 0.0837
0.4657 71900 0.047
0.4663 72000 0.0652
0.4669 72100 0.0397
0.4676 72200 0.0324
0.4682 72300 0.0807
0.4689 72400 0.0172
0.4695 72500 0.0256
0.4702 72600 0.0259
0.4708 72700 0.0187
0.4715 72800 0.0055
0.4721 72900 0.065
0.4728 73000 0.1254
0.4734 73100 0.0688
0.4741 73200 0.0333
0.4747 73300 0.0059
0.4754 73400 0.0449
0.4760 73500 0.1274
0.4767 73600 0.0766
0.4773 73700 0.1242
0.4780 73800 0.0219
0.4786 73900 0.0185
0.4793 74000 0.0219
0.4799 74100 0.0408
0.4805 74200 0.018
0.4812 74300 0.0436
0.4818 74400 0.1348
0.4825 74500 0.065
0.4831 74600 0.0069
0.4838 74700 0.0272
0.4844 74800 0.0173
0.4851 74900 0.0782
0.4857 75000 0.0433
0.4864 75100 0.0474
0.4870 75200 0.0449
0.4877 75300 0.1053
0.4883 75400 0.0231
0.4890 75500 0.0107
0.4896 75600 0.0619
0.4903 75700 0.0146
0.4909 75800 0.0447
0.4916 75900 0.0565
0.4922 76000 0.0811
0.4929 76100 0.0873
0.4935 76200 0.0647
0.4941 76300 0.0126
0.4948 76400 0.012
0.4954 76500 0.1983
0.4961 76600 0.0482
0.4967 76700 0.014
0.4974 76800 0.1074
0.4980 76900 0.0716
0.4987 77000 0.0307
0.4993 77100 0.0733
0.5000 77200 0.028
0.5006 77300 0.0562
0.5013 77400 0.0364
0.5019 77500 0.0269
0.5026 77600 0.0747
0.5032 77700 0.0566
0.5039 77800 0.0337
0.5045 77900 0.0621
0.5052 78000 0.0892
0.5058 78100 0.0678
0.5065 78200 0.0072
0.5071 78300 0.0233
0.5077 78400 0.1255
0.5084 78500 0.0098
0.5090 78600 0.0202
0.5097 78700 0.0183
0.5103 78800 0.0319
0.5110 78900 0.0599
0.5116 79000 0.1177
0.5123 79100 0.0479
0.5129 79200 0.0311
0.5136 79300 0.0349
0.5142 79400 0.0148
0.5149 79500 0.0077
0.5155 79600 0.0718
0.5162 79700 0.0733
0.5168 79800 0.0423
0.5175 79900 0.0621
0.5181 80000 0.0838
0.5188 80100 0.0604
0.5194 80200 0.0132
0.5201 80300 0.0395
0.5207 80400 0.062
0.5213 80500 0.0043
0.5220 80600 0.0817
0.5226 80700 0.0157
0.5233 80800 0.0326
0.5239 80900 0.0226
0.5246 81000 0.0065
0.5252 81100 0.0133
0.5259 81200 0.0199
0.5265 81300 0.0425
0.5272 81400 0.0044
0.5278 81500 0.0278
0.5285 81600 0.0152
0.5291 81700 0.0801
0.5298 81800 0.0188
0.5304 81900 0.0883
0.5311 82000 0.0804
0.5317 82100 0.017
0.5324 82200 0.0564
0.5330 82300 0.0723
0.5337 82400 0.0141
0.5343 82500 0.024
0.5349 82600 0.0363
0.5356 82700 0.0298
0.5362 82800 0.1487
0.5369 82900 0.0077
0.5375 83000 0.0357
0.5382 83100 0.0397
0.5388 83200 0.0163
0.5395 83300 0.0633
0.5401 83400 0.022
0.5408 83500 0.0302
0.5414 83600 0.0081
0.5421 83700 0.031
0.5427 83800 0.1142
0.5434 83900 0.139
0.5440 84000 0.0207
0.5447 84100 0.0246
0.5453 84200 0.0479
0.5460 84300 0.0262
0.5466 84400 0.0282
0.5473 84500 0.0365
0.5479 84600 0.0106
0.5486 84700 0.0667
0.5492 84800 0.0135
0.5498 84900 0.0183
0.5505 85000 0.026
0.5511 85100 0.1243
0.5518 85200 0.0521
0.5524 85300 0.0356
0.5531 85400 0.0069
0.5537 85500 0.0099
0.5544 85600 0.02
0.5550 85700 0.0745
0.5557 85800 0.0267
0.5563 85900 0.0309
0.5570 86000 0.0393
0.5576 86100 0.0404
0.5583 86200 0.0288
0.5589 86300 0.0115
0.5596 86400 0.0215
0.5602 86500 0.031
0.5609 86600 0.0316
0.5615 86700 0.0047
0.5622 86800 0.0347
0.5628 86900 0.0246
0.5634 87000 0.0214
0.5641 87100 0.0401
0.5647 87200 0.0204
0.5654 87300 0.1212
0.5660 87400 0.0739
0.5667 87500 0.0454
0.5673 87600 0.0448
0.5680 87700 0.0163
0.5686 87800 0.0125
0.5693 87900 0.0248
0.5699 88000 0.0392
0.5706 88100 0.0632
0.5712 88200 0.0121
0.5719 88300 0.03
0.5725 88400 0.0404
0.5732 88500 0.0049
0.5738 88600 0.037
0.5745 88700 0.0055
0.5751 88800 0.0639
0.5758 88900 0.0363
0.5764 89000 0.0077
0.5770 89100 0.0099
0.5777 89200 0.0066
0.5783 89300 0.0107
0.5790 89400 0.0702
0.5796 89500 0.0138
0.5803 89600 0.0136
0.5809 89700 0.0364
0.5816 89800 0.0253
0.5822 89900 0.0239
0.5829 90000 0.0233
0.5835 90100 0.0399
0.5842 90200 0.0087
0.5848 90300 0.0095
0.5855 90400 0.0196
0.5861 90500 0.0108
0.5868 90600 0.0443
0.5874 90700 0.02
0.5881 90800 0.0368
0.5887 90900 0.0365
0.5894 91000 0.0816
0.5900 91100 0.0243
0.5906 91200 0.0319
0.5913 91300 0.0828
0.5919 91400 0.0226
0.5926 91500 0.0323
0.5932 91600 0.0504
0.5939 91700 0.0079
0.5945 91800 0.0294
0.5952 91900 0.0253
0.5958 92000 0.0221
0.5965 92100 0.0376
0.5971 92200 0.0131
0.5978 92300 0.0277
0.5984 92400 0.0206
0.5991 92500 0.0072
0.5997 92600 0.1004
0.6004 92700 0.0082
0.6010 92800 0.0774
0.6017 92900 0.015
0.6023 93000 0.0621
0.6030 93100 0.0137
0.6036 93200 0.0086
0.6042 93300 0.0353
0.6049 93400 0.0143
0.6055 93500 0.0622
0.6062 93600 0.0172
0.6068 93700 0.0229
0.6075 93800 0.0441
0.6081 93900 0.135
0.6088 94000 0.1326
0.6094 94100 0.1577
0.6101 94200 0.0712
0.6107 94300 0.0167
0.6114 94400 0.0352
0.6120 94500 0.0629
0.6127 94600 0.093
0.6133 94700 0.0589
0.6140 94800 0.1046
0.6146 94900 0.0101
0.6153 95000 0.0437
0.6159 95100 0.021
0.6166 95200 0.0337
0.6172 95300 0.0293
0.6178 95400 0.0183
0.6185 95500 0.004
0.6191 95600 0.0475
0.6198 95700 0.0293
0.6204 95800 0.0144
0.6211 95900 0.0795
0.6217 96000 0.1282
0.6224 96100 0.0188
0.6230 96200 0.0616
0.6237 96300 0.0058
0.6243 96400 0.0553
0.6250 96500 0.0097
0.6256 96600 0.0965
0.6263 96700 0.0107
0.6269 96800 0.0586
0.6276 96900 0.0512
0.6282 97000 0.1051
0.6289 97100 0.0551
0.6295 97200 0.0184
0.6302 97300 0.0161
0.6308 97400 0.0538
0.6314 97500 0.0093
0.6321 97600 0.017
0.6327 97700 0.0197
0.6334 97800 0.0204
0.6340 97900 0.0885
0.6347 98000 0.0287
0.6353 98100 0.0372
0.6360 98200 0.0206
0.6366 98300 0.07
0.6373 98400 0.0168
0.6379 98500 0.0276
0.6386 98600 0.0205
0.6392 98700 0.0082
0.6399 98800 0.0866
0.6405 98900 0.0861
0.6412 99000 0.0242
0.6418 99100 0.0634
0.6425 99200 0.0062
0.6431 99300 0.0528
0.6438 99400 0.0212
0.6444 99500 0.0256
0.6450 99600 0.0238
0.6457 99700 0.1062
0.6463 99800 0.0486
0.6470 99900 0.0733
0.6476 100000 0.0679
0.6483 100100 0.0296
0.6489 100200 0.0652
0.6496 100300 0.1469
0.6502 100400 0.018
0.6509 100500 0.0151
0.6515 100600 0.035
0.6522 100700 0.0209
0.6528 100800 0.0152
0.6535 100900 0.0184
0.6541 101000 0.0369
0.6548 101100 0.0044
0.6554 101200 0.0282
0.6561 101300 0.0142
0.6567 101400 0.0299
0.6574 101500 0.0227
0.6580 101600 0.0135
0.6586 101700 0.0105
0.6593 101800 0.0127
0.6599 101900 0.0217
0.6606 102000 0.0212
0.6612 102100 0.0192
0.6619 102200 0.01
0.6625 102300 0.063
0.6632 102400 0.0906
0.6638 102500 0.0269
0.6645 102600 0.0226
0.6651 102700 0.0118
0.6658 102800 0.0427
0.6664 102900 0.0197
0.6671 103000 0.0171
0.6677 103100 0.0085
0.6684 103200 0.0231
0.6690 103300 0.019
0.6697 103400 0.1104
0.6703 103500 0.0625
0.6710 103600 0.022
0.6716 103700 0.0112
0.6722 103800 0.0076
0.6729 103900 0.0037
0.6735 104000 0.0059
0.6742 104100 0.0594
0.6748 104200 0.0179
0.6755 104300 0.0302
0.6761 104400 0.0266
0.6768 104500 0.0716
0.6774 104600 0.0276
0.6781 104700 0.0283
0.6787 104800 0.0252
0.6794 104900 0.0721
0.6800 105000 0.0616
0.6807 105100 0.0327
0.6813 105200 0.0304
0.6820 105300 0.0725
0.6826 105400 0.0057
0.6833 105500 0.0185
0.6839 105600 0.0116
0.6846 105700 0.0208
0.6852 105800 0.0544
0.6858 105900 0.0084
0.6865 106000 0.0902
0.6871 106100 0.0278
0.6878 106200 0.0077
0.6884 106300 0.0313
0.6891 106400 0.0103
0.6897 106500 0.0063
0.6904 106600 0.0516
0.6910 106700 0.1234
0.6917 106800 0.0079
0.6923 106900 0.0071
0.6930 107000 0.0214
0.6936 107100 0.0374
0.6943 107200 0.0042
0.6949 107300 0.0104
0.6956 107400 0.0488
0.6962 107500 0.0044
0.6969 107600 0.057
0.6975 107700 0.0086
0.6982 107800 0.0201
0.6988 107900 0.0184
0.6995 108000 0.0124
0.7001 108100 0.0047
0.7007 108200 0.007
0.7014 108300 0.0308
0.7020 108400 0.0104
0.7027 108500 0.0177
0.7033 108600 0.0774
0.7040 108700 0.0487
0.7046 108800 0.0055
0.7053 108900 0.0333
0.7059 109000 0.0122
0.7066 109100 0.0257
0.7072 109200 0.0283
0.7079 109300 0.0887
0.7085 109400 0.0383
0.7092 109500 0.0408
0.7098 109600 0.0091
0.7105 109700 0.0063
0.7111 109800 0.0067
0.7118 109900 0.0218
0.7124 110000 0.0172
0.7131 110100 0.0321
0.7137 110200 0.0133
0.7143 110300 0.0171
0.7150 110400 0.022
0.7156 110500 0.0503
0.7163 110600 0.0213
0.7169 110700 0.0789
0.7176 110800 0.0452
0.7182 110900 0.0147
0.7189 111000 0.0079
0.7195 111100 0.0062
0.7202 111200 0.0522
0.7208 111300 0.07
0.7215 111400 0.094
0.7221 111500 0.0278
0.7228 111600 0.0212
0.7234 111700 0.0345
0.7241 111800 0.0297
0.7247 111900 0.0136
0.7254 112000 0.055
0.7260 112100 0.013
0.7267 112200 0.0392
0.7273 112300 0.044
0.7279 112400 0.0175
0.7286 112500 0.0544
0.7292 112600 0.0102
0.7299 112700 0.0198
0.7305 112800 0.0232
0.7312 112900 0.0247
0.7318 113000 0.019
0.7325 113100 0.0366
0.7331 113200 0.0417
0.7338 113300 0.0238
0.7344 113400 0.0377
0.7351 113500 0.0383
0.7357 113600 0.0247
0.7364 113700 0.0276
0.7370 113800 0.0309
0.7377 113900 0.0165
0.7383 114000 0.0075
0.7390 114100 0.0064
0.7396 114200 0.0081
0.7403 114300 0.0449
0.7409 114400 0.0073
0.7415 114500 0.0352
0.7422 114600 0.0099
0.7428 114700 0.0674
0.7435 114800 0.0089
0.7441 114900 0.095
0.7448 115000 0.0356
0.7454 115100 0.011
0.7461 115200 0.0081
0.7467 115300 0.0138
0.7474 115400 0.0414
0.7480 115500 0.0416
0.7487 115600 0.075
0.7493 115700 0.0085
0.7500 115800 0.0127
0.7506 115900 0.0246
0.7513 116000 0.0089
0.7519 116100 0.0108
0.7526 116200 0.0475
0.7532 116300 0.0419
0.7539 116400 0.0056
0.7545 116500 0.0213
0.7551 116600 0.0298
0.7558 116700 0.0255
0.7564 116800 0.0363
0.7571 116900 0.0217
0.7577 117000 0.0474
0.7584 117100 0.0743
0.7590 117200 0.0549
0.7597 117300 0.0423
0.7603 117400 0.0036
0.7610 117500 0.0425
0.7616 117600 0.0087
0.7623 117700 0.0106
0.7629 117800 0.0274
0.7636 117900 0.0289
0.7642 118000 0.0171
0.7649 118100 0.0183
0.7655 118200 0.0228
0.7662 118300 0.0155
0.7668 118400 0.0347
0.7675 118500 0.0238
0.7681 118600 0.0032
0.7687 118700 0.0452
0.7694 118800 0.0268
0.7700 118900 0.0325
0.7707 119000 0.0262
0.7713 119100 0.0687
0.7720 119200 0.0872
0.7726 119300 0.0244
0.7733 119400 0.0439
0.7739 119500 0.012
0.7746 119600 0.0143
0.7752 119700 0.0584
0.7759 119800 0.0092
0.7765 119900 0.04
0.7772 120000 0.0768
0.7778 120100 0.0539
0.7785 120200 0.0406
0.7791 120300 0.0543
0.7798 120400 0.0144
0.7804 120500 0.002
0.7811 120600 0.023
0.7817 120700 0.0539
0.7823 120800 0.0592
0.7830 120900 0.0153
0.7836 121000 0.0042
0.7843 121100 0.0678
0.7849 121200 0.0107
0.7856 121300 0.0255
0.7862 121400 0.0474
0.7869 121500 0.045
0.7875 121600 0.0672
0.7882 121700 0.0798
0.7888 121800 0.02
0.7895 121900 0.0736
0.7901 122000 0.0573
0.7908 122100 0.0039
0.7914 122200 0.0182
0.7921 122300 0.013
0.7927 122400 0.0343
0.7934 122500 0.0147
0.7940 122600 0.0121
0.7947 122700 0.0488
0.7953 122800 0.0232
0.7959 122900 0.002
0.7966 123000 0.0626
0.7972 123100 0.0188
0.7979 123200 0.0163
0.7985 123300 0.0451
0.7992 123400 0.029
0.7998 123500 0.0918
0.8005 123600 0.0253
0.8011 123700 0.0138
0.8018 123800 0.0703
0.8024 123900 0.0813
0.8031 124000 0.0487
0.8037 124100 0.0844
0.8044 124200 0.0041
0.8050 124300 0.0068
0.8057 124400 0.0332
0.8063 124500 0.0233
0.8070 124600 0.0302
0.8076 124700 0.0108
0.8083 124800 0.0123
0.8089 124900 0.0295
0.8095 125000 0.0391
0.8102 125100 0.0153
0.8108 125200 0.0435
0.8115 125300 0.0534
0.8121 125400 0.0545
0.8128 125500 0.0263
0.8134 125600 0.016
0.8141 125700 0.018
0.8147 125800 0.0344
0.8154 125900 0.02
0.8160 126000 0.0124
0.8167 126100 0.0467
0.8173 126200 0.0831
0.8180 126300 0.0177
0.8186 126400 0.0376
0.8193 126500 0.0074
0.8199 126600 0.015
0.8206 126700 0.0401
0.8212 126800 0.0168
0.8219 126900 0.0575
0.8225 127000 0.0009
0.8231 127100 0.0061
0.8238 127200 0.0142
0.8244 127300 0.0844
0.8251 127400 0.0388
0.8257 127500 0.0688
0.8264 127600 0.0063
0.8270 127700 0.0043
0.8277 127800 0.0178
0.8283 127900 0.0137
0.8290 128000 0.0048
0.8296 128100 0.0129
0.8303 128200 0.0304
0.8309 128300 0.0154
0.8316 128400 0.0202
0.8322 128500 0.0489
0.8329 128600 0.0103
0.8335 128700 0.0394
0.8342 128800 0.0635
0.8348 128900 0.0649
0.8355 129000 0.0164
0.8361 129100 0.03
0.8367 129200 0.0031
0.8374 129300 0.0056
0.8380 129400 0.0025
0.8387 129500 0.0069
0.8393 129600 0.0376
0.8400 129700 0.0506
0.8406 129800 0.0138
0.8413 129900 0.03
0.8419 130000 0.0089
0.8426 130100 0.038
0.8432 130200 0.0299
0.8439 130300 0.0369
0.8445 130400 0.0325
0.8452 130500 0.0115
0.8458 130600 0.0305
0.8465 130700 0.0061
0.8471 130800 0.0075
0.8478 130900 0.0206
0.8484 131000 0.0074
0.8491 131100 0.0333
0.8497 131200 0.0323
0.8504 131300 0.0095
0.8510 131400 0.0099
0.8516 131500 0.0194
0.8523 131600 0.0238
0.8529 131700 0.0131
0.8536 131800 0.0031
0.8542 131900 0.0025
0.8549 132000 0.0147
0.8555 132100 0.0377
0.8562 132200 0.04
0.8568 132300 0.0095
0.8575 132400 0.057
0.8581 132500 0.0138
0.8588 132600 0.0048
0.8594 132700 0.0061
0.8601 132800 0.0213
0.8607 132900 0.02
0.8614 133000 0.1095
0.8620 133100 0.0135
0.8627 133200 0.0074
0.8633 133300 0.0087
0.8640 133400 0.0136
0.8646 133500 0.0089
0.8652 133600 0.0081
0.8659 133700 0.0144
0.8665 133800 0.0133
0.8672 133900 0.028
0.8678 134000 0.0254
0.8685 134100 0.0094
0.8691 134200 0.0045
0.8698 134300 0.0141
0.8704 134400 0.0465
0.8711 134500 0.0103
0.8717 134600 0.049
0.8724 134700 0.0058
0.8730 134800 0.0464
0.8737 134900 0.0076
0.8743 135000 0.0271
0.8750 135100 0.0129
0.8756 135200 0.0138
0.8763 135300 0.0962
0.8769 135400 0.0637
0.8776 135500 0.0492
0.8782 135600 0.0232
0.8788 135700 0.0544
0.8795 135800 0.0098
0.8801 135900 0.0109
0.8808 136000 0.0514
0.8814 136100 0.0333
0.8821 136200 0.0676
0.8827 136300 0.0219
0.8834 136400 0.0484
0.8840 136500 0.0138
0.8847 136600 0.0285
0.8853 136700 0.0283
0.8860 136800 0.0444
0.8866 136900 0.0458
0.8873 137000 0.0859
0.8879 137100 0.0369
0.8886 137200 0.0017
0.8892 137300 0.0382
0.8899 137400 0.024
0.8905 137500 0.0391
0.8912 137600 0.0357
0.8918 137700 0.0299
0.8924 137800 0.0236
0.8931 137900 0.0252
0.8937 138000 0.0036
0.8944 138100 0.0605
0.8950 138200 0.027
0.8957 138300 0.0587
0.8963 138400 0.0333
0.8970 138500 0.0127
0.8976 138600 0.0128
0.8983 138700 0.018
0.8989 138800 0.0153
0.8996 138900 0.052
0.9002 139000 0.0761
0.9009 139100 0.0164
0.9015 139200 0.016
0.9022 139300 0.0114
0.9028 139400 0.0104
0.9035 139500 0.0029
0.9041 139600 0.0255
0.9048 139700 0.0191
0.9054 139800 0.0082
0.9060 139900 0.0069
0.9067 140000 0.0245
0.9073 140100 0.019
0.9080 140200 0.0329
0.9086 140300 0.0025
0.9093 140400 0.0032
0.9099 140500 0.0314
0.9106 140600 0.0419
0.9112 140700 0.002
0.9119 140800 0.0084
0.9125 140900 0.0049
0.9132 141000 0.01
0.9138 141100 0.0058
0.9145 141200 0.0055
0.9151 141300 0.0072
0.9158 141400 0.0091
0.9164 141500 0.0083
0.9171 141600 0.0158
0.9177 141700 0.0858
0.9184 141800 0.0121
0.9190 141900 0.0047
0.9196 142000 0.0066
0.9203 142100 0.0304
0.9209 142200 0.0332
0.9216 142300 0.0113
0.9222 142400 0.0221
0.9229 142500 0.0087
0.9235 142600 0.0419
0.9242 142700 0.006
0.9248 142800 0.0535
0.9255 142900 0.0148
0.9261 143000 0.0417
0.9268 143100 0.0076
0.9274 143200 0.0092
0.9281 143300 0.0059
0.9287 143400 0.0342
0.9294 143500 0.0093
0.9300 143600 0.0052
0.9307 143700 0.0399
0.9313 143800 0.0719
0.9320 143900 0.0238
0.9326 144000 0.0323
0.9332 144100 0.0365
0.9339 144200 0.0148
0.9345 144300 0.0037
0.9352 144400 0.0108
0.9358 144500 0.0926
0.9365 144600 0.0294
0.9371 144700 0.0134
0.9378 144800 0.0162
0.9384 144900 0.0264
0.9391 145000 0.0613
0.9397 145100 0.0688
0.9404 145200 0.0074
0.9410 145300 0.0043
0.9417 145400 0.033
0.9423 145500 0.0037
0.9430 145600 0.0701
0.9436 145700 0.03
0.9443 145800 0.0117
0.9449 145900 0.0134
0.9456 146000 0.0066
0.9462 146100 0.0264
0.9468 146200 0.0099
0.9475 146300 0.0151
0.9481 146400 0.0625
0.9488 146500 0.0339
0.9494 146600 0.0135
0.9501 146700 0.008
0.9507 146800 0.0153
0.9514 146900 0.0097
0.9520 147000 0.0341
0.9527 147100 0.0506
0.9533 147200 0.0162
0.9540 147300 0.0021
0.9546 147400 0.0064
0.9553 147500 0.0258
0.9559 147600 0.071
0.9566 147700 0.0175
0.9572 147800 0.0338
0.9579 147900 0.0129
0.9585 148000 0.0022
0.9592 148100 0.087
0.9598 148200 0.0358
0.9604 148300 0.0083
0.9611 148400 0.0323
0.9617 148500 0.0416
0.9624 148600 0.0589
0.9630 148700 0.0175
0.9637 148800 0.0219
0.9643 148900 0.0249
0.9650 149000 0.0092
0.9656 149100 0.0646
0.9663 149200 0.0184
0.9669 149300 0.0307
0.9676 149400 0.0616
0.9682 149500 0.0108
0.9689 149600 0.0482
0.9695 149700 0.0239
0.9702 149800 0.0024
0.9708 149900 0.0043
0.9715 150000 0.0049
0.9721 150100 0.0136
0.9728 150200 0.0363
0.9734 150300 0.0695
0.9740 150400 0.0527
0.9747 150500 0.028
0.9753 150600 0.0184
0.9760 150700 0.0257
0.9766 150800 0.0074
0.9773 150900 0.023
0.9779 151000 0.0862
0.9786 151100 0.0469
0.9792 151200 0.0314
0.9799 151300 0.0153
0.9805 151400 0.1128
0.9812 151500 0.0119
0.9818 151600 0.0221
0.9825 151700 0.0094
0.9831 151800 0.0144
0.9838 151900 0.0209
0.9844 152000 0.0267
0.9851 152100 0.0339
0.9857 152200 0.0133
0.9864 152300 0.0515
0.9870 152400 0.0223
0.9876 152500 0.0032
0.9883 152600 0.0116
0.9889 152700 0.0292
0.9896 152800 0.0118
0.9902 152900 0.0709
0.9909 153000 0.0305
0.9915 153100 0.0298
0.9922 153200 0.0297
0.9928 153300 0.0049
0.9935 153400 0.0045
0.9941 153500 0.0211
0.9948 153600 0.024
0.9954 153700 0.0268
0.9961 153800 0.0271
0.9967 153900 0.0425
0.9974 154000 0.0395
0.9980 154100 0.067
0.9987 154200 0.0613
0.9993 154300 0.0051
1.0000 154400 0.0136
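The per-step losses logged above (epoch, step, training loss per row) can be sanity-checked with a short script. A minimal sketch, assuming whitespace-separated rows; the `summarize_loss_log` helper is illustrative and not part of the training code:

```python
def summarize_loss_log(rows):
    """Parse 'epoch step loss' rows and return the mean training loss
    over the final 10% of logged steps (at least one row)."""
    parsed = []
    for line in rows:
        parts = line.split()
        if len(parts) == 3:
            # epoch (float), step (int), training loss (float)
            parsed.append((float(parts[0]), int(parts[1]), float(parts[2])))
    tail = parsed[-max(1, len(parsed) // 10):]
    return sum(loss for _, _, loss in tail) / len(tail)
```

Feeding it the rows above gives the average loss near the end of training, a rough convergence check.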

Framework Versions

  • Python: 3.11.11
  • Sentence Transformers: 5.1.0
  • Transformers: 4.55.0
  • PyTorch: 2.7.1+cu118
  • Accelerate: 1.10.0
  • Datasets: 4.0.0
  • Tokenizers: 0.21.4
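To confirm that a local environment matches the versions listed above, installed package versions can be queried with `importlib.metadata`. A minimal sketch; the `installed_version` helper and the `expected` dict are illustrative, not part of this model card:

```python
from importlib import metadata

def installed_version(package):
    """Return the installed version string for a package, or None if absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

# Compare against the versions listed above (subset shown)
expected = {"sentence-transformers": "5.1.0", "transformers": "4.55.0"}
mismatches = {pkg: (want, installed_version(pkg))
              for pkg, want in expected.items()
              if installed_version(pkg) != want}
```

An empty `mismatches` dict means the local install agrees with the card's listed versions for the checked packages.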

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CoSENTLoss

@online{kexuefm-8847,
    title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
    author={Su Jianlin},
    year={2022},
    month={Jan},
    url={https://kexue.fm/archives/8847},
}
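CoSENTLoss, cited above, optimizes a ranking objective over cosine similarities: for every pair of examples where one gold score exceeds the other, the loss penalizes the lower-scored example having the higher cosine, via log(1 + Σ exp(λ·(cos_j − cos_i))) over pairs with score_i > score_j. A minimal pure-Python sketch, not the library's implementation; the scale value 20.0 is an assumed default:

```python
import math

def cosent_loss(cosines, scores, scale=20.0):
    """CoSENT-style ranking loss over a batch of cosine similarities
    and their gold scores: log(1 + sum of exp-scaled inversions)."""
    terms = []
    n = len(cosines)
    for i in range(n):
        for j in range(n):
            if scores[i] > scores[j]:
                # penalize when the lower-scored pair j has higher cosine
                terms.append(scale * (cosines[j] - cosines[i]))
    # log(1 + sum(exp(t))) computed stably as a logsumexp over [0] + terms
    m = max([0.0] + terms)
    return m + math.log(sum(math.exp(t - m) for t in [0.0] + terms))
```

A correctly ordered batch (higher gold score, higher cosine) yields a loss near zero, while an inverted ordering is penalized heavily.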
Safetensors

  • Model size: 22.7M params
  • Tensor type: F32

Model tree for Remonatef/all-MiniLM-L6-v37-pair_score

Dataset used to train Remonatef/all-MiniLM-L6-v37-pair_score

Paper for Remonatef/all-MiniLM-L6-v37-pair_score