Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks (paper: arXiv:1908.10084)
This is a sentence-transformers model fine-tuned from sucharush/e5_stem_finetuned. It maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Full model architecture:

    SentenceTransformer(
      (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
      (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
      (2): Normalize()
    )
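As a rough sketch of what the Pooling (mean) and Normalize modules above compute, the snippet below stands in random vectors for real BERT token outputs; the 768-dimensional embedding size and the mean-pooling / L2-normalization settings come from the configuration above, while the token count and mask values are illustrative:

```python
import numpy as np

# Toy stand-in for BERT token embeddings: 7 tokens, 768 dimensions.
rng = np.random.default_rng(0)
token_embeddings = rng.normal(size=(7, 768))
attention_mask = np.array([1, 1, 1, 1, 1, 0, 0])  # last two tokens are padding

# Mean pooling (pooling_mode_mean_tokens=True): average only the
# non-padding token embeddings.
mask = attention_mask[:, None]
sentence_embedding = (token_embeddings * mask).sum(axis=0) / mask.sum()

# Normalize module: scale to unit L2 norm, so a dot product between two
# sentence embeddings equals their cosine similarity.
sentence_embedding = sentence_embedding / np.linalg.norm(sentence_embedding)

print(sentence_embedding.shape)  # (768,)
```

The real modules do the same computation batched over all sentences, on the actual transformer outputs.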
First install the Sentence Transformers library:
    pip install -U sentence-transformers
Then you can load this model and run inference.
    from sentence_transformers import SentenceTransformer

    # Download from the 🤗 Hub
    model = SentenceTransformer("Stergios-Konstantinidis/MNLP_M3_tokenizer_tuned_pos_neg_3")

    # Run inference
    sentences = [
        "Question: who played granny on the beverly hillbillies tv show, Answer: Irene Ryan Ryan is most widely known for her portrayal of Granny, the mother-in-law of Buddy Ebsen's character, on the long-running TV series The Beverly Hillbillies (1962–1971), for which she was nominated for Emmy Awards for Outstanding Lead Actress in a Comedy Series in 1963 and 1964.",
        '[{\'from\': \'human\', \'value\': \'Solve the following math problem step-by-step.\\nSimplify your answer as much as possible. Present your final answer as \\\\boxed{Your Answer}.\\nHenley bought 300 candies and shared them with her two brothers. However, they realized 40% of them were sour, and they had to separate those out. If they then shared the good candies equally, how many candies did each get?\'}, {\'from\': \'gpt\', \'value\': "Step 1: First, let\'s calculate the number of sour candies. sour_candies = 300 * 0.4 = 120\\nStep 2: To find the number of good candies, we subtract the number of sour candies from the total number of candies. good_candies = 300 - 120 = 180\\nStep 3: Since Henley shared the candies equally with her two brothers, we need to divide the number of good candies by 3 (Henley and her two brothers). candies_per_person = good_candies / 3 = 180 / 3 = 60\\n\\nAnswer:\\n\\\\boxed{60}"}]',
        '[{\'from\': \'human\', \'value\': \'Solve the following math problem step-by-step.\\nSimplify your answer as much as possible. Present your final answer as \\\\boxed{Your Answer}.\\nThe bowl of fruit contains apples, pears, and bananas. There are two more pears than apples, and three more bananas than pears. If the bowl contains 19 pieces of fruit, how many bananas does it contain?\'}, {\'from\': \'gpt\', \'value\': "Step 1: Let\'s denote the number of apples as A, the number of pears as P, and the number of bananas as B.\\nStep 2: According to the given information, there are two more pears than apples, so P = A + 2.\\nStep 3: Also, there are three more bananas than pears, so B = P + 3.\\nStep 4: The total number of fruits in the bowl is given as 19, so A + P + B = 19.\\nStep 5: Substituting the values of P and B from steps 2 and 3 into the equation in step 4, we get A + (A + 2) + ((A + 2) + 3) = 19.\\nStep 6: Simplifying the equation, 3A + 7 = 19.\\nStep 7: Solving for A, 3A = 12 => A = 4.\\nStep 8: Substituting the value of A back into the equations in steps 2 and 3, we get P = 4 + 2 = 6 and B = 6 + 3 = 9.\\nStep 9: Therefore, the bowl contains 9 bananas.\\n\\nAnswer:\\n\\\\boxed{9}"}]',
    ]
    embeddings = model.encode(sentences)
    print(embeddings.shape)
    # [3, 768]

    # Get the similarity scores for the embeddings
    similarities = model.similarity(embeddings, embeddings)
    print(similarities.shape)
    # [3, 3]
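Because the model ends in a Normalize module, every embedding has unit L2 norm, so the similarity matrix reduces to a plain dot-product (cosine) matrix. A toy sketch with hand-picked unit vectors standing in for real 768-dimensional embeddings:

```python
import numpy as np

# Three toy "sentence embeddings", already unit-norm as the Normalize
# module guarantees for the real model.
emb = np.array([
    [1.0, 0.0],
    [0.0, 1.0],
    [0.6, 0.8],
])

# For unit vectors, dot product == cosine similarity, so the whole
# similarity matrix is one matrix multiplication.
similarities = emb @ emb.T

# The diagonal is 1.0 (each sentence vs itself); off-diagonal entries
# are the cosine similarities between distinct sentences.
best_match_for_0 = int(np.argmax(similarities[0, 1:])) + 1  # skip self-match
```

With the real model, `similarities[i, j]` is read the same way: the higher the value, the closer sentences `i` and `j` are in the embedding space.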
The training data has columns sentence_0, sentence_1, and label:

| | sentence_0 | sentence_1 | label |
|---|---|---|---|
| type | string | string | int |
Samples:

| sentence_0 | sentence_1 | label |
|---|---|---|
| Question: where is nigeria located in the map of africa, Answer: Nigeria The Federal Republic of Nigeria /naɪˈdʒɪəriə/ ( listen), commonly referred to as Nigeria, is a federal republic in West Africa, bordering Benin in the west, Chad and Cameroon in the east, and Niger in the north. Its coast in the south lies on the Gulf of Guinea in the Atlantic Ocean. It comprises 36 states and the Federal Capital Territory, where the capital, Abuja is located. Nigeria is officially a democratic secular country.[6] | Question: where is nigeria located in the map of africa, Answer: Nigeria The Federal Republic of Nigeria /naɪˈdʒɪəriə/ ( listen), commonly referred to as Nigeria, is a federal republic in West Africa, bordering Benin in the west, Chad and Cameroon in the east, and Niger in the north. Its coast in the south lies on the Gulf of Guinea in the Atlantic Ocean. It comprises 36 states and the Federal Capital Territory, where the capital, Abuja is located. Nigeria is officially a democratic secular country.[6] | 1 |
| Question: who was the last person to live in biltmore house, Answer: Biltmore Estate After the divorce of the Cecils in 1934, Cornelia left the estate never to return; however, John Cecil maintained his residence in the Bachelors' Wing until his death in 1954. Their eldest son George Henry Vanderbilt Cecil, occupied rooms in the wing until 1956. At this point Biltmore House ceased to be a family residence and has continued to be operated as a historic house museum. | [{'from': 'human', 'value': 'What is the charge associated with the electron cloud, and how does it relate to the overall charge of an atom?'}, {'from': 'gpt', 'value': "The electron cloud in an atom carries a negative charge. This charge is balanced by the positive charge of the atomic nucleus, which results in a neutral atom when both charges are equal. If there is an excess or deficiency of electrons, the atom becomes an ion and carries a net positive or negative charge.\n\nIt is worth noting that the historical convention of assigning electrons a negative charge can sometimes lead to confusion. If we were to start anew, it might be logical to designate electrons as positively charged and the nucleus as negatively charged. However, it's important to remember this convention when performing electronic calculations to avoid errors due to incorrect charge assignments."}] | 0 |
| [{'from': 'human', 'value': 'Solve the following math problem step-by-step.\nSimplify your answer as much as possible. Present your final answer as \boxed{Your Answer}.\nJames buys jars for his honey. He has 5 hives that each produce 20 liters of honey. Each jar can hold .5 liters. How many jars will he need to buy if his friend is bringing his own jars for half the honey?'}, {'from': 'gpt', 'value': "Step 1: First, we need to determine the total amount of honey produced by James' 5 hives. Each hive produces 20 liters of honey, so the total honey produced is 5 * 20 = 100 liters.\nStep 2: Since his friend is bringing his own jars for half of the honey, James will only keep half of the total honey produced, which is 100 / 2 = 50 liters.\nStep 3: Now, we need to find out how many jars James needs to hold 50 liters of honey, given that each jar can hold 0.5 liters.\nStep 4: To calculate the number of jars needed, we can simply divide the total amount of honey (50 liters) by the cap... | Question: what does ds stand for in uk police, Answer: Criminal investigation department In the United Kingdom, smaller police stations usually have more uniformed officers than CID officers, typically five DCs with a detective sergeant (DS) in overall command. In larger stations many DCs, DSs and detective inspectors are present under the overall responsibility of a detective chief inspector. | 0 |
Loss: ContrastiveTensionLoss

Non-default hyperparameters:
- per_device_train_batch_size: 15
- per_device_eval_batch_size: 15
- num_train_epochs: 1
- multi_dataset_batch_sampler: round_robin

All hyperparameters:
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: no
- prediction_loss_only: True
- per_device_train_batch_size: 15
- per_device_eval_batch_size: 15
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: round_robin

Training logs:

| Epoch | Step | Training Loss |
|---|---|---|
| 0.125 | 500 | 13.9952 |
| 0.25 | 1000 | 11.0821 |
| 0.375 | 1500 | 9.5046 |
| 0.5 | 2000 | 9.6757 |
| 0.625 | 2500 | 9.4813 |
| 0.75 | 3000 | 9.522 |
| 0.875 | 3500 | 9.6432 |
| 1.0 | 4000 | 9.6282 |
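The model was trained with ContrastiveTensionLoss (Carlsson et al., 2021, cited below). As a sketch of the core idea rather than the library's exact implementation: two encoders embed a sentence pair, and a binary cross-entropy loss on the sigmoid of their dot product pulls identical pairs (label 1) together and pushes random pairs (label 0) apart, matching the 1/0 labels in the training samples above. Toy vectors stand in for the two encoders' outputs:

```python
import numpy as np

def ct_loss(emb_a, emb_b, label):
    """Contrastive Tension objective for one pair (illustrative sketch):
    binary cross-entropy on the sigmoid of the dot product."""
    logit = float(np.dot(emb_a, emb_b))
    prob = 1.0 / (1.0 + np.exp(-logit))  # sigmoid
    return -(label * np.log(prob) + (1 - label) * np.log(1 - prob))

# Identical pair labeled 1: dot product is large, loss is small.
identical = ct_loss(np.array([0.9, 0.1]), np.array([0.9, 0.1]), 1)

# Unrelated pair labeled 0: dot product is negative, loss is small.
random_pair = ct_loss(np.array([0.9, 0.1]), np.array([-0.8, 0.2]), 0)

# Identical pair wrongly labeled 0: loss is large, so training pushes
# the embeddings apart only when the label says it should.
mismatch = ct_loss(np.array([0.9, 0.1]), np.array([0.9, 0.1]), 0)
```

In the actual ContrastiveTensionLoss, the two encoders start as copies of the same model, and only one of them is kept after training.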
    @inproceedings{reimers-2019-sentence-bert,
        title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
        author = "Reimers, Nils and Gurevych, Iryna",
        booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
        month = "11",
        year = "2019",
        publisher = "Association for Computational Linguistics",
        url = "https://arxiv.org/abs/1908.10084",
    }

    @inproceedings{carlsson2021semantic,
        title = {Semantic Re-tuning with Contrastive Tension},
        author = {Fredrik Carlsson and Amaru Cuba Gyllensten and Evangelia Gogoulou and Erik Ylip{\"a}{\"a} Hellqvist and Magnus Sahlgren},
        booktitle = {International Conference on Learning Representations},
        year = {2021},
        url = {https://openreview.net/forum?id=Ov_sMNau-PF}
    }
Base model: sucharush/e5_stem_finetuned