Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks

Paper: arXiv:1908.10084
This is a sentence-transformers model finetuned from sucharush/e5_stem_finetuned. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Full model architecture:

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
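Module (1) uses mean pooling (`pooling_mode_mean_tokens: True`): token embeddings are averaged over the non-padding positions indicated by the attention mask. A minimal numpy sketch of that operation, using hypothetical toy shapes rather than the real 768-dimensional BERT outputs:

```python
import numpy as np

# Toy token embeddings: 2 sequences, 4 tokens each, 3 dims
# (hypothetical shapes; the real model produces 768-dim BERT outputs).
token_embeddings = np.arange(24, dtype=np.float64).reshape(2, 4, 3)
# Attention mask: 1 for real tokens, 0 for padding.
attention_mask = np.array([[1, 1, 1, 0],
                           [1, 1, 0, 0]])

def mean_pool(token_embeddings, attention_mask):
    """Average token embeddings over non-padding positions only."""
    mask = attention_mask[..., None].astype(token_embeddings.dtype)  # (batch, seq, 1)
    summed = (token_embeddings * mask).sum(axis=1)                   # (batch, dim)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)                   # avoid divide-by-zero
    return summed / counts

pooled = mean_pool(token_embeddings, attention_mask)
print(pooled.shape)  # (2, 3)
```

Padding tokens are excluded from both the sum and the count, so sentences of different lengths pool correctly within one batch.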
First install the Sentence Transformers library:
```
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Stergios-Konstantinidis/MNLP_M3_tokenizer_tuned_pos_neg_2")

# Run inference
sentences = [
    "Question: who won the 2002 us open men's tennis championship, Answer: 2002 US Open (tennis) Both Lleyton Hewitt and Venus Williams were unsuccessful in their title defences, Hewitt being defeated in the semifinals by Andre Agassi and Venus being defeated in the final by her younger sister Serena. It was the third of four consecutive Grand Slam titles won by Serena, all won by defeating Venus in the final. Pete Sampras, runner-up in 2001, won his fifth US Open title, and his 14th and final Grand Slam title, defeating his rival Agassi in the final in four sets.",
    'Question: who played the artful dodger in the original oliver, Answer: Jack Wild Jack Wild (30 September 1952 – 1 March 2006) was an English actor and singer, best known for his debut role as the Artful Dodger in Oliver!, (1968) for which he received an Academy Award nomination for Best Supporting Actor as well as Golden Globe and BAFTA nominations.',
    'Question: what do the pilgrims have to do with thanksgiving, Answer: Thanksgiving (United States) The event that Americans commonly call the "First Thanksgiving" was celebrated by the Pilgrims after their first harvest in the New World in October 1621.[5] This feast lasted three days, and—as accounted by attendee Edward Winslow[6]—it was attended by 90 Native Americans and 53 Pilgrims.[7] The New England colonists were accustomed to regularly celebrating "thanksgivings"—days of prayer thanking God for blessings such as military victory or the end of a drought.[8]',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
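Because the model ends with a `Normalize()` module, every embedding has unit L2 norm, so cosine similarity reduces to a plain dot product. A sketch with random stand-in vectors (hypothetical data in place of real `model.encode()` output):

```python
import numpy as np

# Hypothetical unit-normalized embeddings standing in for model.encode() output;
# the real model's Normalize() module guarantees unit L2 norm.
rng = np.random.default_rng(0)
emb = rng.normal(size=(3, 768))
emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)

# With unit-length rows, cosine similarity is just the matrix of dot products.
similarities = emb @ emb.T
print(similarities.shape)  # (3, 3)
# The diagonal is 1.0: each vector compared with itself.
```

This is why normalized sentence embeddings work directly with dot-product vector indexes (e.g. for semantic search) without a separate cosine step.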
Training dataset columns: `sentence_0`, `sentence_1`, and `label`.

| | sentence_0 | sentence_1 | label |
|---|---|---|---|
| type | string | string | int |

Samples:

| sentence_0 | sentence_1 | label |
|---|---|---|
| [{'from': 'human', 'value': 'How do Cepheid variables serve as tools for measuring astronomical distances?'}, {'from': 'gpt', 'value': 'Cepheid variables are stars that exhibit a direct relationship between their pulsation period and intrinsic luminosity. This property allows them to function as "standard candles" in astronomy. By observing the pulsation period of a Cepheid variable, we can determine its luminosity. Since the brightness of a star decreases with increasing distance, we can use the known luminosity of Cepheid variables to calculate the distance to these stars and, by extension, other celestial objects. This method is crucial for estimating distances within our galaxy and even in neighboring galaxies.'}] | [{'from': 'human', 'value': 'How do Cepheid variables serve as tools for measuring astronomical distances?'}, {'from': 'gpt', 'value': 'Cepheid variables are stars that exhibit a direct relationship between their pulsation period and intrinsic luminosity. This property allows them to function as "standard candles" in astronomy. By observing the pulsation period of a Cepheid variable, we can determine its luminosity. Since the brightness of a star decreases with increasing distance, we can use the known luminosity of Cepheid variables to calculate the distance to these stars and, by extension, other celestial objects. This method is crucial for estimating distances within our galaxy and even in neighboring galaxies.'}] | 1 |
| [{'from': 'human', 'value': 'If a current of 5 amperes (A) passing through a circuit generates 435 watts (W) of power, what is the resistance of the circuit?'}, {'from': 'gpt', 'value': 'The formula for power (P) in an electrical circuit is:\n\nP = I^2 * R\n\nwhere:\n\nP is power in watts (W)\nI is current in amperes (A)\nR is resistance in ohms (Ω)\n\nWe are given that the current (I) is 5 A and the power (P) is 435 W. We can use the formula to find the resistance (R):\n\nR = P / I^2\nR = 435 W / 5 A^2\nR = 17.4 Ω\n\nTherefore, the resistance of the circuit is 17.4 ohms.\n####\nThe resistance of the circuit is 17.4 ohms (Ω).'}] | Question: where does the last name anthony come from, Answer: Anthony (given name) Anthony or Antony is a masculine given name, derived from the Antonii, a gens (Roman family name) to which Mark Antony (Marcus Antonius) belonged. There was an ancient tradition that the Antonii gens were Heracleidae, being descendants of Anton, a son of Heracles. (Plut. Ant. 4) | 0 |
| Question: where did skin in the game come from, Answer: Skin in the game (phrase) The origin of the phrase is unknown.[1] | Question: where did skin in the game come from, Answer: Skin in the game (phrase) The origin of the phrase is unknown.[1] | 1 |
Loss: `ContrastiveTensionLoss`

Non-default hyperparameters:
- per_device_train_batch_size: 14
- per_device_eval_batch_size: 14
- num_train_epochs: 1
- multi_dataset_batch_sampler: round_robin

All hyperparameters:
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: no
- prediction_loss_only: True
- per_device_train_batch_size: 14
- per_device_eval_batch_size: 14
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: round_robin

Training logs:

| Epoch | Step | Training Loss |
|---|---|---|
| 0.1050 | 500 | 11.0503 |
| 0.2100 | 1000 | 10.1883 |
| 0.3151 | 1500 | 9.7122 |
| 0.4201 | 2000 | 9.71 |
| 0.5251 | 2500 | 9.7154 |
| 0.6301 | 3000 | 9.7117 |
| 0.7351 | 3500 | 9.706 |
| 0.8402 | 4000 | 9.7122 |
| 0.9452 | 4500 | 9.7079 |
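`ContrastiveTensionLoss` (Carlsson et al., 2021) trains two independent copies of the encoder: one embeds `sentence_0`, the other `sentence_1`, and the dot product of the two embeddings is treated as a logit for binary cross-entropy, where label 1 marks an identical sentence pair and label 0 a random pair (matching the samples above). A minimal numpy sketch of the per-pair loss, with hypothetical stand-in embeddings rather than real encoder outputs:

```python
import numpy as np

def bce_with_logits(logit, label):
    """Numerically stable binary cross-entropy on a raw dot-product score:
    max(x, 0) - x*label + log(1 + exp(-|x|))."""
    return np.maximum(logit, 0.0) - logit * label + np.log1p(np.exp(-np.abs(logit)))

# Hypothetical embeddings from the two independent encoder copies
# (model1 for sentence_0, model2 for sentence_1).
rng = np.random.default_rng(42)
e0 = rng.normal(size=8)                   # model1(sentence_0)
e1_pos = e0 + 0.01 * rng.normal(size=8)   # model2 of the identical sentence
e1_neg = rng.normal(size=8)               # model2 of a random sentence

# Training pushes identical pairs (label 1) toward high scores
# and random pairs (label 0) toward low scores.
loss_pos = bce_with_logits(e0 @ e1_pos, 1.0)
loss_neg = bce_with_logits(e0 @ e1_neg, 0.0)
```

This explains the dataset construction above: each identical pair with label 1 is contrasted against randomly sampled pairs with label 0, and the two encoder copies gradually agree on a useful embedding space.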
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

```bibtex
@inproceedings{carlsson2021semantic,
    title={Semantic Re-tuning with Contrastive Tension},
    author={Fredrik Carlsson and Amaru Cuba Gyllensten and Evangelia Gogoulou and Erik Ylip{\"a}{\"a} Hellqvist and Magnus Sahlgren},
    booktitle={International Conference on Learning Representations},
    year={2021},
    url={https://openreview.net/forum?id=Ov_sMNau-PF}
}
```
Base model
sucharush/e5_stem_finetuned