# SentenceTransformer based on google/embeddinggemma-300m
This is a sentence-transformers model finetuned from google/embeddinggemma-300m. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
The full model architecture:

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 2048, 'do_lower_case': False, 'architecture': 'Gemma3TextModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Dense({'in_features': 768, 'out_features': 3072, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
  (3): Dense({'in_features': 3072, 'out_features': 768, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
  (4): Normalize()
)
```
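As a sanity check on the module stack, the pipeline above (mean pooling, then two bias-free Dense projections with identity activations, then L2 normalization) can be sketched in plain NumPy. The random weights here are stand-ins for the trained checkpoint, not the real parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the trained Dense weights (bias=False, Identity activation);
# the real values live in the model checkpoint.
W_up = rng.normal(scale=0.02, size=(768, 3072))    # module (2): 768 -> 3072
W_down = rng.normal(scale=0.02, size=(3072, 768))  # module (3): 3072 -> 768

def embed(token_states):
    """token_states: (seq_len, 768) transformer hidden states for one text."""
    pooled = token_states.mean(axis=0)   # module (1): mean pooling over tokens
    x = pooled @ W_up @ W_down           # modules (2)+(3): bottleneck projections
    return x / np.linalg.norm(x)         # module (4): Normalize() -> unit length

vec = embed(rng.normal(size=(12, 768)))  # 12 toy token states
print(vec.shape)  # (768,)
```

Because of the final `Normalize()` step, every embedding has unit length, which is why cosine similarity and dot product coincide for this model.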
First install the Sentence Transformers library:

```shell
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("ndsanjana/embedgemma_ns")

# Run inference
queries = [
    "Theme: Ethics of de\u2011extinction and scientific responsibility, Human ambition versus natural limits, Emergence of higher intelligence in extinct species, Corporate militarization of biological research, Coexistence and harmony between ancient and modern life forms",
]
documents = [
    'Theme: The ethical limits of scientific ambition, The moral implications of resurrecting extinct species, The clash between corporate exploitation and scientific integrity, The unexpected cognitive complexity of prehistoric life, The possibility of coexistence between past and present ecosystems',
    'Actions: Dr. Sarah Chen extracts viable DNA from a Triceratops fossil. -> She creates the first living dinosaur in 65 million years, nicknamed Trinity. -> The creature is publicly revealed, sparking global debate on de‑extinction ethics. -> Trinity exhibits unexpected higher intelligence. -> Biotech magnate Marcus Voss attempts to weaponize the research for military use. -> A confrontation occurs at the research facility. -> Trinity escapes into the nearby wilderness and encounters modern wildlife. -> Dr. Chen decides to destroy her research data to prevent further exploitation. -> Trinity disappears into a remote forest preserve. -> Final scene shows Trinity peacefully coexisting with a herd of elk.',
    '85_theme_vs_action',
]
query_embeddings = model.encode_query(queries)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# [1, 768] [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# tensor([[ 0.7758, 0.1831, -0.0576]])
```
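`model.similarity` defaults to cosine similarity for this model, and since the `Normalize()` module already emits unit-length vectors, the score reduces to a dot product. A minimal NumPy equivalent, using toy 2-D vectors rather than real 768-dimensional embeddings:

```python
import numpy as np

def cos_sim(a, b):
    """Row-wise cosine similarity matrix, analogous to model.similarity(...)
    with the default "cos_sim" similarity function."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

# Toy 2-D stand-ins for query/document embeddings.
q = np.array([[1.0, 0.0]])
d = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]])
print(cos_sim(q, d))  # identical, orthogonal, and opposite vectors
```

For unit-norm inputs the two normalization lines are no-ops, so `query_embeddings @ document_embeddings.T` would give the same scores.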
The training dataset has five columns: anchor, positive, negative, triplet_id, and source.

| | anchor | positive | negative | triplet_id | source |
|---|---|---|---|---|---|
| type | string | string | string | string | string |
| anchor | positive | negative | triplet_id | source |
|---|---|---|---|---|
| Theme: inheritance, haunted house, supernatural, grief and loss, revenge, family dynamics, possession, exorcism, unresolved trauma, moral choice | Theme: Inheritance of legacy and the weight of family history, Supernatural haunting as a manifestation of unresolved trauma, The conflict between self-preservation and compassion, The cyclical nature of guilt and the desire for redemption, The tension between rational action and inexplicable forces | Theme: grief and avoidance, emotional healing, isolation and its psychological effects, responsibility toward family, the interplay between scientific curiosity and personal emotion, the reflective power of nature, guilt and unresolved conflict | 0_theme_cross | unknown |
| Actions: Family moves into inherited Victorian mansion -> Strange occurrences begin immediately -> Teenage daughter becomes primary target of supernatural activity -> Family researches property’s past and learns about reclusive widow and lost daughter -> Paranormal events intensify, threatening family safety -> Father attempts exorcism using items from hidden basement -> Exorcism angers the entity further -> Mother faces a critical choice: flee or help the spirit find peace by reuniting her with her daughter's remains -> Mother chooses to help the spirit | Actions: A newlywed couple inherits a sprawling ranch house in the desert from an estranged uncle. -> From the first night, bizarre phenomena (whispers, self-opening doors, sudden cold rooms) plague the household. -> The wife becomes the focal point of the disturbances, experiencing terrifying visions and speaking in unfamiliar voices. -> The couple investigates the property's history and learns that the former owner, an elderly hermit, died under suspicious circumstances after his young son accidentally died on the grounds. -> They discover that the hermit's ghost is desperately seeking someone to take his boy's place. -> The husband attempts to banish the spirit using ritual objects found in a concealed cellar. -> The ritual backfires, provoking the entity to greater violence and intensifying the supernatural assault. -> During the final confrontation, the wife faces an impossible decision: escape with her husband to safety or help the anguished ghost locate his son's hidden grave to... | Actions: Marine biologist accepts a research position at an isolated underwater station studying deep‑sea thermal vents. -> She leaves behind her estranged teenage son, who blames her for his father's recent death. -> During her six‑month assignment she discovers unusual bioluminescent organisms that respond to human emotions and memories. -> She spends more time observing the creatures, which triggers vivid recollections of her late husband and the unresolved guilt surrounding their final argument before his fatal accident. -> The organisms feed on her emotional energy, growing brighter and more active as her psychological state deteriorates. -> Her research partner becomes concerned about her erratic behavior and threatens to abort the mission. -> She realizes that her obsession with the creatures is a way of avoiding her grief and responsibility to her son. -> In the final act, she chooses to surface early and return home, accepting that healing requires facing her loss rather than ... | 0_action_cross | unknown |
| Outcomes: The mother’s decision to reunite the widow’s daughter’s remains brings peace to the spirit, ending the haunting. The family remains safe and can continue living in the house. | Outcomes: The story concludes with the wife's decision, leaving the haunting either unresolved if they escape or potentially resolved if they help the ghost find the grave. The final state is ambiguous, reflecting the unresolved tension between survival and compassion. | Outcomes: She returns home, confronts her grief and responsibility toward her son, and begins the process of healing. | 0_outcome_cross | unknown |
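A hypothetical sketch of how rows matching this five-column schema can be assembled and validated before training; the strings are abbreviated from the samples above:

```python
# Hypothetical rows following the triplet schema (anchor, positive,
# negative, triplet_id, source) used by the training dataset.
triplets = [
    {
        "anchor": "Theme: inheritance, haunted house, supernatural, grief and loss",
        "positive": "Theme: Inheritance of legacy and the weight of family history",
        "negative": "Theme: grief and avoidance, emotional healing, isolation",
        "triplet_id": "0_theme_cross",
        "source": "unknown",
    },
]

REQUIRED = {"anchor", "positive", "negative", "triplet_id", "source"}
for row in triplets:
    missing = REQUIRED - set(row)
    assert not missing, f"row {row['triplet_id']} missing columns: {missing}"
print(f"{len(triplets)} row(s) match the schema")
```

Each anchor pairs a short theme/action/outcome description with a semantically matching positive and a thematically unrelated negative from the same category.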
The model was trained with MultipleNegativesRankingLoss with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim",
    "gather_across_devices": false
}
```
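MultipleNegativesRankingLoss treats every other positive in the batch as a negative for a given anchor: cosine similarities are scaled (here by 20.0) and fed to a cross-entropy over the batch. A NumPy sketch of that objective (the real implementation also appends the explicit negative column as extra candidates):

```python
import numpy as np

def mnr_loss(anchors, positives, scale=20.0):
    """In-batch-negatives ranking loss: for anchor i, positive i is the
    correct candidate among all positives in the batch (cross-entropy
    over scaled cosine similarities)."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    scores = scale * (a @ p.T)                  # (batch, batch) similarity matrix
    shifted = scores - scores.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))  # target for row i is column i

rng = np.random.default_rng(0)
anchors = rng.normal(size=(4, 8))
aligned = mnr_loss(anchors, anchors + 0.01 * rng.normal(size=(4, 8)))
random_pairs = mnr_loss(anchors, rng.normal(size=(4, 8)))
print(aligned < random_pairs)  # matched pairs score a much lower loss
```

The scale of 20.0 sharpens the softmax so that small cosine-similarity margins translate into strong gradients.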
Non-default hyperparameters:

- learning_rate: 2e-05
- num_train_epochs: 10
- warmup_ratio: 0.1
- fp16: True
- prompts: task: sentence similarity | query: 

All hyperparameters:

- overwrite_output_dir: False
- do_predict: False
- eval_strategy: no
- prediction_loss_only: True
- per_device_train_batch_size: 8
- per_device_eval_batch_size: 8
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 2e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 10
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- parallelism_config: None
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch_fused
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- project: huggingface
- trackio_space_id: trackio
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- hub_revision: None
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters: 
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: no
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- liger_kernel_config: None
- eval_use_gather_object: False
- average_tokens_across_devices: True
- prompts: task: sentence similarity | query: 
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
- router_mapping: {}
- learning_rate_mapping: {}

Training logs:

| Epoch | Step | Training Loss |
|---|---|---|
| 0.8 | 100 | 0.0664 |
| 1.6 | 200 | 0.017 |
| 2.4 | 300 | 0.018 |
| 3.2 | 400 | 0.005 |
| 4.0 | 500 | 0.026 |
| 4.8 | 600 | 0.0119 |
| 5.6 | 700 | 0.0083 |
| 6.4 | 800 | 0.0198 |
| 7.2 | 900 | 0.0217 |
| 8.0 | 1000 | 0.0123 |
| 8.8 | 1100 | 0.0174 |
| 9.6 | 1200 | 0.0112 |
To cite Sentence Transformers:

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
To cite MultipleNegativesRankingLoss:

```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
Base model: google/embeddinggemma-300m