Paper: Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks (arXiv:1908.10084)
This is a sentence-transformers model finetuned from sentence-transformers/distiluse-base-multilingual-cased-v1. It maps sentences & paragraphs to a 512-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: DistilBertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Dense({'in_features': 768, 'out_features': 512, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'})
)
```
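The three stages above (token embeddings from DistilBERT, mask-aware mean pooling, and a Tanh-activated dense projection from 768 to 512 dimensions) can be sketched in plain PyTorch. This is a minimal illustration with random stand-in tensors, not the model's actual weights:

```python
import torch

# Random stand-ins for one batch of token embeddings: (batch, seq_len, 768)
token_embeddings = torch.randn(2, 128, 768)
# Attention mask: 1 for real tokens, 0 for padding
attention_mask = torch.ones(2, 128, dtype=torch.long)
attention_mask[1, 100:] = 0  # second sequence is padded after 100 tokens

# Mask-aware mean pooling (pooling_mode_mean_tokens): average only real tokens
mask = attention_mask.unsqueeze(-1).float()      # (2, 128, 1)
summed = (token_embeddings * mask).sum(dim=1)    # (2, 768)
counts = mask.sum(dim=1).clamp(min=1e-9)         # (2, 1)
mean_pooled = summed / counts                    # (2, 768)

# Dense projection with Tanh activation (768 -> 512); untrained weights here
dense = torch.nn.Sequential(torch.nn.Linear(768, 512), torch.nn.Tanh())
sentence_embeddings = dense(mean_pooled)
print(sentence_embeddings.shape)  # torch.Size([2, 512])
```

The same computation is what the `Pooling` and `Dense` modules in the listing perform on the transformer's output.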
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'EDUCATION\nBA in Management Information Systems\nDuy Tan University (2021 - Expected completion 05/2025)\nGPA: 3.6/4.0\nTECHNICAL SKILLS\nFrontend:\n\nLanguages & Frameworks: HTML, CSS, JavaScript, TypeScript\nLibraries & Tools: TailwindCSS, React.js, Next.js\nBackend:\n\nMain Framework: NestJS, ExpressJS\nDatabase & ORM: PostgreSQL, MongoDB, TypeOrm, Mongoose\nCloud Services: AWS, Elasticsearch\n\nMessaging & Streaming: KafkaJS, WebSocket\n\nContainer & Deployment: Docker\n\nOther Tools:\n\nVersion Control: Git, GitHub\nCI/CD: Vercel\nPERSONAL PROJECTS\nNestgres\nGitHub Repository: Learning Project with NestJS and PostgreSQL. Integrate AWS S3 for file storage, Docker for containerization, and Elasticsearch for advanced search. This project helped me gain a deep understanding of building a scalable backend system and handling big data.\n\nSimple Todo\nLive Demo: A simple to-do list application to reinforce my knowledge of React, including component architecture, state management, and efficient rendering.\n\nNestactube\nGitHub Repository: A full-stack video platform combining NestJS and React, serving video streaming from backend to frontend. Focusing on handling large media files and ensuring smooth video playback.\n\nMindForge\nGitHub Repository - Live Demo: Clone of Notion with note creation and editing features. Using Convex for secure data storage and Clerk for user authentication. The project helped me develop a friendly interface and handle complex data structures.\n\nAWARDS & ACHIEVEMENTS\nExcellent Academic Performance - 2022, 2023\nBoeing Scholarship - 2022, 2023\nThird Prize - Duy Tan Informatics Competition, 2023\nThird Prize - Informatics Olympiad (non-specialist group), 2023\nCERTIFICATIONS\nFoundations of User Experience (UX) Design',
    "Skills and Qualifications Required:\n\nQualifications:\n\nBachelor's degree in Computer Science, Computer Networking or related fields from Universities such as University of Science, University of Natural Sciences, University of Information Technology, Vietnam National University, Ho Chi Minh City, or Vietnam National University, Hanoi.\nOnly candidates with academic background and practical experience directly related to Information Technology will be considered (candidates from short-term or non-major programs will not be accepted).\nGrade Point Average (GPA):\n\nMinimum GPA: 7.0 (on a scale of 10) or 2.8 (on a scale of 4)\n\nTechnical Skills:\n\nStrong programming skills with C/C++, along with knowledge of object-oriented programming.\nHave practical experience (1 year or more) working with C/C++.\nBasic knowledge of operating systems such as Windows, Linux, and MacOS.\nUnderstanding of network protocols and security principles. Strong team working skills and problem solving ability.",
    "Qualifications:\n\nExpected Skills:\n\nProgramming Languages: Proficiency in C# and experience working with Unity3D.\nGame Development: Solid understanding of game mechanics, UI/UX, and physics.\nCybersecurity Tools: Experience with cyber security tools with a defensive/offensive mindset.\nPerformance Optimization: Solid skills in game optimization and memory management.\nVersion Management Knowledge: Familiarity with Git or similar version control systems.\nPreferred Skills:\n\nAR/VR development experience is a plus.\nKnowledge of multi-player and networking concepts, along with creativity.\nEducation:\n\nBachelor's degree in Computer Science, Game Development, or related field, or equivalent hands-on experience.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 512]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
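For this model, `model.similarity` defaults to cosine similarity, so the score matrix above can also be reproduced directly from the embeddings. A minimal sketch with random stand-in vectors (real embeddings would come from `model.encode`):

```python
import numpy as np

def cosine_similarity_matrix(embeddings: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity: normalize rows, then take dot products."""
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    return normed @ normed.T

# Stand-in for three 512-dimensional sentence embeddings
rng = np.random.default_rng(0)
embeddings = rng.standard_normal((3, 512))

similarities = cosine_similarity_matrix(embeddings)
print(similarities.shape)  # (3, 3)
# Each vector is maximally similar to itself, so the diagonal is all 1.0
print(np.allclose(np.diag(similarities), 1.0))  # True
```

Higher off-diagonal values indicate semantically closer texts, which is what makes the embeddings usable for semantic search and paraphrase mining.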
Training dataset columns: sentence_0, sentence_1, and label

| | sentence_0 | sentence_1 | label |
|---|---|---|---|
| type | string | string | float |
| details | | | |

Samples:

| sentence_0 | sentence_1 | label |
|---|---|---|
| EDUCATION | Professional Qualifications: | 0.65 |
| EDUCATION HISTORY University of Roseton Master of Science in Software Engineering Graduated: 2020 Best Thesis Awardee Berou Solutions Scholarship Recipient De Loureigh University Bachelor of Science in Computer Science Graduated: 2016 (Cum Laude) Founder of DLU Programming Club Hackathon Champion Beechtown 2015 RELEVANT SKILLS Programming Languages: JavaScript, C/C++, Java, Python, Kotlin, Go Core Skills: Problem Solving, Team Communication | EDUCATION Bachelor of Science in Computer Science Rutgers University - New Brunswick, NJ 2008 - 2012 SKILLS HTML CSS JavaScript React jQuery Angular.js Vue.js Enzyme Jest Git | |
| ABBANK | | 0.62 |
Loss: CosineSimilarityLoss with these parameters:

```json
{
    "loss_fct": "torch.nn.modules.loss.MSELoss"
}
```
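With `MSELoss` as the inner `loss_fct`, `CosineSimilarityLoss` computes the cosine similarity of each embedding pair and regresses it onto the gold score: `loss = MSE(cos_sim(u, v), label)`. A self-contained sketch of that computation, with random tensors standing in for the two encoded sentences:

```python
import torch
import torch.nn.functional as F

# Stand-ins for a batch of embedding pairs from the two siamese towers
u = torch.randn(16, 512)
v = torch.randn(16, 512)
labels = torch.rand(16)  # gold similarity scores in [0, 1]

# Cosine similarity of each pair, regressed onto the label with MSE
cos_scores = F.cosine_similarity(u, v, dim=1)  # shape (16,)
loss = F.mse_loss(cos_scores, labels)
print(loss.item())
```

Minimizing this loss pushes embedding pairs with high gold scores toward high cosine similarity and dissimilar pairs apart.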
Non-default hyperparameters:

```
per_device_train_batch_size: 16
per_device_eval_batch_size: 16
multi_dataset_batch_sampler: round_robin
```

All hyperparameters:

```
overwrite_output_dir: False
do_predict: False
eval_strategy: no
prediction_loss_only: True
per_device_train_batch_size: 16
per_device_eval_batch_size: 16
per_gpu_train_batch_size: None
per_gpu_eval_batch_size: None
gradient_accumulation_steps: 1
eval_accumulation_steps: None
torch_empty_cache_steps: None
learning_rate: 5e-05
weight_decay: 0.0
adam_beta1: 0.9
adam_beta2: 0.999
adam_epsilon: 1e-08
max_grad_norm: 1
num_train_epochs: 3
max_steps: -1
lr_scheduler_type: linear
lr_scheduler_kwargs: {}
warmup_ratio: 0.0
warmup_steps: 0
log_level: passive
log_level_replica: warning
log_on_each_node: True
logging_nan_inf_filter: True
save_safetensors: True
save_on_each_node: False
save_only_model: False
restore_callback_states_from_checkpoint: False
no_cuda: False
use_cpu: False
use_mps_device: False
seed: 42
data_seed: None
jit_mode_eval: False
use_ipex: False
bf16: False
fp16: False
fp16_opt_level: O1
half_precision_backend: auto
bf16_full_eval: False
fp16_full_eval: False
tf32: None
local_rank: 0
ddp_backend: None
tpu_num_cores: None
tpu_metrics_debug: False
debug: []
dataloader_drop_last: False
dataloader_num_workers: 0
dataloader_prefetch_factor: None
past_index: -1
disable_tqdm: False
remove_unused_columns: True
label_names: None
load_best_model_at_end: False
ignore_data_skip: False
fsdp: []
fsdp_min_num_params: 0
fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
fsdp_transformer_layer_cls_to_wrap: None
accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
deepspeed: None
label_smoothing_factor: 0.0
optim: adamw_torch
optim_args: None
adafactor: False
group_by_length: False
length_column_name: length
ddp_find_unused_parameters: None
ddp_bucket_cap_mb: None
ddp_broadcast_buffers: False
dataloader_pin_memory: True
dataloader_persistent_workers: False
skip_memory_metrics: True
use_legacy_prediction_loop: False
push_to_hub: False
resume_from_checkpoint: None
hub_model_id: None
hub_strategy: every_save
hub_private_repo: False
hub_always_push: False
gradient_checkpointing: False
gradient_checkpointing_kwargs: None
include_inputs_for_metrics: False
eval_do_concat_batches: True
fp16_backend: auto
push_to_hub_model_id: None
push_to_hub_organization: None
mp_parameters: 
auto_find_batch_size: False
full_determinism: False
torchdynamo: None
ray_scope: last
ddp_timeout: 1800
torch_compile: False
torch_compile_backend: None
torch_compile_mode: None
dispatch_batches: None
split_batches: None
include_tokens_per_second: False
include_num_input_tokens_seen: False
neftune_noise_alpha: None
optim_target_modules: None
batch_eval_metrics: False
eval_on_start: False
eval_use_gather_object: False
batch_sampler: batch_sampler
multi_dataset_batch_sampler: round_robin
```

| Epoch | Step | Training Loss |
|---|---|---|
| 1.5625 | 500 | 0.0023 |
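Assuming the logged step/epoch ratio is exact, the row above also pins down the approximate size of the training set:

```python
# Epoch 1.5625 at step 500 implies 500 / 1.5625 = 320 optimizer steps per epoch.
steps_per_epoch = 500 / 1.5625
# With per_device_train_batch_size = 16, that is about 320 * 16 = 5120 training pairs.
approx_train_pairs = steps_per_epoch * 16
print(steps_per_epoch, approx_train_pairs)  # 320.0 5120.0
```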
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```