This is a sentence-transformers model finetuned from dangvantuan/vietnamese-embedding. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
The full model architecture:

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False, 'architecture': 'RobertaModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
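The `Transformer` module produces per-token embeddings (768-dimensional, truncated at 256 tokens) and the `Pooling` module averages them (`pooling_mode_mean_tokens: True`). A minimal sketch of that computation with plain `transformers`, assuming the placeholder model id is replaced with the real checkpoint:

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Placeholder id, as in the usage example below
tokenizer = AutoTokenizer.from_pretrained("sentence_transformers_model_id")
encoder = AutoModel.from_pretrained("sentence_transformers_model_id")  # RobertaModel

encoded = tokenizer(["ví dụ"], padding=True, truncation=True,
                    max_length=256, return_tensors="pt")
with torch.no_grad():
    token_embeddings = encoder(**encoded).last_hidden_state  # (batch, seq_len, 768)

# Mean pooling over non-padding tokens, per pooling_mode_mean_tokens=True
mask = encoded["attention_mask"].unsqueeze(-1).float()
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
print(sentence_embedding.shape)  # torch.Size([1, 768])
```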
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")

# Run inference
sentences = [
    'Ông trùm nắm chìa_khoá 5.000 tỷ USD : Nhân_vật 2025 ? . ( Dân_trí ) - Nhận cuộc_gọi Trump nửa_đêm nắm đế_chế 5.000 tỷ USD , Jensen_Huang Nhân_vật 2025 . Bí_mật ẩn áo da quyền_lực canh_bạc AI tất ?',
    'Đưa_Nvidia chạm mốc 4.200 tỷ USD , CEO Jensen_Huang “ cày ” cỡ ? . ( Dân_trí ) - Để Nvidia thành công_ty đắt_giá thế_giới , Jensen_Huang đánh_đổi đời_sống thường_nhật : Không phim_ảnh , nghỉ , bộ_não “ tắt ” .',
    'Nợ 4,4 tỷ đồng , cụ U90 Trung_Quốc ròng_rã may áo suốt 10 . ( Dân_trí ) - Cụ Chen_Jinying cảm_phục ý_chí nghị_lực ròng_rã áo nợ suốt 10 .',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.6855, 0.1530],
#         [0.6855, 1.0000, 0.1226],
#         [0.1530, 0.1226, 1.0000]])
```
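Beyond pairwise similarity, the same embeddings support semantic search. A hedged sketch using `sentence_transformers.util.semantic_search`; the corpus and query strings are illustrative, and real inputs should be word-segmented like the training samples shown further below:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("sentence_transformers_model_id")

# Illustrative word-segmented Vietnamese corpus and query
corpus = [
    "Nvidia trở_thành công_ty giá_trị nhất thế_giới .",
    "VN-Index giảm điểm trong phiên chiều .",
]
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode("Jensen_Huang đưa Nvidia lên đỉnh", convert_to_tensor=True)

# One result list per query: [{'corpus_id': ..., 'score': ...}, ...]
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)
for hit in hits[0]:
    print(corpus[hit["corpus_id"]], round(hit["score"], 4))
```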
Information Retrieval evaluation (dataset: `ir_val`, evaluated with `InformationRetrievalEvaluator`):

| Metric | Value |
|---|---|
| cosine_accuracy@1 | 0.0005 |
| cosine_accuracy@3 | 0.4075 |
| cosine_accuracy@5 | 0.558 |
| cosine_accuracy@10 | 0.7111 |
| cosine_precision@1 | 0.0005 |
| cosine_precision@3 | 0.1637 |
| cosine_precision@5 | 0.1598 |
| cosine_precision@10 | 0.1217 |
| cosine_recall@1 | 0.0001 |
| cosine_recall@3 | 0.2533 |
| cosine_recall@5 | 0.3939 |
| cosine_recall@10 | 0.5643 |
| cosine_ndcg@10 | 0.3111 |
| cosine_mrr@10 | 0.2362 |
| cosine_map@10 | 0.1989 |
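These numbers were produced by an `InformationRetrievalEvaluator` named `ir_val`. A minimal sketch of how such an evaluation is wired up; the queries, corpus, and relevance judgments below are hypothetical placeholders, not the actual `ir_val` split:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("sentence_transformers_model_id")

# Hypothetical placeholder data
queries = {"q1": "Nvidia đạt mốc 4.000 tỷ USD"}
corpus = {
    "d1": "Nvidia trở_thành công_ty giá_trị nhất thế_giới .",
    "d2": "VN-Index giảm điểm trong phiên chiều .",
}
relevant_docs = {"q1": {"d1"}}  # query id -> set of relevant corpus ids

ir_evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="ir_val")
results = ir_evaluator(model)  # returns a dict of metrics in recent sentence-transformers versions
print(results["ir_val_cosine_ndcg@10"])
```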
Training dataset columns: `sentence_0` and `sentence_1`. Approximate statistics:

|  | sentence_0 | sentence_1 |
|---|---|---|
| type | string | string |
| details | | |

Samples:

| sentence_0 | sentence_1 |
|---|---|
| Đạo_diễn , NSND Thanh_Vân cay_đắng Hãng phim_truyện Việt_Nam . ( Dân_trí ) - Trước thảm_cảnh Hãng phim_truyện Việt_Nam , đạo_diễn Thanh_Vân PV Dân_trí : Tôi cay_đắng 30 tận_tâm cống_hiến , Hãng , nằm viện 100% viện_phí . | Đạo_diễn Luk_Vân áp_lực phim Công nữ Ngọc_Hoa . ( Dân_trí ) - Ngày 22/11 , đạo_diễn Luk_Vân công_bố dự_án Công nữ Ngọc_Hoa , phim_điện_ảnh hợp_tác Việt - Nhật . Ý_tưởng phim tình đẹp Công nữ Ngọc_Hoa thương_nhân Nhật_Bản Araki_Sotaro . |
| Bộ_trưởng Tài_chính : Việt_Nam cố_gắng đáp_ứng tiêu_chí nâng hạng FTSE . Việt_Nam nỗ_lực đáp_ứng tiêu_chí nâng hạng FTSE thông_qua cải_cách thuận_lợi dòng vốn đầu_tư nước_ngoài thị_trường , Bộ_trưởng Nguyễn_Văn_Thắng . | Chứng_khoán . VN-Index giằng_co quyết_liệt giao_dịch đột_ngột rớt 10 phiên khớp lệnh xác_định giá đóng_cửa . |
| Nhóm BTOB tái_ngộ khán_giả Việt . Nhóm nhạc Hàn_Quốc BTOB trở_lại sân_khấu TP HCM bảy , khuấy_động không_khí loạt hit , tối 31/10 . | Hanbin - chàng trai Việt toả thần_tượng Kpop . Hanbin ( Ngô_Ngọc_Hưng ) - 26 , quê Yên_Bái - fan đông_đảo nhạc Hàn_TEMPEST giọng hát , vũ_đạo nổi_bật . |
Loss: `MultipleNegativesRankingLoss` with these parameters:

```json
{
    "scale": 20.0,
    "similarity_fct": "cos_sim",
    "gather_across_devices": false
}
```
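A sketch of instantiating this loss with the listed parameters; `scale=20.0` and `util.cos_sim` match the values shown above, and each (sentence_0, sentence_1) pair serves as a positive while the other in-batch sentence_1 entries act as negatives:

```python
from sentence_transformers import SentenceTransformer, losses, util

model = SentenceTransformer("sentence_transformers_model_id")  # placeholder id

# scale and similarity_fct mirror the parameters listed above
loss = losses.MultipleNegativesRankingLoss(
    model,
    scale=20.0,
    similarity_fct=util.cos_sim,
)
```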
Non-default training hyperparameters:

```
eval_strategy: steps
per_device_train_batch_size: 32
per_device_eval_batch_size: 32
num_train_epochs: 5
fp16: True
multi_dataset_batch_sampler: round_robin
```

All hyperparameters:

```
overwrite_output_dir: False
do_predict: False
eval_strategy: steps
prediction_loss_only: True
per_device_train_batch_size: 32
per_device_eval_batch_size: 32
per_gpu_train_batch_size: None
per_gpu_eval_batch_size: None
gradient_accumulation_steps: 1
eval_accumulation_steps: None
torch_empty_cache_steps: None
learning_rate: 5e-05
weight_decay: 0.0
adam_beta1: 0.9
adam_beta2: 0.999
adam_epsilon: 1e-08
max_grad_norm: 1
num_train_epochs: 5
max_steps: -1
lr_scheduler_type: linear
lr_scheduler_kwargs: {}
warmup_ratio: 0.0
warmup_steps: 0
log_level: passive
log_level_replica: warning
log_on_each_node: True
logging_nan_inf_filter: True
save_safetensors: True
save_on_each_node: False
save_only_model: False
restore_callback_states_from_checkpoint: False
no_cuda: False
use_cpu: False
use_mps_device: False
seed: 42
data_seed: None
jit_mode_eval: False
bf16: False
fp16: True
fp16_opt_level: O1
half_precision_backend: auto
bf16_full_eval: False
fp16_full_eval: False
tf32: None
local_rank: 0
ddp_backend: None
tpu_num_cores: None
tpu_metrics_debug: False
debug: []
dataloader_drop_last: False
dataloader_num_workers: 0
dataloader_prefetch_factor: None
past_index: -1
disable_tqdm: False
remove_unused_columns: True
label_names: None
load_best_model_at_end: False
ignore_data_skip: False
fsdp: []
fsdp_min_num_params: 0
fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
fsdp_transformer_layer_cls_to_wrap: None
accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
parallelism_config: None
deepspeed: None
label_smoothing_factor: 0.0
optim: adamw_torch_fused
optim_args: None
adafactor: False
group_by_length: False
length_column_name: length
project: huggingface
trackio_space_id: trackio
ddp_find_unused_parameters: None
ddp_bucket_cap_mb: None
ddp_broadcast_buffers: False
dataloader_pin_memory: True
dataloader_persistent_workers: False
skip_memory_metrics: True
use_legacy_prediction_loop: False
push_to_hub: False
resume_from_checkpoint: None
hub_model_id: None
hub_strategy: every_save
hub_private_repo: None
hub_always_push: False
hub_revision: None
gradient_checkpointing: False
gradient_checkpointing_kwargs: None
include_inputs_for_metrics: False
include_for_metrics: []
eval_do_concat_batches: True
fp16_backend: auto
push_to_hub_model_id: None
push_to_hub_organization: None
mp_parameters:
auto_find_batch_size: False
full_determinism: False
torchdynamo: None
ray_scope: last
ddp_timeout: 1800
torch_compile: False
torch_compile_backend: None
torch_compile_mode: None
include_tokens_per_second: False
include_num_input_tokens_seen: no
neftune_noise_alpha: None
optim_target_modules: None
batch_eval_metrics: False
eval_on_start: False
use_liger_kernel: False
liger_kernel_config: None
eval_use_gather_object: False
average_tokens_across_devices: True
prompts: None
batch_sampler: batch_sampler
multi_dataset_batch_sampler: round_robin
router_mapping: {}
learning_rate_mapping: {}
```

Training logs:

| Epoch | Step | Training Loss | ir_val_cosine_ndcg@10 |
|---|---|---|---|
| 0.0808 | 50 | - | 0.1715 |
| 0.1616 | 100 | - | 0.1863 |
| 0.2423 | 150 | - | 0.1975 |
| 0.3231 | 200 | - | 0.2170 |
| 0.4039 | 250 | - | 0.2236 |
| 0.4847 | 300 | - | 0.2303 |
| 0.5654 | 350 | - | 0.2344 |
| 0.6462 | 400 | - | 0.2353 |
| 0.7270 | 450 | - | 0.2371 |
| 0.8078 | 500 | 0.8206 | 0.2399 |
| 0.8885 | 550 | - | 0.2423 |
| 0.9693 | 600 | - | 0.2440 |
| 1.0 | 619 | - | 0.2469 |
| 1.0501 | 650 | - | 0.2485 |
| 1.1309 | 700 | - | 0.2504 |
| 1.2116 | 750 | - | 0.2542 |
| 1.2924 | 800 | - | 0.2582 |
| 1.3732 | 850 | - | 0.2538 |
| 1.4540 | 900 | - | 0.2586 |
| 1.5347 | 950 | - | 0.2612 |
| 1.6155 | 1000 | 0.4315 | 0.2613 |
| 1.6963 | 1050 | - | 0.2608 |
| 1.7771 | 1100 | - | 0.2625 |
| 1.8578 | 1150 | - | 0.2658 |
| 1.9386 | 1200 | - | 0.2674 |
| 2.0 | 1238 | - | 0.2675 |
| 2.0194 | 1250 | - | 0.2695 |
| 2.1002 | 1300 | - | 0.2730 |
| 2.1809 | 1350 | - | 0.2745 |
| 2.2617 | 1400 | - | 0.2772 |
| 2.3425 | 1450 | - | 0.2800 |
| 2.4233 | 1500 | 0.3189 | 0.2783 |
| 2.5040 | 1550 | - | 0.2793 |
| 2.5848 | 1600 | - | 0.2810 |
| 2.6656 | 1650 | - | 0.2804 |
| 2.7464 | 1700 | - | 0.2821 |
| 2.8271 | 1750 | - | 0.2850 |
| 2.9079 | 1800 | - | 0.2846 |
| 2.9887 | 1850 | - | 0.2857 |
| 3.0 | 1857 | - | 0.2850 |
| 3.0695 | 1900 | - | 0.2874 |
| 3.1502 | 1950 | - | 0.2869 |
| 3.2310 | 2000 | 0.2421 | 0.2878 |
| 3.3118 | 2050 | - | 0.2894 |
| 3.3926 | 2100 | - | 0.2932 |
| 3.4733 | 2150 | - | 0.2959 |
| 3.5541 | 2200 | - | 0.2954 |
| 3.6349 | 2250 | - | 0.2951 |
| 3.7157 | 2300 | - | 0.2986 |
| 3.7964 | 2350 | - | 0.3013 |
| 3.8772 | 2400 | - | 0.2980 |
| 3.9580 | 2450 | - | 0.2992 |
| 4.0 | 2476 | - | 0.3006 |
| 4.0388 | 2500 | 0.2048 | 0.3005 |
| 4.1195 | 2550 | - | 0.3019 |
| 4.2003 | 2600 | - | 0.3037 |
| 4.2811 | 2650 | - | 0.3038 |
| 4.3619 | 2700 | - | 0.3045 |
| 4.4426 | 2750 | - | 0.3068 |
| 4.5234 | 2800 | - | 0.3087 |
| 4.6042 | 2850 | - | 0.3074 |
| 4.6850 | 2900 | - | 0.3082 |
| 4.7658 | 2950 | - | 0.3078 |
| 4.8465 | 3000 | 0.1815 | 0.3086 |
| 4.9273 | 3050 | - | 0.3076 |
| 5.0 | 3095 | - | 0.3111 |
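Putting the pieces together, a hedged sketch of a training run with the non-default hyperparameters above. The datasets are illustrative stand-ins (the real (sentence_0, sentence_1) pairs are not bundled with this card), and `fp16=True` assumes a CUDA GPU:

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)

model = SentenceTransformer("dangvantuan/vietnamese-embedding")

# Illustrative stand-in for the real pair dataset
train_dataset = Dataset.from_dict({
    "sentence_0": ["câu ví_dụ một", "câu ví_dụ hai"],
    "sentence_1": ["câu liên_quan một", "câu liên_quan hai"],
})
eval_dataset = train_dataset  # a held-out split in practice

args = SentenceTransformerTrainingArguments(
    output_dir="output",
    eval_strategy="steps",
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    num_train_epochs=5,
    fp16=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=losses.MultipleNegativesRankingLoss(model),
)
trainer.train()
```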
Citation (Sentence Transformers):

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
Citation (MultipleNegativesRankingLoss):

```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
Base model: `dangvantuan/vietnamese-embedding` (https://huggingface.co/dangvantuan/vietnamese-embedding)