SentenceTransformer based on vinai/phobert-base-v2

This is a sentence-transformers model finetuned from vinai/phobert-base-v2. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: vinai/phobert-base-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: RobertaModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
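The Pooling module above uses mean pooling (`pooling_mode_mean_tokens`): token embeddings from the Transformer are averaged, weighted by the attention mask so padding tokens are ignored. A minimal numpy sketch of that operation, using random dummy embeddings (the 768 matches the model's output dimensionality):

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings over real (non-padding) tokens only.

    token_embeddings: (batch, seq_len, dim)
    attention_mask:   (batch, seq_len), 1 for real tokens, 0 for padding
    """
    mask = attention_mask[..., None].astype(token_embeddings.dtype)  # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=1)                   # (batch, dim)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)                   # avoid division by zero
    return summed / counts

# Toy batch: 2 sequences, 4 token positions, 768-dim token embeddings
rng = np.random.default_rng(0)
emb = rng.normal(size=(2, 4, 768))
mask = np.array([[1, 1, 1, 0], [1, 1, 0, 0]])  # second sequence has two padding tokens
pooled = mean_pool(emb, mask)
print(pooled.shape)  # (2, 768)
```

Each sentence thus becomes a single 768-dimensional vector regardless of its length (up to the 256-token maximum).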

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("meandyou200175/sp_chatbot_query")
# Run inference
sentences = [
    # "give me a vacuum cleaner with suction above 8 kPa and at least 40 minutes of battery life"
    'cho tôi máy hút bụi công suất hút trên 8kPa và pin chạy ít nhất 40 phút',
    # "Xiaomi Vacuum X10 robot vacuum, 20,000 Pa (20 kPa) suction, 60-minute battery, price: 11,900,000"
    'Robot hút bụi Xiaomi Vacuum X10, lực hút 20000Pa (20kPa), pin 60 phút, Giá: 11.900.000',
    # "Sunhouse SHD4260 oven, 45 L capacity, 1,600 W power, price: 1,150,000"
    'Lò nướng Sunhouse SHD4260, dung tích 45L, công suất 1600W, Giá 1.150.000',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
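`model.similarity` applies the model's configured similarity function, cosine similarity, which is equivalent to L2-normalizing the embeddings and taking pairwise dot products. A self-contained numpy sketch of that computation, with random vectors standing in for real embeddings:

```python
import numpy as np

def cosine_similarity_matrix(embeddings: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity: normalize each row, then take dot products."""
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    normalized = embeddings / np.clip(norms, 1e-12, None)
    return normalized @ normalized.T

rng = np.random.default_rng(42)
embeddings = rng.normal(size=(3, 768))  # stand-in for model.encode(sentences)
similarities = cosine_similarity_matrix(embeddings)
print(similarities.shape)  # (3, 3)
# The diagonal is 1.0: every vector is maximally similar to itself
```

For retrieval, you would encode the query and the product descriptions, then rank products by their cosine similarity to the query.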

Evaluation

Metrics

Information Retrieval

Metric Value
cosine_accuracy@1 0.2614
cosine_accuracy@2 0.4269
cosine_accuracy@5 0.6739
cosine_accuracy@10 0.8594
cosine_accuracy@100 1.0
cosine_precision@1 0.2614
cosine_precision@2 0.2134
cosine_precision@5 0.1348
cosine_precision@10 0.0859
cosine_precision@100 0.01
cosine_recall@1 0.2614
cosine_recall@2 0.4269
cosine_recall@5 0.6739
cosine_recall@10 0.8594
cosine_recall@100 1.0
cosine_ndcg@10 0.5371
cosine_mrr@1 0.2614
cosine_mrr@2 0.3441
cosine_mrr@5 0.412
cosine_mrr@10 0.4367
cosine_mrr@100 0.4452
cosine_map@100 0.4452
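With exactly one relevant document per query, as in this dataset, accuracy@k and recall@k coincide, which is why those rows match in the table above. Both follow directly from the rank at which the relevant document is retrieved; a small sketch (the ranks below are hypothetical, not the actual evaluation data):

```python
def accuracy_at_k(ranks, k):
    """Fraction of queries whose relevant doc appears in the top k (ranks are 1-based)."""
    return sum(r <= k for r in ranks) / len(ranks)

def mrr_at_k(ranks, k):
    """Mean reciprocal rank, counting 0 when the relevant doc falls below rank k."""
    return sum(1.0 / r if r <= k else 0.0 for r in ranks) / len(ranks)

# Hypothetical ranks of the single relevant document for 4 queries
ranks = [1, 3, 2, 12]
print(accuracy_at_k(ranks, 10))  # 0.75
print(mrr_at_k(ranks, 10))       # (1 + 1/3 + 1/2 + 0) / 4 ≈ 0.4583
```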

Training Details

Training Dataset

Unnamed Dataset

  • Size: 14,462 training samples
  • Columns: query and positive
  • Approximate statistics based on the first 1000 samples:
    - query: string; min 5, mean 14.29, max 27 tokens
    - positive: string; min 17, mean 35.68, max 133 tokens
  • Samples (query → positive):
    - "Có cân điện tử y tế dưới 1.068.000.000 VNĐ không" → "Omron HN-286 - . Cân điện tử sức khỏe, hiển thị cân nặng & BMI, mặt kính chịu lực, pin AA x 2, thiết kế gọn nhẹ, vận hành êm. Kích thước: 300 x 300 x 25 mm. Trọng lượng: 2 kg. Giá: 890.000 VNĐ"
    - "cần nồi cơm điện công suất trên 700W" → "Nồi cơm điện Sharp KS-11ETV, Công suất 750W, Dung tích 1.1L, Giá: 1.050.000"
    - "cho tôi màn hình máy tính kích thước tối thiểu 23 inch" → "Màn hình LG UltraGear 27GN950, 27 inch, 4K UHD, 144Hz, Nano IPS, Giá: 16.800.000"
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
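MultipleNegativesRankingLoss treats each query's paired positive as the correct answer and every other positive in the same batch as a negative: cosine similarities are multiplied by `scale` (20.0 here) and passed to a cross-entropy loss whose target is the diagonal of the similarity matrix. A numpy sketch of that computation, assuming embeddings have already been produced by the encoder:

```python
import numpy as np

def multiple_negatives_ranking_loss(query_emb, positive_emb, scale=20.0):
    """In-batch-negatives loss: cross-entropy over scaled cosine similarities,
    where query i should match positive i (the diagonal)."""
    q = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    p = positive_emb / np.linalg.norm(positive_emb, axis=1, keepdims=True)
    logits = scale * (q @ p.T)  # (batch, batch) cosine similarity matrix, scaled
    # Row-wise log-softmax, then take the diagonal (the true pairs)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
queries = rng.normal(size=(4, 768))
positives = queries + 0.1 * rng.normal(size=(4, 768))  # positives close to their queries
loss = multiple_negatives_ranking_loss(queries, positives)
print(loss)  # small, since each query is far more similar to its own positive
```

This is why larger batches usually help with this loss: each example contributes more in-batch negatives. (With the batch size of 2 used here, each query sees only one negative per step.)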
    

Evaluation Dataset

Unnamed Dataset

  • Size: 1,607 evaluation samples
  • Columns: query and positive
  • Approximate statistics based on the first 1000 samples:
    - query: string; min 5, mean 14.33, max 27 tokens
    - positive: string; min 18, mean 37.23, max 139 tokens
  • Samples (query → positive):
    - "mình cần máy lạnh giá trong khoảng 15 đến 20 triệu, công suất trên 12.000 BTU và tiết kiệm điện 5 sao" → "Điều hòa Panasonic Inverter 1.5HP, Công suất 12.700 BTU, Công nghệ NanoeX, Giá: 18.200.000"
    - "tôi muốn mua máy chiếu độ sáng trên 3.500 lumen và giá nhỏ hơn 19 triệu" → "Máy chiếu Epson EB-X51, Độ sáng 3.700 lumen, Độ phân giải XGA, Giá: 14.200.000"
    - "Có bộ đồ trang điểm dưới 1.741.500.000 VNĐ không" → "Sephora Basics Kit - . Bao gồm 12 màu phấn mắt, 4 màu má hồng, 2 màu son, cọ trang điểm, hộp gọn nhẹ, thích hợp đi du lịch, chất liệu an toàn cho da. Kích thước: 300 x 200 x 50 mm. Trọng lượng: 0.8 kg. Giá: 1.290.000 VNĐ"
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 2
  • per_device_eval_batch_size: 2
  • learning_rate: 2e-05
  • num_train_epochs: 6
  • warmup_ratio: 0.1
  • fp16: True
  • batch_sampler: no_duplicates
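In Sentence Transformers v3+, the non-default values above map onto `SentenceTransformerTrainingArguments`. A hedged sketch of that configuration; the `output_dir` and any eval/save cadence are placeholders, not values taken from this run:

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="outputs/sp_chatbot_query",  # placeholder path
    num_train_epochs=6,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    fp16=True,
    eval_strategy="steps",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # matches batch_sampler: no_duplicates
)
```

`BatchSamplers.NO_DUPLICATES` avoids placing duplicate texts in one batch, which matters for MultipleNegativesRankingLoss since a duplicate would act as a false negative.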

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 2
  • per_device_eval_batch_size: 2
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 6
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss Validation Loss cosine_ndcg@10
-1 -1 - - 0.1506
0.0138 100 0.333 - -
0.0277 200 0.1879 - -
0.0415 300 0.0525 - -
0.0553 400 0.0482 - -
0.0691 500 0.0271 - -
0.0830 600 0.0398 - -
0.0968 700 0.0381 - -
0.1106 800 0.0401 - -
0.1245 900 0.0158 - -
0.1383 1000 0.0251 0.0225 0.3056
0.1521 1100 0.0119 - -
0.1660 1200 0.0133 - -
0.1798 1300 0.0278 - -
0.1936 1400 0.0196 - -
0.2074 1500 0.0008 - -
0.2213 1600 0.0405 - -
0.2351 1700 0.0003 - -
0.2489 1800 0.0034 - -
0.2628 1900 0.0271 - -
0.2766 2000 0.0151 0.0160 0.3355
0.2904 2100 0.0019 - -
0.3042 2200 0.0181 - -
0.3181 2300 0.0218 - -
0.3319 2400 0.0105 - -
0.3457 2500 0.0551 - -
0.3596 2600 0.0279 - -
0.3734 2700 0.0205 - -
0.3872 2800 0.0018 - -
0.4011 2900 0.0047 - -
0.4149 3000 0.018 0.0148 0.3718
0.4287 3100 0.0081 - -
0.4425 3200 0.0145 - -
0.4564 3300 0.0258 - -
0.4702 3400 0.0331 - -
0.4840 3500 0.0122 - -
0.4979 3600 0.0179 - -
0.5117 3700 0.0003 - -
0.5255 3800 0.0223 - -
0.5393 3900 0.0126 - -
0.5532 4000 0.0087 0.0201 0.3609
0.5670 4100 0.0139 - -
0.5808 4200 0.0189 - -
0.5947 4300 0.0062 - -
0.6085 4400 0.0092 - -
0.6223 4500 0.0192 - -
0.6361 4600 0.0568 - -
0.6500 4700 0.0128 - -
0.6638 4800 0.0312 - -
0.6776 4900 0.0961 - -
0.6915 5000 0.0311 0.0093 0.3905
0.7053 5100 0.0176 - -
0.7191 5200 0.0084 - -
0.7330 5300 0.0329 - -
0.7468 5400 0.0015 - -
0.7606 5500 0.0003 - -
0.7744 5600 0.0153 - -
0.7883 5700 0.0077 - -
0.8021 5800 0.0166 - -
0.8159 5900 0.0079 - -
0.8298 6000 0.001 0.0083 0.4171
0.8436 6100 0.0227 - -
0.8574 6200 0.0591 - -
0.8712 6300 0.0115 - -
0.8851 6400 0.0342 - -
0.8989 6500 0.0199 - -
0.9127 6600 0.0067 - -
0.9266 6700 0.0206 - -
0.9404 6800 0.0092 - -
0.9542 6900 0.0002 - -
0.9681 7000 0.0132 0.0113 0.4096
0.9819 7100 0.007 - -
0.9957 7200 0.0001 - -
1.0095 7300 0.0219 - -
1.0234 7400 0.0005 - -
1.0372 7500 0.0246 - -
1.0510 7600 0.0117 - -
1.0649 7700 0.0092 - -
1.0787 7800 0.0004 - -
1.0925 7900 0.0352 - -
1.1063 8000 0.0182 0.0102 0.3950
1.1202 8100 0.0487 - -
1.1340 8200 0.0391 - -
1.1478 8300 0.0197 - -
1.1617 8400 0.0124 - -
1.1755 8500 0.059 - -
1.1893 8600 0.0269 - -
1.2032 8700 0.0004 - -
1.2170 8800 0.0007 - -
1.2308 8900 0.0035 - -
1.2446 9000 0.0056 0.0094 0.4364
1.2585 9100 0.018 - -
1.2723 9200 0.0159 - -
1.2861 9300 0.011 - -
1.3000 9400 0.0222 - -
1.3138 9500 0.0042 - -
1.3276 9600 0.0107 - -
1.3414 9700 0.0271 - -
1.3553 9800 0.0042 - -
1.3691 9900 0.0135 - -
1.3829 10000 0.0099 0.0172 0.4031
1.3968 10100 0.039 - -
1.4106 10200 0.0573 - -
1.4244 10300 0.0411 - -
1.4383 10400 0.0096 - -
1.4521 10500 0.0207 - -
1.4659 10600 0.0141 - -
1.4797 10700 0.0081 - -
1.4936 10800 0.0 - -
1.5074 10900 0.0081 - -
1.5212 11000 0.0166 0.0106 0.4550
1.5351 11100 0.0069 - -
1.5489 11200 0.0103 - -
1.5627 11300 0.016 - -
1.5765 11400 0.0138 - -
1.5904 11500 0.0023 - -
1.6042 11600 0.0005 - -
1.6180 11700 0.0081 - -
1.6319 11800 0.0136 - -
1.6457 11900 0.0147 - -
1.6595 12000 0.0149 0.0121 0.4477
1.6734 12100 0.0143 - -
1.6872 12200 0.0576 - -
1.7010 12300 0.0355 - -
1.7148 12400 0.0021 - -
1.7287 12500 0.0158 - -
1.7425 12600 0.0 - -
1.7563 12700 0.0081 - -
1.7702 12800 0.0012 - -
1.7840 12900 0.0039 - -
1.7978 13000 0.0203 0.0099 0.4580
1.8116 13100 0.0082 - -
1.8255 13200 0.005 - -
1.8393 13300 0.0109 - -
1.8531 13400 0.0002 - -
1.8670 13500 0.0067 - -
1.8808 13600 0.0154 - -
1.8946 13700 0.0021 - -
1.9084 13800 0.0096 - -
1.9223 13900 0.0064 - -
1.9361 14000 0.006 0.0083 0.4691
1.9499 14100 0.0012 - -
1.9638 14200 0.0018 - -
1.9776 14300 0.0339 - -
1.9914 14400 0.0191 - -
2.0053 14500 0.0028 - -
2.0191 14600 0.0068 - -
2.0329 14700 0.0088 - -
2.0467 14800 0.0625 - -
2.0606 14900 0.0131 - -
2.0744 15000 0.0052 0.0090 0.4483
2.0882 15100 0.0002 - -
2.1021 15200 0.0108 - -
2.1159 15300 0.0185 - -
2.1297 15400 0.0002 - -
2.1435 15500 0.0192 - -
2.1574 15600 0.0082 - -
2.1712 15700 0.0006 - -
2.1850 15800 0.0095 - -
2.1989 15900 0.0001 - -
2.2127 16000 0.0136 0.0077 0.4718
2.2265 16100 0.009 - -
2.2404 16200 0.0035 - -
2.2542 16300 0.0001 - -
2.2680 16400 0.008 - -
2.2818 16500 0.0007 - -
2.2957 16600 0.0123 - -
2.3095 16700 0.0363 - -
2.3233 16800 0.0034 - -
2.3372 16900 0.0001 - -
2.3510 17000 0.0219 0.0083 0.4428
2.3648 17100 0.0148 - -
2.3786 17200 0.0 - -
2.3925 17300 0.0005 - -
2.4063 17400 0.0114 - -
2.4201 17500 0.0367 - -
2.4340 17600 0.0163 - -
2.4478 17700 0.0083 - -
2.4616 17800 0.0264 - -
2.4755 17900 0.0059 - -
2.4893 18000 0.001 0.0090 0.4408
2.5031 18100 0.0058 - -
2.5169 18200 0.0002 - -
2.5308 18300 0.0112 - -
2.5446 18400 0.0194 - -
2.5584 18500 0.0356 - -
2.5723 18600 0.0136 - -
2.5861 18700 0.0109 - -
2.5999 18800 0.0184 - -
2.6137 18900 0.0006 - -
2.6276 19000 0.0094 0.0072 0.4510
2.6414 19100 0.0094 - -
2.6552 19200 0.0007 - -
2.6691 19300 0.0108 - -
2.6829 19400 0.0123 - -
2.6967 19500 0.0004 - -
2.7106 19600 0.0004 - -
2.7244 19700 0.0149 - -
2.7382 19800 0.0 - -
2.7520 19900 0.0 - -
2.7659 20000 0.0005 0.0080 0.4617
2.7797 20100 0.0115 - -
2.7935 20200 0.0 - -
2.8074 20300 0.0 - -
2.8212 20400 0.0017 - -
2.8350 20500 0.0225 - -
2.8488 20600 0.0251 - -
2.8627 20700 0.0001 - -
2.8765 20800 0.0013 - -
2.8903 20900 0.0048 - -
2.9042 21000 0.0016 0.0079 0.4548
2.9180 21100 0.0003 - -
2.9318 21200 0.0352 - -
2.9457 21300 0.0044 - -
2.9595 21400 0.0124 - -
2.9733 21500 0.0064 - -
2.9871 21600 0.0086 - -
3.0010 21700 0.0058 - -
3.0148 21800 0.0018 - -
3.0286 21900 0.0132 - -
3.0425 22000 0.0144 0.0080 0.4472
3.0563 22100 0.0248 - -
3.0701 22200 0.0139 - -
3.0839 22300 0.0155 - -
3.0978 22400 0.0115 - -
3.1116 22500 0.0082 - -
3.1254 22600 0.0068 - -
3.1393 22700 0.0 - -
3.1531 22800 0.0178 - -
3.1669 22900 0.0007 - -
3.1807 23000 0.0004 0.0072 0.4689
3.1946 23100 0.0 - -
3.2084 23200 0.0 - -
3.2222 23300 0.0128 - -
3.2361 23400 0.0001 - -
3.2499 23500 0.0027 - -
3.2637 23600 0.0002 - -
3.2776 23700 0.0048 - -
3.2914 23800 0.0063 - -
3.3052 23900 0.0331 - -
3.3190 24000 0.0001 0.0089 0.4881
3.3329 24100 0.025 - -
3.3467 24200 0.0045 - -
3.3605 24300 0.0065 - -
3.3744 24400 0.0003 - -
3.3882 24500 0.0077 - -
3.4020 24600 0.0002 - -
3.4158 24700 0.0095 - -
3.4297 24800 0.0219 - -
3.4435 24900 0.0005 - -
3.4573 25000 0.0114 0.0087 0.4686
3.4712 25100 0.0002 - -
3.4850 25200 0.023 - -
3.4988 25300 0.01 - -
3.5127 25400 0.0114 - -
3.5265 25500 0.0052 - -
3.5403 25600 0.0095 - -
3.5541 25700 0.0205 - -
3.5680 25800 0.0002 - -
3.5818 25900 0.0097 - -
3.5956 26000 0.0207 0.0077 0.4741
3.6095 26100 0.0112 - -
3.6233 26200 0.0045 - -
3.6371 26300 0.0006 - -
3.6509 26400 0.0302 - -
3.6648 26500 0.007 - -
3.6786 26600 0.0005 - -
3.6924 26700 0.0086 - -
3.7063 26800 0.0081 - -
3.7201 26900 0.0006 - -
3.7339 27000 0.0063 0.0099 0.4824
3.7478 27100 0.0198 - -
3.7616 27200 0.0062 - -
3.7754 27300 0.0 - -
3.7892 27400 0.008 - -
3.8031 27500 0.0034 - -
3.8169 27600 0.0005 - -
3.8307 27700 0.0065 - -
3.8446 27800 0.0019 - -
3.8584 27900 0.0108 - -
3.8722 28000 0.0117 0.0069 0.4933
3.8860 28100 0.0106 - -
3.8999 28200 0.0001 - -
3.9137 28300 0.0 - -
3.9275 28400 0.0066 - -
3.9414 28500 0.011 - -
3.9552 28600 0.0 - -
3.9690 28700 0.0004 - -
3.9829 28800 0.0081 - -
3.9967 28900 0.0081 - -
4.0105 29000 0.0122 0.0066 0.5047
4.0243 29100 0.0137 - -
4.0382 29200 0.0098 - -
4.0520 29300 0.0002 - -
4.0658 29400 0.0075 - -
4.0797 29500 0.0 - -
4.0935 29600 0.0256 - -
4.1073 29700 0.0096 - -
4.1211 29800 0.0012 - -
4.1350 29900 0.0048 - -
4.1488 30000 0.0 0.0065 0.4963
4.1626 30100 0.0026 - -
4.1765 30200 0.0025 - -
4.1903 30300 0.0077 - -
4.2041 30400 0.0168 - -
4.2180 30500 0.0377 - -
4.2318 30600 0.0 - -
4.2456 30700 0.0114 - -
4.2594 30800 0.0062 - -
4.2733 30900 0.0135 - -
4.2871 31000 0.0089 0.0080 0.4953
4.3009 31100 0.0106 - -
4.3148 31200 0.0199 - -
4.3286 31300 0.0066 - -
4.3424 31400 0.0003 - -
4.3562 31500 0.0045 - -
4.3701 31600 0.0001 - -
4.3839 31700 0.0064 - -
4.3977 31800 0.0001 - -
4.4116 31900 0.0052 - -
4.4254 32000 0.011 0.0061 0.4994
4.4392 32100 0.0 - -
4.4530 32200 0.015 - -
4.4669 32300 0.0082 - -
4.4807 32400 0.0 - -
4.4945 32500 0.0041 - -
4.5084 32600 0.0067 - -
4.5222 32700 0.0003 - -
4.5360 32800 0.0 - -
4.5499 32900 0.002 - -
4.5637 33000 0.0 0.0064 0.5035
4.5775 33100 0.0 - -
4.5913 33200 0.0058 - -
4.6052 33300 0.0033 - -
4.6190 33400 0.008 - -
4.6328 33500 0.0313 - -
4.6467 33600 0.0294 - -
4.6605 33700 0.0068 - -
4.6743 33800 0.0068 - -
4.6881 33900 0.0213 - -
4.7020 34000 0.0117 0.0076 0.5044
4.7158 34100 0.0001 - -
4.7296 34200 0.0024 - -
4.7435 34300 0.0 - -
4.7573 34400 0.0084 - -
4.7711 34500 0.0091 - -
4.7850 34600 0.0101 - -
4.7988 34700 0.0093 - -
4.8126 34800 0.0138 - -
4.8264 34900 0.0113 - -
4.8403 35000 0.0134 0.0064 0.5127
4.8541 35100 0.0233 - -
4.8679 35200 0.0006 - -
4.8818 35300 0.0 - -
4.8956 35400 0.0095 - -
4.9094 35500 0.0145 - -
4.9232 35600 0.0075 - -
4.9371 35700 0.0006 - -
4.9509 35800 0.0 - -
4.9647 35900 0.0 - -
4.9786 36000 0.0136 0.0060 0.5170
4.9924 36100 0.0197 - -
5.0062 36200 0.0127 - -
5.0201 36300 0.0029 - -
5.0339 36400 0.0028 - -
5.0477 36500 0.011 - -
5.0615 36600 0.0 - -
5.0754 36700 0.0152 - -
5.0892 36800 0.0076 - -
5.1030 36900 0.0138 - -
5.1169 37000 0.0002 0.0063 0.5164
5.1307 37100 0.0051 - -
5.1445 37200 0.0158 - -
5.1583 37300 0.0063 - -
5.1722 37400 0.017 - -
5.1860 37500 0.0115 - -
5.1998 37600 0.0001 - -
5.2137 37700 0.0072 - -
5.2275 37800 0.0022 - -
5.2413 37900 0.0045 - -
5.2552 38000 0.0012 0.0052 0.5276
5.2690 38100 0.0 - -
5.2828 38200 0.0127 - -
5.2966 38300 0.006 - -
5.3105 38400 0.0075 - -
5.3243 38500 0.0 - -
5.3381 38600 0.0001 - -
5.3520 38700 0.0 - -
5.3658 38800 0.0 - -
5.3796 38900 0.0 - -
5.3934 39000 0.0003 0.0053 0.5280
5.4073 39100 0.0 - -
5.4211 39200 0.0 - -
5.4349 39300 0.0 - -
5.4488 39400 0.0002 - -
5.4626 39500 0.0076 - -
5.4764 39600 0.0016 - -
5.4903 39700 0.0001 - -
5.5041 39800 0.0 - -
5.5179 39900 0.0 - -
5.5317 40000 0.0 0.0052 0.5257
5.5456 40100 0.0081 - -
5.5594 40200 0.0058 - -
5.5732 40300 0.0067 - -
5.5871 40400 0.007 - -
5.6009 40500 0.0085 - -
5.6147 40600 0.0015 - -
5.6285 40700 0.0016 - -
5.6424 40800 0.0007 - -
5.6562 40900 0.0 - -
5.6700 41000 0.0 0.0054 0.5337
5.6839 41100 0.0 - -
5.6977 41200 0.0 - -
5.7115 41300 0.0151 - -
5.7253 41400 0.007 - -
5.7392 41500 0.0 - -
5.7530 41600 0.0052 - -
5.7668 41700 0.0075 - -
5.7807 41800 0.0099 - -
5.7945 41900 0.0027 - -
5.8083 42000 0.0001 0.0053 0.5346
5.8222 42100 0.0003 - -
5.8360 42200 0.0 - -
5.8498 42300 0.0 - -
5.8636 42400 0.0055 - -
5.8775 42500 0.0105 - -
5.8913 42600 0.007 - -
5.9051 42700 0.0001 - -
5.9190 42800 0.0095 - -
5.9328 42900 0.0075 - -
5.9466 43000 0.0191 0.0052 0.5371
5.9604 43100 0.0002 - -
5.9743 43200 0.0 - -
5.9881 43300 0.004 - -

Framework Versions

  • Python: 3.11.13
  • Sentence Transformers: 4.1.0
  • Transformers: 4.52.4
  • PyTorch: 2.6.0+cu124
  • Accelerate: 1.8.1
  • Datasets: 3.6.0
  • Tokenizers: 0.21.2

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}