SentenceTransformer based on meandyou200175/E5_v3_instruct_topic
This is a sentence-transformers model finetuned from meandyou200175/E5_v3_instruct_topic. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: meandyou200175/E5_v3_instruct_topic
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 1024 dimensions
- Similarity Function: Cosine Similarity
Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'XLMRobertaModel'})
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
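The three modules correspond to a token encoder, mask-aware mean pooling, and L2 normalization. Below is a minimal sketch of that pipeline using the raw transformers API, assuming the checkpoint loads with AutoModel/AutoTokenizer; in practice, loading through SentenceTransformer as shown under Usage is simpler.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meandyou200175/E5_v4_instruct_topic_continue")
model = AutoModel.from_pretrained("meandyou200175/E5_v4_instruct_topic_continue")

texts = ["task: classification | query: example text"]
batch = tokenizer(texts, padding=True, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    # (0): Transformer -> one 1024-dim vector per token
    token_embeddings = model(**batch).last_hidden_state

# (1): Pooling with pooling_mode_mean_tokens=True -> mask-aware mean over tokens
mask = batch["attention_mask"].unsqueeze(-1).float()
sentence_embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)

# (2): Normalize -> unit-length vectors, so dot product equals cosine similarity
sentence_embeddings = torch.nn.functional.normalize(sentence_embeddings, p=2, dim=1)
print(sentence_embeddings.shape)  # torch.Size([1, 1024])
```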
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download the model from the 🤗 Hub
model = SentenceTransformer("meandyou200175/E5_v4_instruct_topic_continue")

# An instruction-prefixed Vietnamese query followed by two candidate topic labels.
# The query roughly translates to "Learning history is laugh-out-loud funny,
# both fun and easy to remember!"; the labels are "History" and "Music".
sentences = [
    'task: classification | query: Học lịch sử cười đau bụng, vừa vui vừa dễ nhớ!',
    'Lịch sử',
    'Âm nhạc',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 1024)

# Compute pairwise similarity scores between the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# A 3x3 similarity matrix
```
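Since the example pairs an instruction-prefixed query with candidate topic labels, a natural downstream use is zero-shot topic assignment: encode the query and every label, then pick the label with the highest cosine similarity. A minimal sketch; the label list here is a hypothetical placeholder, not a taxonomy shipped with the model.

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("meandyou200175/E5_v4_instruct_topic_continue")

# Hypothetical candidate topic labels; replace with your own taxonomy.
labels = ["Lịch sử", "Âm nhạc", "Toán học", "Thể thao"]
query = "task: classification | query: Học lịch sử cười đau bụng, vừa vui vừa dễ nhớ!"

query_emb = model.encode([query])
label_embs = model.encode(labels)

# The model's Normalize module makes embeddings unit-length,
# so model.similarity returns cosine similarities.
scores = model.similarity(query_emb, label_embs)[0]
best = int(scores.argmax())
print(labels[best], float(scores[best]))
```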
Evaluation
Metrics
Information Retrieval
| Metric               | Value  |
|:---------------------|:-------|
| cosine_accuracy@1    | 0.0329 |
| cosine_accuracy@2    | 0.0633 |
| cosine_accuracy@5    | 0.1402 |
| cosine_accuracy@10   | 0.2393 |
| cosine_accuracy@100  | 0.772  |
| cosine_precision@1   | 0.0329 |
| cosine_precision@2   | 0.0316 |
| cosine_precision@5   | 0.028  |
| cosine_precision@10  | 0.0239 |
| cosine_precision@100 | 0.0077 |
| cosine_recall@1      | 0.0329 |
| cosine_recall@2      | 0.0633 |
| cosine_recall@5      | 0.1402 |
| cosine_recall@10     | 0.2393 |
| cosine_recall@100    | 0.772  |
| cosine_ndcg@10       | 0.1185 |
| cosine_mrr@1         | 0.0329 |
| cosine_mrr@2         | 0.0481 |
| cosine_mrr@5         | 0.0691 |
| cosine_mrr@10        | 0.0821 |
| cosine_mrr@100       | 0.102  |
| cosine_map@100       | 0.102  |
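These metric names follow the convention of the sentence-transformers InformationRetrievalEvaluator. For orientation, here is a hedged sketch of how such an evaluation is typically run; the queries, corpus, and relevance judgments below are hypothetical placeholders, not the card's actual evaluation data.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("meandyou200175/E5_v4_instruct_topic_continue")

# Hypothetical evaluation data: query id -> text, doc id -> text,
# and query id -> set of relevant doc ids.
queries = {"q1": "task: classification | query: Học lịch sử cười đau bụng, vừa vui vừa dễ nhớ!"}
corpus = {"d1": "Lịch sử", "d2": "Âm nhạc"}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="example-ir-eval",
)
results = evaluator(model)
print(results)  # dict with keys like "example-ir-eval_cosine_ndcg@10"
```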
Training Details
Training Dataset
Unnamed Dataset
Evaluation Dataset
Unnamed Dataset
Training Hyperparameters
Non-Default Hyperparameters
eval_strategy: steps
per_device_train_batch_size: 4
per_device_eval_batch_size: 4
learning_rate: 2e-05
num_train_epochs: 5
warmup_ratio: 0.1
fp16: True
batch_sampler: no_duplicates
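For reference, a hedged sketch of a training setup that wires these non-default hyperparameters into a SentenceTransformerTrainer with MultipleNegativesRankingLoss (the loss cited at the end of this card). This is not the author's actual script: the dataset and its column names are hypothetical stand-ins for the unnamed training data.

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

# Continue fine-tuning from the base checkpoint named in Model Details
model = SentenceTransformer("meandyou200175/E5_v3_instruct_topic")

# Hypothetical (anchor, positive) pairs standing in for the unnamed dataset
train_dataset = Dataset.from_dict({
    "anchor": ["task: classification | query: example query"],
    "positive": ["Example topic"],
})
eval_dataset = train_dataset  # placeholder; use a held-out split in practice

args = SentenceTransformerTrainingArguments(
    output_dir="E5_v4_instruct_topic_continue",
    eval_strategy="steps",
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    learning_rate=2e-5,
    num_train_epochs=5,
    warmup_ratio=0.1,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=MultipleNegativesRankingLoss(model),
)
trainer.train()
```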
All Hyperparameters
overwrite_output_dir: False
do_predict: False
eval_strategy: steps
prediction_loss_only: True
per_device_train_batch_size: 4
per_device_eval_batch_size: 4
per_gpu_train_batch_size: None
per_gpu_eval_batch_size: None
gradient_accumulation_steps: 1
eval_accumulation_steps: None
torch_empty_cache_steps: None
learning_rate: 2e-05
weight_decay: 0.0
adam_beta1: 0.9
adam_beta2: 0.999
adam_epsilon: 1e-08
max_grad_norm: 1.0
num_train_epochs: 5
max_steps: -1
lr_scheduler_type: linear
lr_scheduler_kwargs: {}
warmup_ratio: 0.1
warmup_steps: 0
log_level: passive
log_level_replica: warning
log_on_each_node: True
logging_nan_inf_filter: True
save_safetensors: True
save_on_each_node: False
save_only_model: False
restore_callback_states_from_checkpoint: False
no_cuda: False
use_cpu: False
use_mps_device: False
seed: 42
data_seed: None
jit_mode_eval: False
use_ipex: False
bf16: False
fp16: True
fp16_opt_level: O1
half_precision_backend: auto
bf16_full_eval: False
fp16_full_eval: False
tf32: None
local_rank: 0
ddp_backend: None
tpu_num_cores: None
tpu_metrics_debug: False
debug: []
dataloader_drop_last: False
dataloader_num_workers: 0
dataloader_prefetch_factor: None
past_index: -1
disable_tqdm: False
remove_unused_columns: True
label_names: None
load_best_model_at_end: False
ignore_data_skip: False
fsdp: []
fsdp_min_num_params: 0
fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
fsdp_transformer_layer_cls_to_wrap: None
accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
deepspeed: None
label_smoothing_factor: 0.0
optim: adamw_torch
optim_args: None
adafactor: False
group_by_length: False
length_column_name: length
ddp_find_unused_parameters: None
ddp_bucket_cap_mb: None
ddp_broadcast_buffers: False
dataloader_pin_memory: True
dataloader_persistent_workers: False
skip_memory_metrics: True
use_legacy_prediction_loop: False
push_to_hub: False
resume_from_checkpoint: None
hub_model_id: None
hub_strategy: every_save
hub_private_repo: None
hub_always_push: False
hub_revision: None
gradient_checkpointing: False
gradient_checkpointing_kwargs: None
include_inputs_for_metrics: False
include_for_metrics: []
eval_do_concat_batches: True
fp16_backend: auto
push_to_hub_model_id: None
push_to_hub_organization: None
mp_parameters:
auto_find_batch_size: False
full_determinism: False
torchdynamo: None
ray_scope: last
ddp_timeout: 1800
torch_compile: False
torch_compile_backend: None
torch_compile_mode: None
include_tokens_per_second: False
include_num_input_tokens_seen: False
neftune_noise_alpha: None
optim_target_modules: None
batch_eval_metrics: False
eval_on_start: False
use_liger_kernel: False
liger_kernel_config: None
eval_use_gather_object: False
average_tokens_across_devices: False
prompts: None
batch_sampler: no_duplicates
multi_dataset_batch_sampler: proportional
router_mapping: {}
learning_rate_mapping: {}
Training Logs
| Epoch  | Step  | Training Loss | Validation Loss | cosine_ndcg@10 |
|:------:|:-----:|:-------------:|:---------------:|:--------------:|
| 0.0185 | 100   | 2.469         | -               | -              |
| 0.0370 | 200   | 1.3544        | -               | -              |
| 0.0555 | 300   | 0.6754        | -               | -              |
| 0.0740 | 400   | 0.6252        | -               | -              |
| 0.0925 | 500   | 0.4433        | -               | -              |
| 0.1110 | 600   | 0.5628        | -               | -              |
| 0.1295 | 700   | 0.3955        | -               | -              |
| 0.1480 | 800   | 0.4755        | -               | -              |
| 0.1665 | 900   | 0.3112        | -               | -              |
| 0.1850 | 1000  | 0.3946        | 0.3953          | 0.0805         |
| 0.2035 | 1100  | 0.3676        | -               | -              |
| 0.2220 | 1200  | 0.3984        | -               | -              |
| 0.2405 | 1300  | 0.3767        | -               | -              |
| 0.2590 | 1400  | 0.3796        | -               | -              |
| 0.2775 | 1500  | 0.3332        | -               | -              |
| 0.2960 | 1600  | 0.4419        | -               | -              |
| 0.3145 | 1700  | 0.4107        | -               | -              |
| 0.3330 | 1800  | 0.3513        | -               | -              |
| 0.3515 | 1900  | 0.3502        | -               | -              |
| 0.3700 | 2000  | 0.4331        | 0.3697          | 0.0884         |
| 0.3885 | 2100  | 0.5259        | -               | -              |
| 0.4070 | 2200  | 0.4406        | -               | -              |
| 0.4255 | 2300  | 0.4705        | -               | -              |
| 0.4440 | 2400  | 0.3596        | -               | -              |
| 0.4625 | 2500  | 0.2859        | -               | -              |
| 0.4810 | 2600  | 0.3895        | -               | -              |
| 0.4995 | 2700  | 0.4653        | -               | -              |
| 0.5180 | 2800  | 0.3776        | -               | -              |
| 0.5365 | 2900  | 0.4929        | -               | -              |
| 0.5550 | 3000  | 0.31          | 0.4504          | 0.0847         |
| 0.5735 | 3100  | 0.3791        | -               | -              |
| 0.5920 | 3200  | 0.3522        | -               | -              |
| 0.6105 | 3300  | 0.3995        | -               | -              |
| 0.6290 | 3400  | 0.3699        | -               | -              |
| 0.6475 | 3500  | 0.3751        | -               | -              |
| 0.6660 | 3600  | 0.3472        | -               | -              |
| 0.6846 | 3700  | 0.3968        | -               | -              |
| 0.7031 | 3800  | 0.4328        | -               | -              |
| 0.7216 | 3900  | 0.4753        | -               | -              |
| 0.7401 | 4000  | 0.3527        | 0.3451          | 0.0974         |
| 0.7586 | 4100  | 0.506         | -               | -              |
| 0.7771 | 4200  | 0.4896        | -               | -              |
| 0.7956 | 4300  | 0.4368        | -               | -              |
| 0.8141 | 4400  | 0.373         | -               | -              |
| 0.8326 | 4500  | 0.3498        | -               | -              |
| 0.8511 | 4600  | 0.3926        | -               | -              |
| 0.8696 | 4700  | 0.3924        | -               | -              |
| 0.8881 | 4800  | 0.4206        | -               | -              |
| 0.9066 | 4900  | 0.4101        | -               | -              |
| 0.9251 | 5000  | 0.4193        | 0.3383          | 0.0910         |
| 0.9436 | 5100  | 0.3777        | -               | -              |
| 0.9621 | 5200  | 0.3059        | -               | -              |
| 0.9806 | 5300  | 0.4198        | -               | -              |
| 0.9991 | 5400  | 0.2563        | -               | -              |
| 1.0176 | 5500  | 0.225         | -               | -              |
| 1.0361 | 5600  | 0.3237        | -               | -              |
| 1.0546 | 5700  | 0.2978        | -               | -              |
| 1.0731 | 5800  | 0.3044        | -               | -              |
| 1.0916 | 5900  | 0.2087        | -               | -              |
| 1.1101 | 6000  | 0.2689        | 0.3643          | 0.0988         |
| 1.1286 | 6100  | 0.3699        | -               | -              |
| 1.1471 | 6200  | 0.2942        | -               | -              |
| 1.1656 | 6300  | 0.2929        | -               | -              |
| 1.1841 | 6400  | 0.3152        | -               | -              |
| 1.2026 | 6500  | 0.3352        | -               | -              |
| 1.2211 | 6600  | 0.3146        | -               | -              |
| 1.2396 | 6700  | 0.3873        | -               | -              |
| 1.2581 | 6800  | 0.258         | -               | -              |
| 1.2766 | 6900  | 0.1435        | -               | -              |
| 1.2951 | 7000  | 0.2508        | 0.3768          | 0.0966         |
| 1.3136 | 7100  | 0.2884        | -               | -              |
| 1.3321 | 7200  | 0.2962        | -               | -              |
| 1.3506 | 7300  | 0.1903        | -               | -              |
| 1.3691 | 7400  | 0.2946        | -               | -              |
| 1.3876 | 7500  | 0.2658        | -               | -              |
| 1.4061 | 7600  | 0.2052        | -               | -              |
| 1.4246 | 7700  | 0.3019        | -               | -              |
| 1.4431 | 7800  | 0.3147        | -               | -              |
| 1.4616 | 7900  | 0.4272        | -               | -              |
| 1.4801 | 8000  | 0.2707        | 0.3430          | 0.1000         |
| 1.4986 | 8100  | 0.3127        | -               | -              |
| 1.5171 | 8200  | 0.2775        | -               | -              |
| 1.5356 | 8300  | 0.2783        | -               | -              |
| 1.5541 | 8400  | 0.3092        | -               | -              |
| 1.5726 | 8500  | 0.35          | -               | -              |
| 1.5911 | 8600  | 0.3076        | -               | -              |
| 1.6096 | 8700  | 0.2935        | -               | -              |
| 1.6281 | 8800  | 0.3629        | -               | -              |
| 1.6466 | 8900  | 0.2885        | -               | -              |
| 1.6651 | 9000  | 0.3249        | 0.3294          | 0.0997         |
| 1.6836 | 9100  | 0.2983        | -               | -              |
| 1.7021 | 9200  | 0.3599        | -               | -              |
| 1.7206 | 9300  | 0.2341        | -               | -              |
| 1.7391 | 9400  | 0.4031        | -               | -              |
| 1.7576 | 9500  | 0.3911        | -               | -              |
| 1.7761 | 9600  | 0.3025        | -               | -              |
| 1.7946 | 9700  | 0.2315        | -               | -              |
| 1.8131 | 9800  | 0.2946        | -               | -              |
| 1.8316 | 9900  | 0.2679        | -               | -              |
| 1.8501 | 10000 | 0.3445        | 0.3247          | 0.1015         |
| 1.8686 | 10100 | 0.2243        | -               | -              |
| 1.8871 | 10200 | 0.3345        | -               | -              |
| 1.9056 | 10300 | 0.2642        | -               | -              |
| 1.9241 | 10400 | 0.2012        | -               | -              |
| 1.9426 | 10500 | 0.211         | -               | -              |
| 1.9611 | 10600 | 0.2834        | -               | -              |
| 1.9796 | 10700 | 0.2376        | -               | -              |
| 1.9981 | 10800 | 0.2351        | -               | -              |
| 2.0167 | 10900 | 0.1985        | -               | -              |
| 2.0352 | 11000 | 0.2464        | 0.3235          | 0.1079         |
| 2.0537 | 11100 | 0.2602        | -               | -              |
| 2.0722 | 11200 | 0.176         | -               | -              |
| 2.0907 | 11300 | 0.2486        | -               | -              |
| 2.1092 | 11400 | 0.2541        | -               | -              |
| 2.1277 | 11500 | 0.1925        | -               | -              |
| 2.1462 | 11600 | 0.2509        | -               | -              |
| 2.1647 | 11700 | 0.1799        | -               | -              |
| 2.1832 | 11800 | 0.219         | -               | -              |
| 2.2017 | 11900 | 0.2076        | -               | -              |
| 2.2202 | 12000 | 0.2285        | 0.3028          | 0.1061         |
| 2.2387 | 12100 | 0.1823        | -               | -              |
| 2.2572 | 12200 | 0.1999        | -               | -              |
| 2.2757 | 12300 | 0.1392        | -               | -              |
| 2.2942 | 12400 | 0.2552        | -               | -              |
| 2.3127 | 12500 | 0.2481        | -               | -              |
| 2.3312 | 12600 | 0.2164        | -               | -              |
| 2.3497 | 12700 | 0.2157        | -               | -              |
| 2.3682 | 12800 | 0.1425        | -               | -              |
| 2.3867 | 12900 | 0.0909        | -               | -              |
| 2.4052 | 13000 | 0.2931        | 0.3439          | 0.1011         |
| 2.4237 | 13100 | 0.2031        | -               | -              |
| 2.4422 | 13200 | 0.0993        | -               | -              |
| 2.4607 | 13300 | 0.1865        | -               | -              |
| 2.4792 | 13400 | 0.208         | -               | -              |
| 2.4977 | 13500 | 0.2853        | -               | -              |
| 2.5162 | 13600 | 0.1936        | -               | -              |
| 2.5347 | 13700 | 0.1752        | -               | -              |
| 2.5532 | 13800 | 0.2559        | -               | -              |
| 2.5717 | 13900 | 0.2441        | -               | -              |
| 2.5902 | 14000 | 0.2715        | 0.2953          | 0.1098         |
| 2.6087 | 14100 | 0.196         | -               | -              |
| 2.6272 | 14200 | 0.2194        | -               | -              |
| 2.6457 | 14300 | 0.2381        | -               | -              |
| 2.6642 | 14400 | 0.2637        | -               | -              |
| 2.6827 | 14500 | 0.1453        | -               | -              |
| 2.7012 | 14600 | 0.2422        | -               | -              |
| 2.7197 | 14700 | 0.2159        | -               | -              |
| 2.7382 | 14800 | 0.2205        | -               | -              |
| 2.7567 | 14900 | 0.1853        | -               | -              |
| 2.7752 | 15000 | 0.2028        | 0.2925          | 0.1072         |
| 2.7937 | 15100 | 0.2016        | -               | -              |
| 2.8122 | 15200 | 0.155         | -               | -              |
| 2.8307 | 15300 | 0.1925        | -               | -              |
| 2.8492 | 15400 | 0.2408        | -               | -              |
| 2.8677 | 15500 | 0.1464        | -               | -              |
| 2.8862 | 15600 | 0.2035        | -               | -              |
| 2.9047 | 15700 | 0.1883        | -               | -              |
| 2.9232 | 15800 | 0.1747        | -               | -              |
| 2.9417 | 15900 | 0.251         | -               | -              |
| 2.9602 | 16000 | 0.2151        | 0.2953          | 0.1117         |
| 2.9787 | 16100 | 0.226         | -               | -              |
| 2.9972 | 16200 | 0.1442        | -               | -              |
| 3.0157 | 16300 | 0.191         | -               | -              |
| 3.0342 | 16400 | 0.1304        | -               | -              |
| 3.0527 | 16500 | 0.2252        | -               | -              |
| 3.0712 | 16600 | 0.1846        | -               | -              |
| 3.0897 | 16700 | 0.1608        | -               | -              |
| 3.1082 | 16800 | 0.1582        | -               | -              |
| 3.1267 | 16900 | 0.1602        | -               | -              |
| 3.1452 | 17000 | 0.1086        | 0.2637          | 0.1048         |
| 3.1637 | 17100 | 0.1155        | -               | -              |
| 3.1822 | 17200 | 0.113         | -               | -              |
| 3.2007 | 17300 | 0.1622        | -               | -              |
| 3.2192 | 17400 | 0.1963        | -               | -              |
| 3.2377 | 17500 | 0.1556        | -               | -              |
| 3.2562 | 17600 | 0.0897        | -               | -              |
| 3.2747 | 17700 | 0.0999        | -               | -              |
| 3.2932 | 17800 | 0.1499        | -               | -              |
| 3.3117 | 17900 | 0.2365        | -               | -              |
| 3.3302 | 18000 | 0.146         | 0.2748          | 0.1113         |
| 3.3488 | 18100 | 0.1591        | -               | -              |
| 3.3673 | 18200 | 0.1885        | -               | -              |
| 3.3858 | 18300 | 0.1959        | -               | -              |
| 3.4043 | 18400 | 0.076         | -               | -              |
| 3.4228 | 18500 | 0.176         | -               | -              |
| 3.4413 | 18600 | 0.1378        | -               | -              |
| 3.4598 | 18700 | 0.0648        | -               | -              |
| 3.4783 | 18800 | 0.1488        | -               | -              |
| 3.4968 | 18900 | 0.1361        | -               | -              |
| 3.5153 | 19000 | 0.1573        | 0.2878          | 0.1096         |
| 3.5338 | 19100 | 0.2488        | -               | -              |
| 3.5523 | 19200 | 0.1086        | -               | -              |
| 3.5708 | 19300 | 0.1405        | -               | -              |
| 3.5893 | 19400 | 0.0423        | -               | -              |
| 3.6078 | 19500 | 0.1069        | -               | -              |
| 3.6263 | 19600 | 0.088         | -               | -              |
| 3.6448 | 19700 | 0.1489        | -               | -              |
| 3.6633 | 19800 | 0.0865        | -               | -              |
| 3.6818 | 19900 | 0.1839        | -               | -              |
| 3.7003 | 20000 | 0.1476        | 0.2914          | 0.1159         |
| 3.7188 | 20100 | 0.2212        | -               | -              |
| 3.7373 | 20200 | 0.1638        | -               | -              |
| 3.7558 | 20300 | 0.0782        | -               | -              |
| 3.7743 | 20400 | 0.1215        | -               | -              |
| 3.7928 | 20500 | 0.1478        | -               | -              |
| 3.8113 | 20600 | 0.1934        | -               | -              |
| 3.8298 | 20700 | 0.1594        | -               | -              |
| 3.8483 | 20800 | 0.1216        | -               | -              |
| 3.8668 | 20900 | 0.2124        | -               | -              |
| 3.8853 | 21000 | 0.0981        | 0.2789          | 0.1141         |
| 3.9038 | 21100 | 0.126         | -               | -              |
| 3.9223 | 21200 | 0.1077        | -               | -              |
| 3.9408 | 21300 | 0.1176        | -               | -              |
| 3.9593 | 21400 | 0.1776        | -               | -              |
| 3.9778 | 21500 | 0.094         | -               | -              |
| 3.9963 | 21600 | 0.1025        | -               | -              |
| 4.0148 | 21700 | 0.1589        | -               | -              |
| 4.0333 | 21800 | 0.1142        | -               | -              |
| 4.0518 | 21900 | 0.1656        | -               | -              |
| 4.0703 | 22000 | 0.0577        | 0.2660          | 0.1105         |
| 4.0888 | 22100 | 0.0911        | -               | -              |
| 4.1073 | 22200 | 0.0844        | -               | -              |
| 4.1258 | 22300 | 0.0606        | -               | -              |
| 4.1443 | 22400 | 0.1653        | -               | -              |
| 4.1628 | 22500 | 0.0968        | -               | -              |
| 4.1813 | 22600 | 0.055         | -               | -              |
| 4.1998 | 22700 | 0.1013        | -               | -              |
| 4.2183 | 22800 | 0.0587        | -               | -              |
| 4.2368 | 22900 | 0.1309        | -               | -              |
| 4.2553 | 23000 | 0.053         | 0.2554          | 0.1165         |
| 4.2738 | 23100 | 0.1312        | -               | -              |
| 4.2923 | 23200 | 0.1208        | -               | -              |
| 4.3108 | 23300 | 0.159         | -               | -              |
| 4.3293 | 23400 | 0.1135        | -               | -              |
| 4.3478 | 23500 | 0.0956        | -               | -              |
| 4.3663 | 23600 | 0.1353        | -               | -              |
| 4.3848 | 23700 | 0.1623        | -               | -              |
| 4.4033 | 23800 | 0.1296        | -               | -              |
| 4.4218 | 23900 | 0.1103        | -               | -              |
| 4.4403 | 24000 | 0.0837        | 0.2514          | 0.1175         |
| 4.4588 | 24100 | 0.1124        | -               | -              |
| 4.4773 | 24200 | 0.0893        | -               | -              |
| 4.4958 | 24300 | 0.0852        | -               | -              |
| 4.5143 | 24400 | 0.152         | -               | -              |
| 4.5328 | 24500 | 0.0731        | -               | -              |
| 4.5513 | 24600 | 0.1839        | -               | -              |
| 4.5698 | 24700 | 0.0393        | -               | -              |
| 4.5883 | 24800 | 0.1167        | -               | -              |
| 4.6068 | 24900 | 0.0909        | -               | -              |
| 4.6253 | 25000 | 0.098         | 0.2621          | 0.1196         |
| 4.6438 | 25100 | 0.1655        | -               | -              |
| 4.6623 | 25200 | 0.1086        | -               | -              |
| 4.6809 | 25300 | 0.116         | -               | -              |
| 4.6994 | 25400 | 0.0594        | -               | -              |
| 4.7179 | 25500 | 0.0677        | -               | -              |
| 4.7364 | 25600 | 0.0915        | -               | -              |
| 4.7549 | 25700 | 0.0784        | -               | -              |
| 4.7734 | 25800 | 0.0746        | -               | -              |
| 4.7919 | 25900 | 0.0613        | -               | -              |
| 4.8104 | 26000 | 0.0682        | 0.2570          | 0.1189         |
| 4.8289 | 26100 | 0.1423        | -               | -              |
| 4.8474 | 26200 | 0.1023        | -               | -              |
| 4.8659 | 26300 | 0.085         | -               | -              |
| 4.8844 | 26400 | 0.0916        | -               | -              |
| 4.9029 | 26500 | 0.1068        | -               | -              |
| 4.9214 | 26600 | 0.1184        | -               | -              |
| 4.9399 | 26700 | 0.0873        | -               | -              |
| 4.9584 | 26800 | 0.136         | -               | -              |
| 4.9769 | 26900 | 0.1196        | -               | -              |
| 4.9954 | 27000 | 0.1096        | 0.2472          | 0.1185         |
Framework Versions
- Python: 3.11.13
- Sentence Transformers: 5.1.2
- Transformers: 4.53.3
- PyTorch: 2.6.0+cu124
- Accelerate: 1.9.0
- Datasets: 4.4.1
- Tokenizers: 0.21.2
Citation
BibTeX
Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title = {Efficient Natural Language Response Suggestion for Smart Reply},
    author = {Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year = {2017},
    eprint = {1705.00652},
    archivePrefix = {arXiv},
    primaryClass = {cs.CL}
}
```