---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:20000
- loss:CosineSimilarityLoss
base_model: sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
widget:
- source_sentence: >-
    Question: Is this describing a (1) directly correlative relationship, (2)
    conditionally causative relationship, (3) causative relationship, or (0)
    no relationship.
  sentences:
  - 'C: Iron deficiency anemia in the mother; normal Hb levels in the fetus'
  - This is a conditionally causative relationship
  - 'C: Decreasing carbohydrate intake, increasing fat intake'
- source_sentence: Please summerize the given abstract to a title
  sentences:
  - 'BatteryLab: A Collaborative Platform for Power Monitoring'
  - >-
    hi ! good evening. i am chatbot answering your query. from the history,
    it seems that you might have sustained some kind of trivial trauma while
    cutting woods resulting in oozing of blood in the tissue forming a
    collection of blood (hematoma). usually, small collections of blood get
    absorbed of their own. however, this may not happen in cases where the
    blood clotting is hampered by the intake of blood thinners as is in your
    case and the same might also get infected causing more pain due to an
    abscess. if i were your doctor, i would consult your physician who
    started your blood thinning agent for consideration of discontinuing
    these medicines for some time till it heals up. if it does not even
    then, i would refer you to a general surgeon for a clinical examination
    and further management. i hope this information would help you in
    discussing with your family physician/treating doctor in further
    management of your problem. please do not hesitate to ask in case of any
    further doubts. thanks for choosing chatbot to clear doubts on your
    health problems. wishing you an early recovery. chatbot. if i were your
    doctor,
  - >-
    Effects of the psychoactive compounds in green tea on risky
    decision-making.
- source_sentence: Answer this question truthfully
  sentences:
  - >-
    Laparoscopic stomach-partitioning gastrojejunostomy with reduced-port
    techniques for unresectable distal gastric cancer.
  - >-
    hi, thanks for posting the query, i would suggest you to get an x-ray of
    the tooth piece left in the socket, according to your clinical symptoms
    i suppose that you might have developed an infection in the region which
    is radiating in the nearby tooth region giving you such feeling, also
    take course of antibiotics and analgesics, maintain a good oral hygiene,
    take lukewarm saline and antiseptic mouthwash rinses, take an
    appointment with oral surgeon and get the piece removed. hope you find
    this as helpful, take care!
  - >-
    If you feel you are developing symptoms suggestive of Pneumocystis
    pneumonia contact your health professional.
- source_sentence: >-
    If you are a doctor, please answer the medical questions based on the
    patient's description.
  sentences:
  - Hazard control for communicable disease transport at Ornge
  - >-
    hello and thank you for asking chatbot, i understand your concern. you
    are probably experiencing low blood pressure when you stand up, called
    orthostatic hypotension. as a result, not enough blood reaches your
    brain, and you feel lightheaded or dizzy. here are some advices
  - >-
    hi, thank you for posting your query. i have noted your symptoms. these
    are suggestive of sciatica, or nerve compression in the lower back
    region due to slipped disc in that location. disc prolapse leads to
    compression of the nerves, resulting in low back pain, leg pain and
    tingling. symptoms may increase on walking. the diagnosis can be
    confirmed by doing mri scan of the lumbosacral spine. good medical
    treatments are available for this condition. i hope my answer helps.
    please get back if you have any follow-up queries or if you require any
    additional information. wishing you good health, chatbot. ly/
- source_sentence: Please summerize the given abstract to a title
  sentences:
  - >-
    Gastric mucormycosis with splenic invasion a rare abdominal complication
    of COVID-19 pneumonia
  - >-
    Russian-Language Mobile Apps for Reducing Alcohol Use: Systematic Search
    and Evaluation
  - Peacekeeping after Covid-19
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---
# SentenceTransformer based on sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
This is a sentence-transformers model finetuned from sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details

### Model Description

- **Model Type:** Sentence Transformer
- **Base model:** sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
- **Maximum Sequence Length:** 128 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
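Cosine similarity, the similarity function listed above, is simply the dot product of L2-normalized vectors and ranges over [-1, 1]. A minimal NumPy sketch (illustrative only, not the library's implementation):

```python
import numpy as np

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity: dot product of the L2-normalized vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy 384-dimensional embeddings (the model's output dimensionality).
rng = np.random.default_rng(0)
a = rng.normal(size=384)
assert abs(cosine_similarity(a, a) - 1.0) < 1e-9   # identical vectors -> 1
assert abs(cosine_similarity(a, -a) + 1.0) < 1e-9  # opposite vectors -> -1
```

Because both vectors are normalized, the score depends only on direction, not magnitude, which is why embeddings can be compared directly after encoding.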
### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
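The Pooling module above has `pooling_mode_mean_tokens: True`, meaning the sentence embedding is the mean of the token embeddings, with the attention mask excluding padding tokens. A minimal NumPy sketch of masked mean pooling, using toy shapes for illustration:

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Masked mean pooling: average the token vectors where attention_mask == 1.

    token_embeddings: (seq_len, dim); attention_mask: (seq_len,)
    """
    mask = attention_mask[:, None].astype(token_embeddings.dtype)  # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)
    count = max(float(mask.sum()), 1e-9)  # avoid division by zero
    return summed / count

# Two real tokens followed by one padding token.
tokens = np.array([[1.0, 3.0], [3.0, 5.0], [100.0, 100.0]])
mask = np.array([1, 1, 0])
print(mean_pool(tokens, mask))  # the padding row is excluded -> [2. 4.]
```

In the real model the same averaging runs per sentence over the BERT token embeddings (dim 384), batched in PyTorch.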
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'Please summerize the given abstract to a title',
    'Peacekeeping after Covid-19',
    'Russian-Language Mobile Apps for Reducing Alcohol Use: Systematic Search and Evaluation',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
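The 3×3 similarity matrix produced above can then drive simple retrieval, e.g. finding which sentence is closest to a query. A small NumPy sketch under the assumption that row 0 is the query; the `rank_by_similarity` helper is hypothetical, for illustration only:

```python
import numpy as np

def rank_by_similarity(similarities: np.ndarray, query_index: int = 0) -> np.ndarray:
    """Return the other row indices sorted by descending similarity to the query row."""
    scores = similarities[query_index].astype(float).copy()
    scores[query_index] = -np.inf   # exclude the query itself
    return np.argsort(-scores)[:-1]  # indices, most similar first

# Toy 3x3 similarity matrix in the shape the snippet above prints.
sim = np.array([[1.0, 0.2, 0.7],
                [0.2, 1.0, 0.1],
                [0.7, 0.1, 1.0]])
print(rank_by_similarity(sim))  # sentence 2 is closest to the query -> [2 1]
```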
## Training Details

### Training Dataset

#### Unnamed Dataset

- Size: 20,000 training samples
- Columns: `sentence_0`, `sentence_1`, and `label`
- Approximate statistics based on the first 1000 samples:
  |         | sentence_0 | sentence_1 | label |
  |:--------|:-----------|:-----------|:------|
  | type    | string | string | float |
  | details | min: 7 tokens<br>mean: 15.87 tokens<br>max: 81 tokens | min: 3 tokens<br>mean: 77.94 tokens<br>max: 128 tokens | min: 1.0<br>mean: 1.0<br>max: 1.0 |
- Samples:

  | sentence_0 | sentence_1 | label |
  |:-----------|:-----------|:------|
  | Please summerize the given abstract to a title | Impact of National Containment Measures on Decelerating the Increase in Daily New Cases of COVID-19 in 54 Countries and 4 Epicenters of the Pandemic: Comparative Observational Study | 1.0 |
  | Answer this question truthfully | Intracranial hypertension is defined as ICP greater than 20 mmHg. This condition occurs when there is increased pressure inside the skull, which can cause a range of symptoms and potentially lead to serious complications such as brain damage or herniation. Intracranial hypertension can be caused by a variety of factors, including head injury, brain tumors, infections, and certain medications. Treatment options may include medications to reduce pressure, surgery to relieve pressure or address underlying causes, or other supportive measures to manage symptoms and prevent complications. | 1.0 |
  | Answer this question truthfully | The bone marrow is a rapidly proliferating population of cells that produces blood cells, including white blood cells, red blood cells, and platelets. 6-mercaptopurine and azathioprine are medications that are commonly used to treat autoimmune diseases and some types of cancer. However, because these drugs interfere with the production of new cells, they can also cause myelosuppression, which is a condition in which the bone marrow produces fewer blood cells than normal. This can lead to a variety of symptoms, including fatigue, weakness, and an increased risk of infection. | 1.0 |

- Loss: `CosineSimilarityLoss` with these parameters:

  ```json
  {
      "loss_fct": "torch.nn.modules.loss.MSELoss"
  }
  ```
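`CosineSimilarityLoss` computes the cosine similarity between the two sentence embeddings of each pair and regresses it against the gold label with the configured `loss_fct` (`MSELoss` here). A standalone NumPy sketch of that computation, illustrative rather than the library's implementation:

```python
import numpy as np

def cosine_similarity_loss(emb1: np.ndarray, emb2: np.ndarray, labels: np.ndarray) -> float:
    """MSE between cosine(emb1_i, emb2_i) and labels_i, averaged over the batch."""
    e1 = emb1 / np.linalg.norm(emb1, axis=1, keepdims=True)
    e2 = emb2 / np.linalg.norm(emb2, axis=1, keepdims=True)
    cos = (e1 * e2).sum(axis=1)                 # per-pair cosine similarity
    return float(np.mean((cos - labels) ** 2))  # MSELoss

# A perfectly aligned pair with label 1.0 incurs zero loss.
emb = np.array([[1.0, 2.0, 3.0]])
assert cosine_similarity_loss(emb, emb, np.array([1.0])) < 1e-12
```

Note that since every label in this dataset is 1.0 (see the statistics above), the loss only pushes paired embeddings toward cosine similarity 1.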
### Training Hyperparameters

#### Non-Default Hyperparameters

- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `num_train_epochs`: 1
- `multi_dataset_batch_sampler`: round_robin
#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `tp_size`: 0
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: round_robin

</details>
### Training Logs
| Epoch | Step | Training Loss |
|---|---|---|
| 0.4 | 500 | 0.4093 |
| 0.8 | 1000 | 0.0074 |
## Framework Versions
- Python: 3.11.12
- Sentence Transformers: 3.4.1
- Transformers: 4.51.3
- PyTorch: 2.6.0+cu124
- Accelerate: 1.6.0
- Datasets: 3.5.1
- Tokenizers: 0.21.1
## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```