SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L6-v2. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
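The Pooling module above uses mean pooling (`pooling_mode_mean_tokens: True`): token embeddings are averaged, with padding positions excluded via the attention mask. A minimal NumPy sketch of that operation (the function name and toy inputs are illustrative, not part of the library):

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings, ignoring padding positions.

    token_embeddings: (seq_len, dim); attention_mask: (seq_len,) of 0/1.
    """
    mask = attention_mask[:, None].astype(float)    # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)  # sum over real tokens only
    count = mask.sum()                              # number of real tokens
    return summed / np.maximum(count, 1e-9)         # avoid division by zero

# Two real tokens followed by one padding token
tokens = np.array([[1.0, 3.0], [3.0, 5.0], [9.0, 9.0]])
mask = np.array([1, 1, 0])
print(mean_pool(tokens, mask))  # [2. 4.] — padding row is excluded
```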

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("along26/all-MiniLM-L6-v2_multilingual_malaysian")
# Run inference
sentences = [
    "What is the intensity of light transmitted through two polarizers with their axes at an angle of 45 degrees to each other, if the intensity of the incident light is 12 W/m² and the polarizer absorbs 50% of the light perpendicular to its axis? Use Malus' Law to solve the problem.",
    'Apakah keamatan cahaya yang dihantar melalui dua polarizer dengan paksinya pada sudut 45 darjah antara satu sama lain, jika keamatan cahaya kejadian ialah 12 W/m² dan polarizer menyerap 50% cahaya berserenjang dengan paksinya? Gunakan Hukum Malus untuk menyelesaikan masalah.',
    'What role did the opposition parties and civil society organizations play in exposing the 1MDB scandal and holding Najib Razak accountable?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[ 1.0000, -0.7535,  0.9687],
#         [-0.7535,  1.0000, -0.7616],
#         [ 0.9687, -0.7616,  1.0000]])
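Since the card's similarity function is cosine similarity, `model.similarity` here is equivalent to normalizing the embeddings and taking pairwise dot products. A minimal NumPy sketch with toy vectors (the function name and inputs are illustrative):

```python
import numpy as np

def cosine_similarity_matrix(embeddings: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity: normalize rows, then take dot products."""
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    unit = embeddings / norms      # each row scaled to unit length
    return unit @ unit.T           # (n, n) similarity matrix

vecs = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
sims = cosine_similarity_matrix(vecs)
print(np.round(sims, 4))  # diagonal is 1.0; orthogonal rows score 0.0
```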

Training Details

Training Dataset

Unnamed Dataset

  • Size: 210,285 training samples
  • Columns: sentence_0, sentence_1, and sentence_2
  • Approximate statistics based on the first 1000 samples:
    sentence_0: string; min 11, mean 215.9, max 512 tokens
    sentence_1: string; min 14, mean 257.41, max 512 tokens
    sentence_2: string; min 15, mean 236.24, max 512 tokens
  • Samples: following the TripletLoss convention, each row pairs an English anchor (sentence_0) with its Malay translation as the positive (sentence_1) and an unrelated text as the negative (sentence_2).

    Sample 1
      sentence_0: What are the four main functions of the human liver, and how is its unique anatomic structure suited to perform these functions?
      sentence_1: Apakah empat fungsi utama hati manusia, dan bagaimanakah struktur anatomi uniknya sesuai untuk melaksanakan fungsi ini?
      sentence_2: Why is the Malaysian government not doing enough to address the rising cost of living and income inequality?

    Sample 2
      sentence_0: Changing the temperature affects the equilibrium constant (Kc) and the formation of the Fe(SCN)2+ complex ion from Fe3+ and SCN- ions according to Le Chatelier's principle. Le Chatelier's principle states that if a system at equilibrium is subjected to a change in temperature, pressure, or concentration of reactants or products, the system will adjust its position to counteract the change and re-establish equilibrium.

        In the case of the reaction between Fe3+ and SCN- ions to form the Fe(SCN)2+ complex ion, the balanced chemical equation is:

        Fe3+ (aq) + SCN- (aq) ⇌ Fe(SCN)2+ (aq)

        The equilibrium constant (Kc) for this reaction is given by:

        Kc = [Fe(SCN)2+] / ([Fe3+] [SCN-])

        Now, let's consider the effect of temperature on this reaction. The reaction between Fe3+ and SCN- ions is an exothermic reaction, meaning it releases heat as it proceeds. According to Le Chatelier's principle, if the temperature of the system is increased, the equilibrium will shift in the direction that absorb...
      sentence_1: Menukar suhu memberi kesan kepada pemalar keseimbangan (Kc) dan pembentukan ion kompleks Fe(SCN)2+ daripada ion Fe3+ dan SCN- mengikut prinsip Le Chatelier. Prinsip Le Chatelier menyatakan bahawa jika sistem pada keseimbangan tertakluk kepada perubahan suhu, tekanan, atau kepekatan bahan tindak balas atau produk, sistem akan menyesuaikan kedudukannya untuk mengatasi perubahan dan mewujudkan semula keseimbangan.

        Dalam kes tindak balas antara ion Fe3+ dan SCN- untuk membentuk ion kompleks Fe(SCN)2+, persamaan kimia yang seimbang ialah:

        Fe3+ (aq) + SCN- (aq) ⇌ Fe(SCN)2+ (aq)

        Pemalar keseimbangan (Kc) untuk tindak balas ini diberikan oleh:

        Kc = [Fe(SCN)2+] / ([Fe3+] [SCN-])

        Sekarang, mari kita pertimbangkan kesan suhu pada tindak balas ini. Tindak balas antara ion Fe3+ dan SCN- ialah tindak balas eksotermik, bermakna ia membebaskan haba semasa ia berjalan. Mengikut prinsip Le Chatelier, jika suhu sistem dinaikkan, keseimbangan akan beralih ke arah yang menyerap haba, yang dalam kes in...
      sentence_2: Why does Malaysia have one of the highest income disparities in the world, with a significant portion of the population living in poverty despite being a middle-income country?

    Sample 3
      sentence_0: The use of laws like the Official Secrets Act (OSA) and Sedition Act in Malaysia has been criticized for stifling free speech and discouraging whistleblowing in the country's anti-corruption efforts.

        The Official Secrets Act (OSA) is a law that dates back to the colonial era and is intended to protect national security and sensitive information. However, critics argue that the law is overly broad and has been used to suppress freedom of speech and silence whistleblowers who expose corruption and abuse of power. The law imposes strict penalties for the unauthorized disclosure of confidential information, which has created a chilling effect on whistleblowers and investigative journalists who might otherwise expose corruption.

        Similarly, the Sedition Act is a law that criminalizes speech that is deemed seditious, including speech that is likely to cause public disorder, insult the rulers, or question the legitimacy of the government. Critics argue that the law is overly broad and has be...
      sentence_1: Penggunaan undang-undang seperti Akta Rahsia Rasmi (OSA) dan Akta Hasutan di Malaysia telah dikritik kerana menyekat kebebasan bersuara dan menghalang pemberi maklumat dalam usaha anti-rasuah negara.

        Akta Rahsia Rasmi (OSA) ialah undang-undang yang bermula sejak zaman penjajah dan bertujuan untuk melindungi keselamatan negara dan maklumat sensitif. Walau bagaimanapun, pengkritik berpendapat bahawa undang-undang itu terlalu luas dan telah digunakan untuk menyekat kebebasan bersuara dan menutup mulut pemberi maklumat yang mendedahkan rasuah dan penyalahgunaan kuasa. Undang-undang mengenakan penalti yang ketat untuk pendedahan maklumat sulit yang tidak dibenarkan, yang telah mewujudkan kesan menyeramkan kepada pemberi maklumat dan wartawan penyiasat yang mungkin mendedahkan rasuah.

        Begitu juga, Akta Hasutan ialah undang-undang yang menjenayahkan ucapan yang dianggap menghasut, termasuk ucapan yang berkemungkinan menyebabkan gangguan awam, menghina pemerintah, atau mempersoalkan kesahiha...
      sentence_2: How do the surface properties of metal catalysts influence the selectivity and activity of the oxidation reaction of hydrocarbons?
  • Loss: TripletLoss with these parameters:
    {
        "distance_metric": "TripletDistanceMetric.EUCLIDEAN",
        "triplet_margin": 5
    }
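With these parameters, the loss penalizes a triplet whenever the anchor is not at least `triplet_margin` (5) closer to the positive than to the negative, measured in Euclidean distance. A minimal NumPy sketch of the per-triplet computation (function name and toy vectors are illustrative, not library API):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=5.0):
    """max(d(a, p) - d(a, n) + margin, 0), mirroring
    TripletDistanceMetric.EUCLIDEAN with triplet_margin=5."""
    d_pos = np.linalg.norm(anchor - positive)  # anchor-to-positive distance
    d_neg = np.linalg.norm(anchor - negative)  # anchor-to-negative distance
    return max(d_pos - d_neg + margin, 0.0)

a = np.array([0.0, 0.0])
p = np.array([1.0, 0.0])   # close to the anchor
n = np.array([10.0, 0.0])  # well beyond the margin
print(triplet_loss(a, p, n))  # 0.0: the negative is already margin further away
```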
    

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • num_train_epochs: 10
  • fp16: True
  • multi_dataset_batch_sampler: round_robin

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 10
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • project: huggingface
  • trackio_space_id: trackio
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: no
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: True
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss
0.0761 500 4.9612
0.1522 1000 0.4336
0.2282 1500 0.0468
0.3043 2000 0.0045
0.3804 2500 0.0033
0.4565 3000 0.001
0.5326 3500 0.0012
0.6086 4000 0.0006
0.6847 4500 0.0002
0.7608 5000 0.0009
0.8369 5500 0.0009
0.9130 6000 0.0004
0.9890 6500 0.0007
1.0651 7000 0.0003
1.1412 7500 0.0011
1.2173 8000 0.0008
1.2934 8500 0.0
1.3694 9000 0.0003
1.4455 9500 0.0004
1.5216 10000 0.0
1.5977 10500 0.0003
1.6738 11000 0.0
1.7498 11500 0.0002
1.8259 12000 0.0003
1.9020 12500 0.0004
1.9781 13000 0.0005
2.0542 13500 0.0001
2.1302 14000 0.0
2.2063 14500 0.0
2.2824 15000 0.0
2.3585 15500 0.0004
2.4346 16000 0.0002
2.5107 16500 0.0003
2.5867 17000 0.001
2.6628 17500 0.0
2.7389 18000 0.0005
2.8150 18500 0.0003
2.8911 19000 0.0
2.9671 19500 0.0001
3.0432 20000 0.0
3.1193 20500 0.0
3.1954 21000 0.0003
3.2715 21500 0.0
3.3475 22000 0.0003
3.4236 22500 0.0
3.4997 23000 0.0
3.5758 23500 0.0
3.6519 24000 0.0
3.7279 24500 0.0
3.8040 25000 0.0003
3.8801 25500 0.0003
3.9562 26000 0.0
4.0323 26500 0.0
4.1083 27000 0.0
4.1844 27500 0.0
4.2605 28000 0.0002
4.3366 28500 0.0
4.4127 29000 0.0
4.4887 29500 0.0003
4.5648 30000 0.0
4.6409 30500 0.0003
4.7170 31000 0.0
4.7931 31500 0.0
4.8691 32000 0.0005
4.9452 32500 0.0
5.0213 33000 0.0
5.0974 33500 0.0
5.1735 34000 0.0
5.2495 34500 0.0003
5.3256 35000 0.0
5.4017 35500 0.0
5.4778 36000 0.0
5.5539 36500 0.0
5.6299 37000 0.0
5.7060 37500 0.0
5.7821 38000 0.0001
5.8582 38500 0.0009
5.9343 39000 0.0
6.0103 39500 0.0
6.0864 40000 0.0
6.1625 40500 0.0
6.2386 41000 0.0004
6.3147 41500 0.0
6.3907 42000 0.0
6.4668 42500 0.0
6.5429 43000 0.0
6.6190 43500 0.0003
6.6951 44000 0.0
6.7712 44500 0.0
6.8472 45000 0.0003
6.9233 45500 0.0
6.9994 46000 0.0
7.0755 46500 0.0
7.1516 47000 0.0
7.2276 47500 0.0
7.3037 48000 0.0003
7.3798 48500 0.0
7.4559 49000 0.0003
7.5320 49500 0.0
7.6080 50000 0.0003
7.6841 50500 0.0
7.7602 51000 0.0
7.8363 51500 0.0
7.9124 52000 0.0
7.9884 52500 0.0
8.0645 53000 0.0
8.1406 53500 0.0003
8.2167 54000 0.0
8.2928 54500 0.0
8.3688 55000 0.0
8.4449 55500 0.0
8.5210 56000 0.0
8.5971 56500 0.0
8.6732 57000 0.0
8.7492 57500 0.0003
8.8253 58000 0.0003
8.9014 58500 0.0
8.9775 59000 0.0
9.0536 59500 0.0
9.1296 60000 0.0
9.2057 60500 0.0
9.2818 61000 0.0
9.3579 61500 0.0
9.4340 62000 0.0
9.5100 62500 0.0
9.5861 63000 0.0006
9.6622 63500 0.0
9.7383 64000 0.0
9.8144 64500 0.0
9.8904 65000 0.0
9.9665 65500 0.0006

Framework Versions

  • Python: 3.12.12
  • Sentence Transformers: 5.1.2
  • Transformers: 4.57.1
  • PyTorch: 2.8.0+cu126
  • Accelerate: 1.11.0
  • Datasets: 4.0.0
  • Tokenizers: 0.22.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

TripletLoss

@misc{hermans2017defense,
    title={In Defense of the Triplet Loss for Person Re-Identification},
    author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
    year={2017},
    eprint={1703.07737},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}
Model size: 22.7M parameters (Safetensors, F32)