---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- dense
- generated_from_trainer
- dataset_size:82069
- loss:MSELoss
base_model: sentence-transformers/LaBSE
widget:
- source_sentence: Kendi kendine yardım etsen Tanrı da sana yardımcı olur.
  sentences:
  - nasin sina li pona seme?
  - ona li jan sona.
  - o pana e pona tawa sama sina la mama sewi li pana e pona tawa sina.
- source_sentence: 星星在夜里出现。
  sentences:
  - tenpo pimeja la mun li kama.
  - sina unpa lukin kin.
  - 'jan ni li toki ike tan mi. ona li wile ala pilin e ni: mi kama tawa ona.'
- source_sentence: 오렌지 주스를 좀 주세요.
  sentences:
  - mi ken la mi pana e pona tawa sina.
  - jan Ton en jan Mewi li tawa ma Oselija lon tenpo sike pini.
  - mi wile e telo pi kili jelo.
- source_sentence: Ён чакае адпраўлення цягніка.
  sentences:
  - 'ona li awen tawa ni: tomo tawa linja li tawa weka.'
  - lon ma tomo sina la telo li kama ala kama tan sewi?
  - 'jan li sona ala e ni: tenpo seme la utala li pini.'
- source_sentence: 我只想暖和一下。
  sentences:
  - mi wile kama seli taso.
  - tomo tawa sina li lon ni.
  - mi moku e kili.
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- negative_mse
- src2trg_accuracy
- trg2src_accuracy
- mean_accuracy
model-index:
- name: SentenceTransformer based on sentence-transformers/LaBSE
  results:
  - task:
      type: knowledge-distillation
      name: Knowledge Distillation
    dataset:
      name: eval data
      type: eval_data
    metrics:
    - type: negative_mse
      value: -0.04922879859805107
      name: Negative Mse
  - task:
      type: translation
      name: Translation
    dataset:
      name: eval data
      type: eval_data
    metrics:
    - type: src2trg_accuracy
      value: 0.8380595265994845
      name: Src2Trg Accuracy
    - type: trg2src_accuracy
      value: 0.7736114366065151
      name: Trg2Src Accuracy
    - type: mean_accuracy
      value: 0.8058354816029998
      name: Mean Accuracy
datasets:
- NetherQuartz/tatoeba-tokipona
- NetherQuartz/lipu-sewi
- NetherQuartz/minecraft-translations
language:
- tok
- en
- ru
- uk
- be
- fr
- es
- pt
- it
- de
- vi
- ja
- zh
- ko
- ar
- he
- pl
- tr
- la
- el
---

# SentenceTransformer based on sentence-transformers/LaBSE

This is a sentence-transformers model finetuned from sentence-transformers/LaBSE on Toki Pona parallel datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, bitext mining, and more.
## Model Details

### Model Description
- Model Type: Sentence Transformer
- Base model: sentence-transformers/LaBSE
- Maximum Sequence Length: 256 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
### Model Sources

- Documentation: [Sentence Transformers Documentation](https://www.sbert.net)
- Repository: [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- Hugging Face: [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Dense({'in_features': 768, 'out_features': 768, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'})
  (3): Normalize()
)
```
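Since the model ends in a `Normalize()` module, every embedding is unit-length, so cosine similarity coincides with the dot product. A quick sanity-check sketch (the example sentences are arbitrary):

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("NetherQuartz/LaBSE-tokipona")
emb = model.encode(["mi moku e kili.", "toki!"])

# Embeddings are L2-normalized, so norms are ~1.0 and the dot product
# matrix equals the cosine-similarity matrix.
print(np.linalg.norm(emb, axis=1))  # ~[1.0, 1.0]
print(emb @ emb.T)
```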
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("NetherQuartz/LaBSE-tokipona")

# Run inference
sentences = [
    '我只想暖和一下。',
    'mi wile kama seli taso.',
    'tomo tawa sina li lon ni.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.8126, 0.2629],
#         [0.8126, 1.0000, 0.4022],
#         [0.2629, 0.4022, 1.0000]])
```
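Building on the snippet above, the model can also be used for simple cross-lingual retrieval: embed a query in any supported language and pick the most similar Toki Pona candidate. A minimal sketch (the candidate sentences are illustrative, not drawn from the training data):

```python
# Continues from the snippet above: `model` is already loaded.
query_emb = model.encode(["Where is your house?"])
candidates = ["tomo sina li lon seme?", "mi moku e kili.", "mun li suli."]
cand_emb = model.encode(candidates)

# model.similarity returns a cosine-similarity matrix as a torch tensor.
scores = model.similarity(query_emb, cand_emb)  # shape: (1, 3)
print(candidates[scores.argmax().item()])       # expected: "tomo sina li lon seme?"
```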
## Evaluation

### Metrics

#### Knowledge Distillation

- Dataset: `eval_data`
- Evaluated with `MSEEvaluator`
| Metric | Value |
|---|---|
| negative_mse | -0.0492 |
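For reference, `MSEEvaluator` computes the mean squared error between the student's embeddings and the teacher's target embeddings, negated (and, in the library's implementation, scaled by 100) so that values closer to 0 are better. A rough sketch of the computation, assuming two row-aligned embedding matrices:

```python
import numpy as np

def negative_mse(student_emb: np.ndarray, teacher_emb: np.ndarray) -> float:
    # Mean squared error between corresponding rows, negated and
    # scaled by 100 to match how MSEEvaluator reports the metric.
    return -float(np.mean((student_emb - teacher_emb) ** 2)) * 100
```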
#### Translation

- Dataset: `eval_data`
- Evaluated with `TranslationEvaluator`
| Metric | Value |
|---|---|
| src2trg_accuracy | 0.8381 |
| trg2src_accuracy | 0.7736 |
| mean_accuracy | 0.8058 |
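`TranslationEvaluator` treats the held-out pairs as a retrieval task: src2trg_accuracy is the fraction of source sentences whose nearest neighbor among the target embeddings is the aligned translation, trg2src_accuracy is the reverse direction, and mean_accuracy averages the two. A simplified sketch, assuming unit-normalized, row-aligned matrices `src_emb` and `trg_emb`:

```python
import numpy as np

def retrieval_accuracy(src_emb: np.ndarray, trg_emb: np.ndarray) -> float:
    # Fraction of rows whose nearest neighbor on the other side
    # (cosine similarity via dot product, since embeddings are
    # unit-normalized) is the row-aligned translation.
    scores = src_emb @ trg_emb.T
    return float((scores.argmax(axis=1) == np.arange(len(src_emb))).mean())

# src2trg = retrieval_accuracy(src_emb, trg_emb)
# trg2src = retrieval_accuracy(trg_emb, src_emb)
# mean_accuracy = (src2trg + trg2src) / 2
```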
## Training Details

### Training Dataset

#### Unnamed Dataset

- Size: 82,069 training samples
- Columns: `natural`, `tok`, and `label`
- Approximate statistics based on the first 1000 samples:

| | natural | tok | label |
|:---|:---|:---|:---|
| type | string | string | list |
| details | min: 3 tokens, mean: 10.66 tokens, max: 43 tokens | min: 4 tokens, mean: 13.96 tokens, max: 47 tokens | size: 768 elements |

- Samples:

| natural | tok | label |
|:---|:---|:---|
| Я держу руку. | mi sewi e luka mi. | [-0.02380160242319107, -0.05106028914451599, -0.054335981607437134, -0.050830986350774765, -0.05793563649058342, ...] |
| Я змарыўся ад працы. | tan pali mi la mi pilin lape. | [-0.03956804797053337, 0.008637639693915844, -0.045203790068626404, -0.06055501475930214, -0.06817363947629929, ...] |
| Mi bolso necesita ser reparado. | poki mi li pakala. | [-0.06031221151351929, -0.006717301905155182, -0.03342252969741821, -0.03583073988556862, -0.0651949867606163, ...] |

- Loss: `MSELoss`
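The `label` column stores a 768-dimensional teacher embedding, following the multilingual knowledge-distillation recipe: the student is trained with `MSELoss` to map the sentences onto that target vector. A hedged sketch of how such training rows could be produced; the choice of the base LaBSE model as the teacher is an assumption, not stated in this card:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer

# Assumption: the teacher is the base LaBSE model encoding the `natural`
# sentence; the card does not state which teacher produced the labels.
teacher = SentenceTransformer("sentence-transformers/LaBSE")

pairs = [
    {"natural": "Я держу руку.", "tok": "mi sewi e luka mi."},
]
labels = teacher.encode([p["natural"] for p in pairs])  # 768-dim targets

train_dataset = Dataset.from_dict({
    "natural": [p["natural"] for p in pairs],
    "tok": [p["tok"] for p in pairs],
    "label": labels.tolist(),
})
```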
### Evaluation Dataset

#### Unnamed Dataset

- Size: 4,267 evaluation samples
- Columns: `natural`, `tok`, and `label`
- Approximate statistics based on the first 1000 samples:

| | natural | tok | label |
|:---|:---|:---|:---|
| type | string | string | list |
| details | min: 5 tokens, mean: 10.59 tokens, max: 43 tokens | min: 4 tokens, mean: 13.7 tokens, max: 59 tokens | size: 768 elements |

- Samples:

| natural | tok | label |
|:---|:---|:---|
| Da quanto tempo sei/state in Germania? | tenpo pi suli seme la sina lon ma Tosi? | [-0.025023534893989563, -0.0016661343397572637, -0.02266993746161461, -0.061682481318712234, -0.035705942660570145, ...] |
| Habesne difficultatem hac re? | ni li ike tawa sina anu seme? | [0.033313240855932236, -0.04223407432436943, -0.012467658147215843, -0.06204398348927498, -0.06461521983146667, ...] |
| אני לא הולך להפסיד. | mi kama ala anpa. | [-0.05783797428011894, -0.04036393761634827, -0.0631723552942276, -0.03369426354765892, -0.05813731253147125, ...] |

- Loss: `MSELoss`
### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `learning_rate`: 2e-05
- `num_train_epochs`: 12
- `warmup_ratio`: 0.1
- `fp16`: True
- `load_best_model_at_end`: True
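These non-default values map directly onto `SentenceTransformerTrainingArguments` from the Sentence Transformers Trainer API; a minimal sketch (the output directory name is illustrative):

```python
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="labse-tokipona",  # illustrative path, not from the card
    eval_strategy="steps",
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    learning_rate=2e-5,
    num_train_epochs=12,
    warmup_ratio=0.1,
    fp16=True,
    load_best_model_at_end=True,
)
```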
#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 12
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `hub_revision`: None
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `liger_kernel_config`: None
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
- `router_mapping`: {}
- `learning_rate_mapping`: {}

</details>
### Training Logs

<details><summary>Click to expand</summary>

| Epoch | Step | Training Loss | Validation Loss | eval_data_negative_mse | eval_data_mean_accuracy |
|---|---|---|---|---|---|
| 0.0779 | 100 | 0.0009 | - | - | - |
| 0.1559 | 200 | 0.0007 | - | - | - |
| 0.2338 | 300 | 0.0006 | - | - | - |
| 0.3118 | 400 | 0.0006 | - | - | - |
| 0.3897 | 500 | 0.0005 | - | - | - |
| 0.4677 | 600 | 0.0005 | - | - | - |
| 0.5456 | 700 | 0.0005 | - | - | - |
| 0.6235 | 800 | 0.0004 | - | - | - |
| 0.7015 | 900 | 0.0004 | - | - | - |
| 0.7794 | 1000 | 0.0004 | - | - | - |
| 0.8574 | 1100 | 0.0004 | - | - | - |
| 0.9353 | 1200 | 0.0004 | - | - | - |
| 1.0133 | 1300 | 0.0004 | - | - | - |
| 1.0912 | 1400 | 0.0004 | - | - | - |
| 1.1691 | 1500 | 0.0004 | - | - | - |
| 1.2471 | 1600 | 0.0004 | - | - | - |
| 1.3250 | 1700 | 0.0003 | - | - | - |
| 1.4030 | 1800 | 0.0003 | - | - | - |
| 1.4809 | 1900 | 0.0003 | - | - | - |
| 1.5588 | 2000 | 0.0003 | 0.0003 | -0.0572 | 0.7559 |
| 1.6368 | 2100 | 0.0003 | - | - | - |
| 1.7147 | 2200 | 0.0003 | - | - | - |
| 1.7927 | 2300 | 0.0003 | - | - | - |
| 1.8706 | 2400 | 0.0003 | - | - | - |
| 1.9486 | 2500 | 0.0003 | - | - | - |
| 2.0265 | 2600 | 0.0003 | - | - | - |
| 2.1044 | 2700 | 0.0003 | - | - | - |
| 2.1824 | 2800 | 0.0003 | - | - | - |
| 2.2603 | 2900 | 0.0003 | - | - | - |
| 2.3383 | 3000 | 0.0003 | - | - | - |
| 2.4162 | 3100 | 0.0003 | - | - | - |
| 2.4942 | 3200 | 0.0003 | - | - | - |
| 2.5721 | 3300 | 0.0003 | - | - | - |
| 2.6500 | 3400 | 0.0003 | - | - | - |
| 2.7280 | 3500 | 0.0003 | - | - | - |
| 2.8059 | 3600 | 0.0003 | - | - | - |
| 2.8839 | 3700 | 0.0003 | - | - | - |
| 2.9618 | 3800 | 0.0003 | - | - | - |
| 3.0398 | 3900 | 0.0003 | - | - | - |
| 3.1177 | 4000 | 0.0003 | 0.0003 | -0.0528 | 0.7850 |
| 3.1956 | 4100 | 0.0003 | - | - | - |
| 3.2736 | 4200 | 0.0003 | - | - | - |
| 3.3515 | 4300 | 0.0003 | - | - | - |
| 3.4295 | 4400 | 0.0003 | - | - | - |
| 3.5074 | 4500 | 0.0003 | - | - | - |
| 3.5853 | 4600 | 0.0003 | - | - | - |
| 3.6633 | 4700 | 0.0003 | - | - | - |
| 3.7412 | 4800 | 0.0003 | - | - | - |
| 3.8192 | 4900 | 0.0003 | - | - | - |
| 3.8971 | 5000 | 0.0003 | - | - | - |
| 3.9751 | 5100 | 0.0003 | - | - | - |
| 4.0530 | 5200 | 0.0003 | - | - | - |
| 4.1309 | 5300 | 0.0003 | - | - | - |
| 4.2089 | 5400 | 0.0003 | - | - | - |
| 4.2868 | 5500 | 0.0003 | - | - | - |
| 4.3648 | 5600 | 0.0003 | - | - | - |
| 4.4427 | 5700 | 0.0003 | - | - | - |
| 4.5207 | 5800 | 0.0003 | - | - | - |
| 4.5986 | 5900 | 0.0003 | - | - | - |
| 4.6765 | 6000 | 0.0003 | 0.0003 | -0.0512 | 0.7936 |
| 4.7545 | 6100 | 0.0003 | - | - | - |
| 4.8324 | 6200 | 0.0003 | - | - | - |
| 4.9104 | 6300 | 0.0003 | - | - | - |
| 4.9883 | 6400 | 0.0003 | - | - | - |
| 5.0663 | 6500 | 0.0003 | - | - | - |
| 5.1442 | 6600 | 0.0003 | - | - | - |
| 5.2221 | 6700 | 0.0003 | - | - | - |
| 5.3001 | 6800 | 0.0003 | - | - | - |
| 5.3780 | 6900 | 0.0003 | - | - | - |
| 5.4560 | 7000 | 0.0003 | - | - | - |
| 5.5339 | 7100 | 0.0003 | - | - | - |
| 5.6118 | 7200 | 0.0003 | - | - | - |
| 5.6898 | 7300 | 0.0003 | - | - | - |
| 5.7677 | 7400 | 0.0003 | - | - | - |
| 5.8457 | 7500 | 0.0003 | - | - | - |
| 5.9236 | 7600 | 0.0003 | - | - | - |
| 6.0016 | 7700 | 0.0003 | - | - | - |
| 6.0795 | 7800 | 0.0003 | - | - | - |
| 6.1574 | 7900 | 0.0003 | - | - | - |
| 6.2354 | 8000 | 0.0003 | 0.0003 | -0.0504 | 0.8022 |
| 6.3133 | 8100 | 0.0003 | - | - | - |
| 6.3913 | 8200 | 0.0003 | - | - | - |
| 6.4692 | 8300 | 0.0003 | - | - | - |
| 6.5472 | 8400 | 0.0003 | - | - | - |
| 6.6251 | 8500 | 0.0003 | - | - | - |
| 6.7030 | 8600 | 0.0003 | - | - | - |
| 6.7810 | 8700 | 0.0003 | - | - | - |
| 6.8589 | 8800 | 0.0003 | - | - | - |
| 6.9369 | 8900 | 0.0003 | - | - | - |
| 7.0148 | 9000 | 0.0003 | - | - | - |
| 7.0928 | 9100 | 0.0003 | - | - | - |
| 7.1707 | 9200 | 0.0003 | - | - | - |
| 7.2486 | 9300 | 0.0003 | - | - | - |
| 7.3266 | 9400 | 0.0003 | - | - | - |
| 7.4045 | 9500 | 0.0003 | - | - | - |
| 7.4825 | 9600 | 0.0003 | - | - | - |
| 7.5604 | 9700 | 0.0003 | - | - | - |
| 7.6383 | 9800 | 0.0003 | - | - | - |
| 7.7163 | 9900 | 0.0003 | - | - | - |
| 7.7942 | 10000 | 0.0003 | 0.0003 | -0.0497 | 0.8034 |
| 7.8722 | 10100 | 0.0003 | - | - | - |
| 7.9501 | 10200 | 0.0003 | - | - | - |
| 8.0281 | 10300 | 0.0003 | - | - | - |
| 8.1060 | 10400 | 0.0003 | - | - | - |
| 8.1839 | 10500 | 0.0003 | - | - | - |
| 8.2619 | 10600 | 0.0003 | - | - | - |
| 8.3398 | 10700 | 0.0003 | - | - | - |
| 8.4178 | 10800 | 0.0003 | - | - | - |
| 8.4957 | 10900 | 0.0003 | - | - | - |
| 8.5737 | 11000 | 0.0003 | - | - | - |
| 8.6516 | 11100 | 0.0003 | - | - | - |
| 8.7295 | 11200 | 0.0003 | - | - | - |
| 8.8075 | 11300 | 0.0003 | - | - | - |
| 8.8854 | 11400 | 0.0003 | - | - | - |
| 8.9634 | 11500 | 0.0003 | - | - | - |
| 9.0413 | 11600 | 0.0003 | - | - | - |
| 9.1193 | 11700 | 0.0003 | - | - | - |
| 9.1972 | 11800 | 0.0003 | - | - | - |
| 9.2751 | 11900 | 0.0003 | - | - | - |
| 9.3531 | 12000 | 0.0003 | 0.0003 | -0.0495 | 0.8049 |
| 9.4310 | 12100 | 0.0003 | - | - | - |
| 9.5090 | 12200 | 0.0003 | - | - | - |
| 9.5869 | 12300 | 0.0003 | - | - | - |
| 9.6648 | 12400 | 0.0003 | - | - | - |
| 9.7428 | 12500 | 0.0003 | - | - | - |
| 9.8207 | 12600 | 0.0003 | - | - | - |
| 9.8987 | 12700 | 0.0003 | - | - | - |
| 9.9766 | 12800 | 0.0003 | - | - | - |
| 10.0546 | 12900 | 0.0003 | - | - | - |
| 10.1325 | 13000 | 0.0003 | - | - | - |
| 10.2104 | 13100 | 0.0003 | - | - | - |
| 10.2884 | 13200 | 0.0003 | - | - | - |
| 10.3663 | 13300 | 0.0003 | - | - | - |
| 10.4443 | 13400 | 0.0003 | - | - | - |
| 10.5222 | 13500 | 0.0003 | - | - | - |
| 10.6002 | 13600 | 0.0003 | - | - | - |
| 10.6781 | 13700 | 0.0003 | - | - | - |
| 10.7560 | 13800 | 0.0003 | - | - | - |
| 10.8340 | 13900 | 0.0003 | - | - | - |
| **10.9119** | **14000** | **0.0003** | **0.0003** | **-0.0492** | **0.8058** |
| 10.9899 | 14100 | 0.0003 | - | - | - |
| 11.0678 | 14200 | 0.0003 | - | - | - |
| 11.1458 | 14300 | 0.0003 | - | - | - |
| 11.2237 | 14400 | 0.0003 | - | - | - |
| 11.3016 | 14500 | 0.0003 | - | - | - |
| 11.3796 | 14600 | 0.0003 | - | - | - |
| 11.4575 | 14700 | 0.0003 | - | - | - |
| 11.5355 | 14800 | 0.0003 | - | - | - |
| 11.6134 | 14900 | 0.0003 | - | - | - |
| 11.6913 | 15000 | 0.0003 | - | - | - |
| 11.7693 | 15100 | 0.0003 | - | - | - |
| 11.8472 | 15200 | 0.0003 | - | - | - |
| 11.9252 | 15300 | 0.0003 | - | - | - |
| 12.0 | 15396 | - | 0.0003 | -0.0492 | 0.8058 |

</details>

- The bold row denotes the saved checkpoint.
### Framework Versions
- Python: 3.13.7
- Sentence Transformers: 5.3.0
- Transformers: 4.55.2
- PyTorch: 2.10.0+cu128
- Accelerate: 1.13.0
- Datasets: 4.7.0
- Tokenizers: 0.21.4
## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
#### MSELoss

```bibtex
@inproceedings{reimers-2020-multilingual-sentence-bert,
    title = "Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2020",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/2004.09813",
}
```