---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- dense
- generated_from_trainer
- dataset_size:1275
- loss:MultipleNegativesRankingLoss
base_model: sentence-transformers/all-mpnet-base-v2
widget:
- source_sentence: snickers almond
  sentences:
  - Cheetos Flamin' Hot
  - Snickers Almond
  - Tostitos Hint of Lime
- source_sentence: hershey's special dark
  sentences:
  - Hershey's Special Dark Chocolate Bar
  - 5-Hour Energy Shot
  - Hershey's Milk Chocolate Bar
- source_sentence: goldfish classic
  sentences:
  - 3 Musketeers Bar
  - Goldfish Crackers
  - Hot Pockets
- source_sentence: skittles
  sentences:
  - Black Tea
  - Skittles
  - Chips Ahoy! Chewy Cookies
- source_sentence: cheddar cheese
  sentences:
  - Cucumber
  - Cheddar Cheese Block
  - Coffee-mate Creamer
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---
# SentenceTransformer based on sentence-transformers/all-mpnet-base-v2
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. This fine-tune is intended to pull short, informal snack/food queries closer to their canonical product names (e.g., "chips" -> "potato chips").
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2)
- **Maximum Sequence Length:** 384 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 384, 'do_lower_case': False, 'architecture': 'MPNetModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
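As a quick sanity check, the modules above can be inspected directly after loading; a minimal sketch (the printed reprs should match the listing above):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Weike1000/Snack_Embed")

# A SentenceTransformer is a module container; index into it to inspect each stage
print(model[0])  # Transformer (MPNet backbone, max_seq_length=384)
print(model[1])  # Pooling (mean pooling over token embeddings)
print(model[2])  # Normalize (L2-normalizes the sentence embedding)
print(model.get_sentence_embedding_dimension())  # 768
```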
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Weike1000/Snack_Embed")
# Run inference
sentences = [
    'cheddar cheese',
    'Cheddar Cheese Block',
    'Cucumber',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.9452, 0.1340],
# [0.9452, 1.0000, 0.1356],
# [0.1340, 0.1356, 1.0000]])
```
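Because the model was trained to match informal queries to product names, a typical use is ranking a catalog of canonical names against a free-text query. A minimal sketch (the catalog below is hypothetical, reusing names from the widget examples):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Weike1000/Snack_Embed")

# Hypothetical catalog of canonical product names
catalog = [
    "Snickers Almond",
    "Cheetos Flamin' Hot",
    "Goldfish Crackers",
    "Cheddar Cheese Block",
]
catalog_embeddings = model.encode(catalog)

query_embedding = model.encode(["goldfish classic"])

# Cosine similarity of the query against every catalog entry
scores = model.similarity(query_embedding, catalog_embeddings)[0]
best = int(scores.argmax())
print(catalog[best], float(scores[best]))
# Expected to rank 'Goldfish Crackers' first
```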
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 1,275 training samples
* Columns: `sentence_0` and `sentence_1`
* Approximate statistics based on the first 1000 samples:

| | sentence_0 | sentence_1 |
|:--------|:-------------------------------------------------|:-------------------------------------------------|
| type | string | string |
| details | min: 3 tokens, mean: 5.33 tokens, max: 11 tokens | min: 3 tokens, mean: 6.4 tokens, max: 15 tokens |
* Samples:
| sentence_0 | sentence_1 |
|:------------------------------|:------------------------------------------------|
| fudge stripes | Keebler Fudge Stripes Cookies |
| gummy bears bag | Gummy Bears |
| kind bar caramel | Kind Bar Caramel Almond & Sea Salt |
* Loss: [MultipleNegativesRankingLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
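Conceptually, this loss treats every other positive in a batch as an in-batch negative: with the parameters above, cosine similarities are scaled by 20 and cross-entropy pushes each anchor toward its own positive. A standalone sketch of that computation (not the library implementation; embeddings are assumed L2-normalized, which the `Normalize()` module guarantees here):

```python
import torch
import torch.nn.functional as F

def mnrl(anchors: torch.Tensor, positives: torch.Tensor, scale: float = 20.0) -> torch.Tensor:
    # For L2-normalized embeddings, the dot product is the cosine similarity:
    # scores[i, j] = scaled similarity of anchor i with positive j
    scores = anchors @ positives.T * scale
    # The correct positive for anchor i sits on the diagonal (index i);
    # all off-diagonal entries act as in-batch negatives
    labels = torch.arange(scores.size(0), device=scores.device)
    return F.cross_entropy(scores, labels)
```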
### Training Hyperparameters
#### Non-Default Hyperparameters
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `num_train_epochs`: 1000
- `multi_dataset_batch_sampler`: round_robin
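Putting the loss and these non-default hyperparameters together, a fine-tune like this one could be reproduced roughly as follows. This is a sketch under assumptions: the pair data and `output_dir` are placeholders; only the base model, loss, and hyperparameters above come from this card.

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")

# Placeholder (query, product name) pairs; the real dataset has 1,275 rows
train_dataset = Dataset.from_dict({
    "sentence_0": ["fudge stripes", "gummy bears bag"],
    "sentence_1": ["Keebler Fudge Stripes Cookies", "Gummy Bears"],
})

args = SentenceTransformerTrainingArguments(
    output_dir="snack-embed",  # placeholder
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=1000,
    multi_dataset_batch_sampler="round_robin",
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=MultipleNegativesRankingLoss(model),
)
trainer.train()
```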
#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1
- `num_train_epochs`: 1000
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `tp_size`: 0
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: round_robin
- `router_mapping`: {}
- `learning_rate_mapping`: {}

</details>
### Training Logs
<details><summary>Click to expand</summary>

| Epoch | Step | Training Loss |
|:------:|:-----:|:-------------:|
| 6.25 | 500 | 0.0756 |
| 12.5 | 1000 | 0.0396 |
| 18.75 | 1500 | 0.033 |
| 25.0 | 2000 | 0.0283 |
| 31.25 | 2500 | 0.0257 |
| 37.5 | 3000 | 0.0249 |
| 43.75 | 3500 | 0.0248 |
| 50.0 | 4000 | 0.019 |
| 56.25 | 4500 | 0.0242 |
| 62.5 | 5000 | 0.0203 |
| 68.75 | 5500 | 0.0205 |
| 75.0 | 6000 | 0.0225 |
| 81.25 | 6500 | 0.0183 |
| 87.5 | 7000 | 0.0227 |
| 93.75 | 7500 | 0.0224 |
| 100.0 | 8000 | 0.022 |
| 106.25 | 8500 | 0.0244 |
| 112.5 | 9000 | 0.0231 |
| 118.75 | 9500 | 0.021 |
| 125.0 | 10000 | 0.0215 |
| 131.25 | 10500 | 0.0166 |
| 137.5 | 11000 | 0.0186 |
| 143.75 | 11500 | 0.0211 |
| 150.0 | 12000 | 0.0208 |
| 156.25 | 12500 | 0.0214 |
| 162.5 | 13000 | 0.0207 |
| 168.75 | 13500 | 0.0216 |
| 175.0 | 14000 | 0.0214 |
| 181.25 | 14500 | 0.0209 |
| 187.5 | 15000 | 0.0197 |
| 193.75 | 15500 | 0.022 |
| 200.0 | 16000 | 0.0183 |
| 206.25 | 16500 | 0.0189 |
| 212.5 | 17000 | 0.0188 |
| 218.75 | 17500 | 0.0163 |
| 225.0 | 18000 | 0.0209 |
| 231.25 | 18500 | 0.0185 |
| 237.5 | 19000 | 0.0211 |
| 243.75 | 19500 | 0.02 |
| 250.0 | 20000 | 0.0206 |
| 256.25 | 20500 | 0.0222 |
| 262.5 | 21000 | 0.0185 |
| 268.75 | 21500 | 0.0205 |
| 275.0 | 22000 | 0.0165 |
| 281.25 | 22500 | 0.0185 |
| 287.5 | 23000 | 0.0164 |
| 293.75 | 23500 | 0.0191 |
| 300.0 | 24000 | 0.0197 |
| 306.25 | 24500 | 0.0195 |
| 312.5 | 25000 | 0.0185 |
| 318.75 | 25500 | 0.017 |
| 325.0 | 26000 | 0.0184 |
| 331.25 | 26500 | 0.0184 |
| 337.5 | 27000 | 0.0211 |
| 343.75 | 27500 | 0.0182 |
| 350.0 | 28000 | 0.0189 |
| 356.25 | 28500 | 0.0172 |
| 362.5 | 29000 | 0.0195 |
| 368.75 | 29500 | 0.0221 |
| 375.0 | 30000 | 0.0197 |
| 381.25 | 30500 | 0.0228 |
| 387.5 | 31000 | 0.0173 |
| 393.75 | 31500 | 0.0191 |
| 400.0 | 32000 | 0.0203 |
| 406.25 | 32500 | 0.0202 |
| 412.5 | 33000 | 0.0186 |
| 418.75 | 33500 | 0.0178 |
| 425.0 | 34000 | 0.018 |
| 431.25 | 34500 | 0.0192 |
| 437.5 | 35000 | 0.0186 |
| 443.75 | 35500 | 0.0211 |
| 450.0 | 36000 | 0.0209 |
| 456.25 | 36500 | 0.0216 |
| 462.5 | 37000 | 0.0201 |
| 468.75 | 37500 | 0.0227 |
| 475.0 | 38000 | 0.02 |
| 481.25 | 38500 | 0.018 |
| 487.5 | 39000 | 0.0218 |
| 493.75 | 39500 | 0.0237 |
| 500.0 | 40000 | 0.0208 |
| 506.25 | 40500 | 0.0185 |
| 512.5 | 41000 | 0.0188 |
| 518.75 | 41500 | 0.0188 |
| 525.0 | 42000 | 0.0168 |
| 531.25 | 42500 | 0.017 |
| 537.5 | 43000 | 0.0165 |
| 543.75 | 43500 | 0.0197 |
| 550.0 | 44000 | 0.0159 |
| 556.25 | 44500 | 0.0224 |
| 562.5 | 45000 | 0.0179 |
| 568.75 | 45500 | 0.0188 |
| 575.0 | 46000 | 0.0203 |
| 581.25 | 46500 | 0.018 |
| 587.5 | 47000 | 0.0195 |
| 593.75 | 47500 | 0.0194 |
| 600.0 | 48000 | 0.0205 |
| 606.25 | 48500 | 0.0185 |
| 612.5 | 49000 | 0.0208 |
| 618.75 | 49500 | 0.0205 |
| 625.0 | 50000 | 0.0201 |
| 631.25 | 50500 | 0.0175 |
| 637.5 | 51000 | 0.0171 |
| 643.75 | 51500 | 0.0184 |
| 650.0 | 52000 | 0.0228 |
| 656.25 | 52500 | 0.0203 |
| 662.5 | 53000 | 0.0222 |
| 668.75 | 53500 | 0.0188 |
| 675.0 | 54000 | 0.0235 |
| 681.25 | 54500 | 0.0182 |
| 687.5 | 55000 | 0.0215 |
| 693.75 | 55500 | 0.018 |
| 700.0 | 56000 | 0.0227 |
| 706.25 | 56500 | 0.0185 |
| 712.5 | 57000 | 0.0179 |
| 718.75 | 57500 | 0.0176 |
| 725.0 | 58000 | 0.0233 |
| 731.25 | 58500 | 0.0213 |
| 737.5 | 59000 | 0.0208 |
| 743.75 | 59500 | 0.015 |
| 750.0 | 60000 | 0.0199 |
| 756.25 | 60500 | 0.0197 |
| 762.5 | 61000 | 0.0199 |
| 768.75 | 61500 | 0.0209 |
| 775.0 | 62000 | 0.0185 |
| 781.25 | 62500 | 0.0183 |
| 787.5 | 63000 | 0.0169 |
| 793.75 | 63500 | 0.0176 |
| 800.0 | 64000 | 0.0206 |
| 806.25 | 64500 | 0.0186 |
| 812.5 | 65000 | 0.0181 |
| 818.75 | 65500 | 0.0179 |
| 825.0 | 66000 | 0.0184 |
| 831.25 | 66500 | 0.0157 |
| 837.5 | 67000 | 0.0181 |
| 843.75 | 67500 | 0.0174 |
| 850.0 | 68000 | 0.0185 |
| 856.25 | 68500 | 0.0213 |
| 862.5 | 69000 | 0.0181 |
| 868.75 | 69500 | 0.02 |
| 875.0 | 70000 | 0.0141 |
| 881.25 | 70500 | 0.0168 |
| 887.5 | 71000 | 0.0218 |
| 893.75 | 71500 | 0.0188 |
| 900.0 | 72000 | 0.0139 |
| 906.25 | 72500 | 0.0188 |
| 912.5 | 73000 | 0.022 |
| 918.75 | 73500 | 0.0154 |
| 925.0 | 74000 | 0.0165 |
| 931.25 | 74500 | 0.0186 |
| 937.5 | 75000 | 0.0191 |
| 943.75 | 75500 | 0.0188 |
| 950.0 | 76000 | 0.0176 |
| 956.25 | 76500 | 0.0218 |
| 962.5 | 77000 | 0.0185 |
| 968.75 | 77500 | 0.0193 |
| 975.0 | 78000 | 0.0218 |
| 981.25 | 78500 | 0.0161 |
| 987.5 | 79000 | 0.0216 |
| 993.75 | 79500 | 0.0225 |
| 1000.0 | 80000 | 0.0194 |

</details>
### Framework Versions
- Python: 3.9.6
- Sentence Transformers: 5.0.0
- Transformers: 4.51.3
- PyTorch: 2.7.0
- Accelerate: 1.7.0
- Datasets: 4.0.0
- Tokenizers: 0.21.1
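To reproduce this environment, the versions above can be pinned (a sketch; for inference only, `sentence-transformers` with its own dependencies is sufficient):

```bash
pip install "sentence-transformers==5.0.0" "transformers==4.51.3" "torch==2.7.0" \
    "accelerate==1.7.0" "datasets==4.0.0" "tokenizers==0.21.1"
```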
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```