---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- dense
- generated_from_trainer
- dataset_size:111470
- loss:MultipleNegativesRankingLoss
base_model: thenlper/gte-small
widget:
- source_sentence: why are some rocks radioactive
  sentences:
  - Radioactive accessory minerals such as zircon may contribute to the radioactivity of a mineral which is otherwise non-radioactive by calculation. Many granites or other igneous rocks contain some radioactivity because of minor, but highly radioactive, accessory minerals.re = mineral density (S Atomic number / Molecular Weight) where re is the electron density in grams/cc.efinition. Radioactivity in minerals are caused by the inclusion of naturally-occurring radioactive elements in the mineral's composition. The degree of radioactivity is dependent on the concentration and isotope present in the mineral.
  - Taking B-complex vitamins, which include vitamin B12, can cause urine to have a bright yellow or even orange color, but check with your doctor to be sure that's what is going on in your case. B vitamins are water-soluble vitamins, which means that what your body doesn't use is excreted in your urine. Riboflavin (vitamin B2) is especially likely to cause this color change in urine. Several medications can also turn urine a bright yellow or orange color. Changes in urine color may also signal certain health problems.
  - Radioactive material is just another name for a group of unstable atoms that emit ionizing radiation. These groups of unstable atoms emit radiation because they try to become stable. Radioactive materials emit radiation in a process called radioactive decay.
- source_sentence: How was your experience of Lucid dreaming at home?
  sentences:
  - How was your experience of Lucid dreaming at home?
  - How was your experience of Lucid dreaming outside the home?
  - Bournemouth /ˈbɔərnməθ/ is a large coastal resort town on the south coast of England directly to the east of the Jurassic Coast, a 96-mile (155 km) World Heritage Site. According to the 2011 census, the town has a population of 183,491 making it the largest settlement in Dorset.he Bournemouth Eye is a helium-filled balloon attached to a steel cable in the town's lower gardens. The spherical balloon is 69 m (226 ft) in circumference and carries an enclosed, steel gondola. Rising to a height of 150 m (492 ft), it provides a panoramic view of the surrounding area for up to 28 passengers.
- source_sentence: what is iraq's dominant religion
  sentences:
  - 'If you are working, consider taking maternity leave as early as you can. This makes sense anyway because carrying twins is hard work, and most twins arrive earlier than single babies (NCCWCH 2011: 128) . More than half of twins arrive early, before 37 weeks (NCCWCH 2011: 120, Tamba 2012) .Talk to your midwife or doctor if you are feeling down about your pregnancy (NICE 2011) .f you are working, consider taking maternity leave as early as you can. This makes sense anyway because carrying twins is hard work, and most twins arrive earlier than single babies (NCCWCH 2011: 128) . More than half of twins arrive early, before 37 weeks (NCCWCH 2011: 120, Tamba 2012) .'
  - Introduction. Although Iran's state religion is Shiite Islam and the majority of its population is ethnically Persian, millions of minorities from various ethnic, religious, and linguistic backgrounds also reside in Iran. Among these groups are ethnic Kurds, Baluchis, and Azeris.lthough Iran's state religion is Shiite Islam and the majority of its population is ethnically Persian, millions of minorities from various ethnic, religious, and linguistic backgrounds also reside in Iran.
  - In today's Republic of Iraq, where Islam is the state religion and claims the beliefs of 95 percent of the population, the majority of Iraqis identify with Arab culture. The second-largest cultural group is the Kurds, who are in the highlands and mountain valleys of the north in a politically autonomous settlement.
- source_sentence: how many years of education are needed to become a pediatric nurse
  sentences:
  - In terms of educational background, pediatric nurse requirements include either an Associate's or a Bachelor's degree in Nursing. An Associate's degree (ADN) typically takes two years to complete, while a Bachelor's degree (BSN) typically takes four years. ADN programs are usually offered by community colleges.
  - Photo of Oxford Suites Sonoma County - Rohnert Park - Rohnert Park, CA, United States Photo of Oxford Suites Sonoma County - Rohnert Park - Rohnert Park, CA, United States Living area with king bed by Monique' M. "And there's a complimentary reception with 2 drinks, soup and salad bar nightly." in 2 reviews
  - 'From there, additional training specific to the care of children is required. Pediatric nurses can become certified in the field and may choose to further specialize in a particular area. Program Levels: Associate''s degree, bachelor''s degree.'
- source_sentence: Schliemann recognized five shafts and cleared them like the graves mentioned by Pausanias .
  sentences:
  - IBM banned the usage of the POWER5+ in its System p5 510Q, 520Q, 550Q and 560Q servers.
  - Schliemann cleared five shafts and recognized them as the graves mentioned by Pausania .
  - Schliemann recognized five shafts and cleared them like the graves mentioned by Pausanias .
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: SentenceTransformer based on thenlper/gte-small
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoMSMARCO
      type: NanoMSMARCO
    metrics:
    - type: cosine_accuracy@1
      value: 0.3
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.54
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.62
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.76
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.3
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.18
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.124
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.07600000000000001
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.3
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.54
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.62
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.76
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.5241190384704345
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.4492698412698413
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.45777964902887497
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoNQ
      type: NanoNQ
    metrics:
    - type: cosine_accuracy@1
      value: 0.38
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.52
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.54
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.68
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.38
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.1733333333333333
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.11200000000000002
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.07200000000000001
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.35
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.49
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.52
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.66
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.5017561161582912
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.46857142857142864
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.4585943213547632
      name: Cosine Map@100
  - task:
      type: nano-beir
      name: Nano BEIR
    dataset:
      name: NanoBEIR mean
      type: NanoBEIR_mean
    metrics:
    - type: cosine_accuracy@1
      value: 0.33999999999999997
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.53
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.5800000000000001
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.72
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.33999999999999997
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.17666666666666664
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.11800000000000001
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.07400000000000001
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.32499999999999996
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.515
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.5700000000000001
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.71
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.5129375773143628
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.45892063492063495
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.4581869851918191
      name: Cosine Map@100
---

# SentenceTransformer based on thenlper/gte-small

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [thenlper/gte-small](https://huggingface.co/thenlper/gte-small). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [thenlper/gte-small](https://huggingface.co/thenlper/gte-small)
- **Maximum Sequence Length:** 128 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity

### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/huggingface/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("redis/model-b-structured")
# Run inference
sentences = [
    'Schliemann recognized five shafts and cleared them like the graves mentioned by Pausanias .',
    'Schliemann recognized five shafts and cleared them like the graves mentioned by Pausanias .',
    'Schliemann cleared five shafts and recognized them as the graves mentioned by Pausania .',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 1.0000, 0.9779],
#         [1.0000, 1.0000, 0.9779],
#         [0.9779, 0.9779, 1.0000]])
```

## Evaluation

### Metrics

#### Information Retrieval

* Datasets: `NanoMSMARCO` and `NanoNQ`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | NanoMSMARCO | NanoNQ     |
|:--------------------|:------------|:-----------|
| cosine_accuracy@1   | 0.3         | 0.38       |
| cosine_accuracy@3   | 0.54        | 0.52       |
| cosine_accuracy@5   | 0.62        | 0.54       |
| cosine_accuracy@10  | 0.76        | 0.68       |
| cosine_precision@1  | 0.3         | 0.38       |
| cosine_precision@3  | 0.18        | 0.1733     |
| cosine_precision@5  | 0.124       | 0.112      |
| cosine_precision@10 | 0.076       | 0.072      |
| cosine_recall@1     | 0.3         | 0.35       |
| cosine_recall@3     | 0.54        | 0.49       |
| cosine_recall@5     | 0.62        | 0.52       |
| cosine_recall@10    | 0.76        | 0.66       |
| **cosine_ndcg@10**  | **0.5241**  | **0.5018** |
| cosine_mrr@10       | 0.4493      | 0.4686     |
| cosine_map@100      | 0.4578      | 0.4586     |

#### Nano BEIR

* Dataset: `NanoBEIR_mean`
* Evaluated with [NanoBEIREvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.NanoBEIREvaluator) with these parameters:
  ```json
  {
      "dataset_names": [
          "msmarco",
          "nq"
      ],
      "dataset_id": "lightonai/NanoBEIR-en"
  }
  ```

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.34       |
| cosine_accuracy@3   | 0.53       |
| cosine_accuracy@5   | 0.58       |
| cosine_accuracy@10  | 0.72       |
| cosine_precision@1  | 0.34       |
| cosine_precision@3  | 0.1767     |
| cosine_precision@5  | 0.118      |
| cosine_precision@10 | 0.074      |
| cosine_recall@1     | 0.325      |
| cosine_recall@3     | 0.515      |
| cosine_recall@5     | 0.57       |
| cosine_recall@10    | 0.71       |
| **cosine_ndcg@10**  | **0.5129** |
| cosine_mrr@10       | 0.4589     |
| cosine_map@100      | 0.4582     |
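If you want to re-run this evaluation yourself, the snippet below is a minimal sketch of how the Nano BEIR numbers above could be reproduced. It assumes network access to the NanoBEIR datasets and passes only `dataset_names`; the `dataset_id` shown in the parameters above is left at whatever default your installed Sentence Transformers version uses, so exact behavior may vary across versions.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import NanoBEIREvaluator

# Minimal sketch: re-run the Nano BEIR evaluation reported in this card.
model = SentenceTransformer("redis/model-b-structured")
evaluator = NanoBEIREvaluator(dataset_names=["msmarco", "nq"])
results = evaluator(model)

# Result keys follow the `Nano..._cosine_<metric>` naming used in the tables above.
print(results["NanoBEIR_mean_cosine_ndcg@10"])
```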
## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 111,470 training samples
* Columns: anchor, positive, and negative
* Approximate statistics based on the first 1000 samples:

  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details |        |          |          |

* Samples:

  | anchor | positive | negative |
  |:-------|:---------|:---------|
  | how far is sandos caracol eco resort from cancun airport | The Sandos Caracol Eco Resort is 2 miles from the Church of Guadalupe and a 45-minute drive from Cancun Cancún. Airport The Gran Coral Golf Riviera maya is located within the same estate as The. Sandos we speak your! Language Hotel: rooms, 680 Hotel: Chain Sandos & Hotels. resorts | Featuring a spa, 8 restaurants and 2 outdoor pools, Sandos Caracol Eco Resort is set on Playa del Carmen Beach, overlooking Cozumel Island. Its rooms have balconies overlooking the Caribbean Sea. Sandos Caracol Eco Resort is in beautiful gardens and features bright accommodations. |
  | can eggs expire | Here is a link from Georgia Eggs Commission about eggs and expiration dates. The following is from Swedish Medical Center Eggs: If you ve purchased a carton of eggs before the date expires, you should be able to use them safely for three to five weeks after expiration.ere is a link from Georgia Eggs Commission about eggs and expiration dates. The following is from Swedish Medical Center Eggs: If you ve purchased a carton of eggs before the date expires, you should be able to use them safely for three to five weeks after expiration. | The answer to this question may surprise you: while uncooked eggs typically last four to five weeks when properly refrigerated, hard-boiled eggs will only last about a week. This is because egg shells, which are highly porous, are sprayed before sale with a thin coating of mineral oil that seals the egg. |
  | how old are first graders? | First Grade Worksheets Online. 6 and 7 year old kids get their first taste of real schooling in first grade. Help children learn the basics in math, reading, language and science with our printable first grade worksheets. Spelling Worksheets for 1st Grade. | Average BMI percentile-for-age values were 59.5 (28.8) for first-graders, 59.5 (30.5) for third-graders, and 62.4 (31.7) for fifth-graders. The number of participants classified as obese was 144 (25.6% of first-graders, 28.5% of third-graders, and 34.5% of fifth-graders). The percentage of students who reported a reasonable height or weight ranged from 20% (first grade, height) to 92% (fifth grade, weight) (Table). In general, self-report ability was better in older children and when self-reporting weight. |
* Loss: [MultipleNegativesRankingLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 7.0,
      "similarity_fct": "cos_sim",
      "gather_across_devices": false
  }
  ```
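For reference, here is a minimal sketch of how (anchor, positive, negative) triplets with these column names pair with the loss configured above. The example rows are made-up placeholders, not samples from the actual training set; `cos_sim` is the loss's default similarity function, matching the parameters shown.

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MultipleNegativesRankingLoss

# Illustrative placeholder triplets with the column names used by this dataset.
train_dataset = Dataset.from_dict({
    "anchor": ["why are some rocks radioactive"],
    "positive": ["Some igneous rocks contain highly radioactive accessory minerals."],
    "negative": ["B-complex vitamins can turn urine bright yellow."],
})

model = SentenceTransformer("thenlper/gte-small")
# scale=7.0 matches the reported parameters; with this loss, every other
# positive and negative in the batch also serves as an in-batch negative
# for each anchor.
loss = MultipleNegativesRankingLoss(model, scale=7.0)
```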
### Evaluation Dataset

#### Unnamed Dataset

* Size: 12,386 evaluation samples
* Columns: anchor, positive, and negative
* Approximate statistics based on the first 1000 samples:

  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details |        |          |          |

* Samples:

  | anchor | positive | negative |
  |:-------|:---------|:---------|
  | In 1883 , the first schools were built in the vicinity for 400 white and 60 black students . | In 1883 , the first schools were built in the vicinity for 400 white and 60 black students . | In 1883 , the first schools in the area were built for 400 black students and 60 white students . |
  | what is the origin of the name haja | Haja is a Muslim baby Girl name, it is an Arabic originated name. Haja name meaning is In the heart condition through and the lucky number associated with Haja is 5. Find all the relevant details about the Haja Meaning, Origin, Lucky Number and Religion from this page. Average rating of Haja is 1 stars, based on 0 reviews. | Synonomis with the exclamation commonly used in urban circles Holla. Haba is derived from the term, Holla Bitches, which became Haba Litches, which eventually evolved to Habalicious, and finally became just Haba. When seeing a fine female passing by, Russell exclaimed, Haba. |
  | what causes itch rash | A rash is a noticeable change in the texture or color of the skin. The skin may become itchy, bumpy, chapped, scaly, or otherwise irritated. Rashes are caused by a wide range of conditions, including allergies, medication, cosmetics, and various diseases. The rash is often reddish and itchy, with a scaly texture. 2 bug bites: tick bites are of particular concern, as they can transmit disease. 3 psoriasis: a scaly, itchy, red rash that forms along the scalp and joints. 4 dandruff: an itchy, flaky rash on the scalp. | Causes of Similar Symptoms to Behind knee rash. Research the causes of these symptoms that are similar to, or related to, the symptom Behind knee rash: 1 Behind knee itch (14 causes). 2 Knee rash (18 causes).3 Knee pain (122 causes). 4 Knee tingling (6 causes). 5 Knee symptoms (149 causes). 6 Skin itch (1068 causes). 7 Skin rash (461 causes). 8 Insect bite.auses of Similar Symptoms to Behind knee rash. Research the causes of these symptoms that are similar to, or related to, the symptom Behind knee rash: 1 Behind knee itch (14 causes). 2 Knee rash (18 causes). |

* Loss: [MultipleNegativesRankingLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 7.0,
      "similarity_fct": "cos_sim",
      "gather_across_devices": false
  }
  ```

### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `learning_rate`: 1e-06
- `weight_decay`: 0.001
- `max_steps`: 3000
- `warmup_ratio`: 0.1
- `fp16`: True
- `dataloader_drop_last`: True
- `dataloader_num_workers`: 1
- `dataloader_prefetch_factor`: 1
- `load_best_model_at_end`: True
- `optim`: adamw_torch
- `ddp_find_unused_parameters`: False
- `push_to_hub`: True
- `hub_model_id`: redis/model-b-structured
- `eval_on_start`: True

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 1e-06
- `weight_decay`: 0.001
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3.0
- `max_steps`: 3000
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: True
- `dataloader_num_workers`: 1
- `dataloader_prefetch_factor`: 1
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `parallelism_config`: None
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `project`: huggingface
- `trackio_space_id`: trackio
- `ddp_find_unused_parameters`: False
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: True
- `resume_from_checkpoint`: None
- `hub_model_id`: redis/model-b-structured
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `hub_revision`: None
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: no
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: True
- `use_liger_kernel`: False
- `liger_kernel_config`: None
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: True
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
- `router_mapping`: {}
- `learning_rate_mapping`: {}

</details>
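Taken together, the non-default hyperparameters above correspond to a training setup along the lines of the sketch below. This is a reconstruction, not the original training script: the tiny placeholder datasets stand in for the unnamed 111,470-triplet training set and 12,386-triplet evaluation set described earlier, and Hub- and DDP-related options are omitted.

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("thenlper/gte-small")
loss = MultipleNegativesRankingLoss(model, scale=7.0)

# Placeholder triplets so the sketch runs end to end; substitute the real
# anchor/positive/negative datasets described in the sections above.
train_ds = Dataset.from_dict({
    "anchor": ["can eggs expire"] * 256,
    "positive": ["Eggs can be used for weeks past the carton date."] * 256,
    "negative": ["Hard-boiled eggs last about a week."] * 256,
})
eval_ds = train_ds

# Mirrors the non-default hyperparameters listed in this card.
args = SentenceTransformerTrainingArguments(
    output_dir="model-b-structured",
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=1e-6,
    weight_decay=0.001,
    max_steps=3000,
    warmup_ratio=0.1,
    fp16=True,
    dataloader_drop_last=True,
    dataloader_num_workers=1,
    load_best_model_at_end=True,
    eval_on_start=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    loss=loss,
)
trainer.train()
```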
### Training Logs

| Epoch  | Step | Training Loss | Validation Loss | NanoMSMARCO_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
|:------:|:----:|:-------------:|:---------------:|:--------------------------:|:---------------------:|:----------------------------:|
| 0      | 0    | -             | 4.0678          | 0.6259                     | 0.6583                | 0.6421                       |
| 0.2874 | 250  | 4.2246        | 3.8520          | 0.6117                     | 0.6465                | 0.6291                       |
| 0.5747 | 500  | 3.8138        | 3.1367          | 0.6062                     | 0.6457                | 0.6260                       |
| 0.8621 | 750  | 2.9174        | 1.8442          | 0.5837                     | 0.5594                | 0.5715                       |
| 1.1494 | 1000 | 1.8256        | 1.2096          | 0.5462                     | 0.4989                | 0.5226                       |
| 1.4368 | 1250 | 1.4465        | 1.0779          | 0.5347                     | 0.4650                | 0.4998                       |
| 1.7241 | 1500 | 1.3307        | 1.0331          | 0.5358                     | 0.4801                | 0.5079                       |
| 2.0115 | 1750 | 1.2785        | 1.0094          | 0.5359                     | 0.4848                | 0.5104                       |
| 2.2989 | 2000 | 1.249         | 0.9957          | 0.5282                     | 0.4860                | 0.5071                       |
| 2.5862 | 2250 | 1.228         | 0.9865          | 0.5245                     | 0.4939                | 0.5092                       |
| 2.8736 | 2500 | 1.2043        | 0.9809          | 0.5235                     | 0.5018                | 0.5126                       |
| 3.1609 | 2750 | 1.208         | 0.9771          | 0.5261                     | 0.5018                | 0.5139                       |
| 3.4483 | 3000 | 1.2008        | 0.9762          | 0.5241                     | 0.5018                | 0.5129                       |

### Framework Versions
- Python: 3.10.18
- Sentence Transformers: 5.2.0
- Transformers: 4.57.3
- PyTorch: 2.9.1+cu128
- Accelerate: 1.12.0
- Datasets: 2.21.0
- Tokenizers: 0.22.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```