metadata
language:
  - en
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - dense
  - generated_from_trainer
  - dataset_size:350048
  - loss:MarginMSELoss
base_model: BAAI/bge-m3
widget:
  - source_sentence: >-
      A typical training station of the Royal Air Force ( not flying ) will have
      the following integrated wing-based structure :
    sentences:
      - >-
        The studios , opened in 2008 , were designed by Zach Hancock and are
        maintained by chief engineer Martin Pilchner .
      - >-
        The approval from the European regulator, EASA, means Blue Islands can
        train their flight and cabin crew to be able to operate and work on
        their ATR aircraft.

        The training will be completed at their operational base in Jersey.

        Rob Veron, managing director of Blue Islands said he was delighted with
        the approval.

        He said: "This is a huge achievement for our operational team as we are
        the only airline in the Channel Islands to have gained this approval.

        "This means we no longer have to send away any of our locally based
        crew, we can be more dynamic with our programmes and further cement our
        roots here in the islands."

        Blue Islands say a typical training programme would include aircraft
        training on the ATR plane, in the simulator and ground-school training.
      - >-
        A typical Royal Air Force training station ( not flying ) will have the
        following integrated wing-based structure :
      - >-
        In 1974 the museum had acquired what is now the Henry Cole wing from the
        Royal College of Science.
  - source_sentence: How does the body naturally remove whiteheads?
    sentences:
      - >-
        The lid was removed only by privileged viewers , so these scenes might
        have been more intimate .
      - >-
        > Whiteheads appear as the body is pushing the 'toxins' out of the body
        to the surface. No, whiteheads are blocked sweat glands or sebaceous
        ducts. They are not "toxins".  > But you're always told not to burst
        them so why go to the surface? The normal function of those glands is to
        exude things onto the surface of the skin, which is why they go to the
        surface. Popping them makes an open wound which can become infected. As
        a general rule of thumb whenever someone starts talking about "toxins"
        you can ignore them as spouting pseudo science nonsense.
      - >-
        The most important apicoplast function is isopentenyl pyrophosphate
        synthesis—in fact, apicomplexans die when something interferes with this
        apicoplast function, and when apicomplexans are grown in an isopentenyl
        pyrophosphate-rich medium, they dump the organelle.
      - >-
        Parts of the immune system of higher organisms create peroxide,
        superoxide, and singlet oxygen to destroy invading microbes.
  - source_sentence: As Frankenstein dies , the monster appears in his room .
    sentences:
      - >-
        Consequently , the transmutation backfires and in law with equivalent
        exchange , Edward's left leg and right arm , and Alphonse's entire body
        are destroyed .
      - >-
        Gabriel takes Farrell and escapes with the data before McClane can reach
        him .
      - Andy sees Robert and Katie together and tells Daz .
      - Frankenstein dies . The monster sees Frankenstein .
  - source_sentence: >-
      Why has the guitar risen to become the "marquee" instrument in nearly all
      popular music?
    sentences:
      - >-
        The Strokes are an American rock band formed in New York City in 1998 ,
        consisting of Julian Casablancas ( lead vocals ) , Nick Valensi ( guitar
        , keyboard , backing vocals ) , Albert Hammond , Jr. ( rhythm guitar ,
        keyboard , backing vocals ) , Nikolai Fraiture ( bass ) and Fabrizio
        Moretti ( drums , percussion ) .Met with wide-spread critical acclaim ,
        the Strokes ' 2001 debut , Is This It in 2001 , helped usher in the
        garage rock revival movement of the early-21st century and ranks number
        eight on Rolling Stone 's 100 Best Debut Albums of All Time , number two
        on Rolling Stone 's 100 Best Albums of the 2000s , and 199 on Rolling
        Stones 500 Greatest Albums of All Time .
      - >-
        They are overlaid with colourless music played live , and consist of
        vague , almost deliberately sparse orchestral sounds .
      - >-
        One main reason is amplification. An electric guitar is *loud* and can
        cut be heard well even when there's a large audience. With the advent of
        Rock and Roll this allowed the electric guitar to replace the saxophone,
        which had been the main lead instrument in Jazz and Rythm and Blues,
        which were the main popular styles at the time. This is discussed in
        Michael Segell's "The Devil's Horn, The Story of the Saxophone"
      - >-
        As explained by my freshman psychology instructor in the sixties:  Lot
        of other instruments, say keyboards, drums, violin are like a barrier or
        pin you down or hide your face. The guitar is like striding around the
        stage waving your big dick at everybody.
  - source_sentence: What year was Jamukha elected Gür Khan?
    sentences:
      - >-
        After several battles, Jamukha was finally turned over to Temüjin by his
        own men in 1206.
      - >-
        In 1991 before 1995 he was a master ( together with Armen Dzhigarkhanyan
        ) acting course at VGIK , he taught at GITIS .
      - >-
        Voters went to the polls in Thailand, five years after the military
        seized power in a coup.
      - >-
        The election held in 1988 saw the advent of the mlolongo (queuing)
        system, where voters were supposed to line up behind their favoured
        candidates instead of a secret ballot.
datasets:
  - bobox/STS_retrieval_dataset_HN_scored
pipeline_tag: sentence-similarity
library_name: sentence-transformers

SentenceTransformer based on BAAI/bge-m3

This is a sentence-transformers model finetuned from BAAI/bge-m3 on the sts_retrieval_dataset_hn_scored dataset. It maps sentences & paragraphs to a 2048-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: BAAI/bge-m3
  • Maximum Sequence Length: 8192 tokens
  • Output Dimensionality: 2048 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset: sts_retrieval_dataset_hn_scored
  • Language: en

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False, 'architecture': 'XLMRobertaModel'})
  (1): CustomPooler(
    (ln_queries): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
    (ln_tokens): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
    (q_proj): Linear(in_features=1024, out_features=2048, bias=True)
    (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
    (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
    (o_proj): Linear(in_features=2048, out_features=1024, bias=True)
    (attn_drop): Dropout(p=0.05, inplace=False)
    (fusion_proj): Linear(in_features=4096, out_features=1024, bias=False)
    (mlp): SwiGLU(
      (gate_proj): Linear(in_features=2048, out_features=3072, bias=True)
      (up_proj): Linear(in_features=2048, out_features=3072, bias=True)
      (down_proj): Linear(in_features=3072, out_features=2048, bias=True)
      (drop): Dropout(p=0.05, inplace=False)
    )
    (output_ln): LayerNorm((2048,), eps=1e-05, elementwise_affine=True)
  )
)
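The module names and shapes above suggest an attention-based pooler using grouped-query attention (GQA): queries are projected to 2048 dimensions while keys and values stay at 1024, i.e. twice as many query heads as key/value heads. The exact forward pass of this CustomPooler is not published with the card; the following is a minimal NumPy sketch of grouped-query attention pooling under assumed head sizes (16 query heads and 8 KV heads of dimension 128, with the first token serving as the pooling query), not the model's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gqa_pool(tokens, Wq, Wk, Wv, n_q_heads=16, n_kv_heads=8, head_dim=128):
    """Grouped-query attention pooling (sketch, not the card's module).

    tokens: (T, 1024) token embeddings; the first token acts as the
    pooling query (an assumption -- the real module may differ).
    Wq: (1024, 2048) and Wk/Wv: (1024, 1024), matching the printed shapes.
    Returns a (2048,) pooled vector (before o_proj / fusion / SwiGLU).
    """
    q = tokens[0] @ Wq                       # (2048,) = 16 heads x 128
    k = tokens @ Wk                          # (T, 1024) = 8 heads x 128
    v = tokens @ Wv
    q = q.reshape(n_q_heads, head_dim)
    k = k.reshape(-1, n_kv_heads, head_dim)  # (T, 8, 128)
    v = v.reshape(-1, n_kv_heads, head_dim)
    group = n_q_heads // n_kv_heads          # 2 query heads share each KV head
    out = np.empty((n_q_heads, head_dim))
    for h in range(n_q_heads):
        kv = h // group                      # index of the shared KV head
        scores = k[:, kv, :] @ q[h] / np.sqrt(head_dim)  # (T,)
        out[h] = softmax(scores) @ v[:, kv, :]
    return out.reshape(-1)                   # (2048,)

rng = np.random.default_rng(0)
T, d = 10, 1024
pooled = gqa_pool(rng.normal(size=(T, d)),
                  rng.normal(size=(d, 2048)) * 0.02,
                  rng.normal(size=(d, d)) * 0.02,
                  rng.normal(size=(d, d)) * 0.02)
print(pooled.shape)  # (2048,)
```

Sharing each KV head across a group of query heads is what keeps the key/value projections at 1024 dimensions while the query projection doubles to 2048.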

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("bobox/custom-pooler-marginmse-v0")
# Run inference
sentences = [
    'What year was Jamukha elected Gür Khan?',
    'After several battles, Jamukha was finally turned over to Temüjin by his own men in 1206.',
    'Voters went to the polls in Thailand, five years after the military seized power in a coup.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 2048]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.9932, 0.9922],
#         [0.9932, 1.0000, 0.9952],
#         [0.9922, 0.9952, 1.0000]])
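Since the card lists Cosine Similarity as the similarity function, `model.similarity` above amounts to L2-normalizing the embeddings and taking their inner products. A small NumPy sketch of that computation, using stand-in random vectors in place of real model output:

```python
import numpy as np

def cosine_similarity_matrix(embeddings):
    """Pairwise cosine similarity, as the model's default similarity computes."""
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    unit = embeddings / norms        # L2-normalize each row
    return unit @ unit.T             # (n, n) symmetric similarity matrix

# Stand-in embeddings (3 vectors of dim 2048) instead of real model output
rng = np.random.default_rng(42)
emb = rng.normal(size=(3, 2048))
sim = cosine_similarity_matrix(emb)
print(sim.shape)                         # (3, 3)
print(np.allclose(np.diag(sim), 1.0))    # True: each vector matches itself
```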

Training Details

Training Dataset

sts_retrieval_dataset_hn_scored

  • Dataset: sts_retrieval_dataset_hn_scored at 38e5944
  • Size: 350,048 training samples
  • Columns: sentence1, sentence2, sentence3, sentence4, sentence5, and label
  • Approximate statistics based on the first 1000 samples:
    • sentence1 (string): min 6, mean 23.11, max 72 tokens
    • sentence2 (string): min 14, mean 120.21, max 275 tokens
    • sentence3 (string): min 14, mean 108.98, max 285 tokens
    • sentence4 (string): min 9, mean 119.11, max 442 tokens
    • sentence5 (string): min 10, mean 120.5, max 427 tokens
    • label (list): 4 elements
  • Samples:
    sentence1 sentence2 sentence3 sentence4 sentence5 label
    Why is depth perception easier with 2 eyes but worse with 1? Depth perception is given by the brain perceiving the difference between the image of the 2 eyes. If an image is further away, there is less difference between the image the right eye sees and the image the left eye sees. The closer it is the more the image is different. Imagine a twig. Very far away you will see it the same. Very close and the right eye sees the front and right side, the left eye sees front and left side. Therefore the brain knows that is closer. With one eye we use clues. So if we see a full image of a car and a man's chest and head above it, we assume the car is Infront of the man rather than someone with no legs has been thrown in the air Infront of the car. If I show you a small cow you assume it is far away rather than a tiny cow close up. This isn't depth perception, just depth clues It's not really that different. Your field of vision is wider, but your eyes automatically combine the images so other than that it's similar. If I close one eye, the image is about 25% narrower but other than that not a huge difference. Depth perception only works with two eyes, but depth perception is subtle and only makes a big difference in rare cases. You can usually tell how far away things are by the context alone. If depth perception drastically altered an image, movies and photographs would look odd from a two-edged perspective. Because you have two eyes. When they're looking at the same thing, your brain can merge the image. When you're focusing on something different, you're actually seeing two different images. Exactly the same way that we can perceive depth with our vision, with parallax. Notice that when you look at something close to your face, you go cross-eyed. Your brain measures the difference in angle between your eyes. 
If they are pointing straight forward, then your brain knows the object is far away, if your eyes are crossed (like you're looking at your nose) then there is a large difference in angle between your eyes. Now imagine scaling this up. your eyes are only a few inches apart, and can only perceive short distances. To measure long distances like stars, we take picture of it, wait 6 months, then take another picture of it. By this point we have rotated around to the other side of the sun. these two positions are the equivalent of your two eyes. By comparing the position of the star in the pictures, we can find the difference in angle between the two vantage points the same way your brain does. For more read the wikipedia article on parallax. [0.5408486127853394, 0.5914487838745117, 0.4706460237503052, 0.4665601849555969]
    How do snails get their shells without being near the ocean? Shells don't simply come from the sea, they're grown by secreting their material. Snails grow their own shells. There is a sea snail that does this: URL_0 It's because of air bouncing around. Think about how you can hear the wind because it moves things around and even if there's nothing to move, you can hear it whistling past buildings or thumping into walls. Well, even a tiny breeze makes noise but usually it's so quiet we can't hear it or don't notice. That's where the shell comes in. The shape of the shell magnifies the sound of a tiny breeze to the point where it's loud enough to hear. So when you put your ear to a shell, you're not hearing the ocean, you're hearing tiny breezes moving in the shell. If i recall correctly, snails are super sensitive to touch and have relatively many pain/heat/cold receptors in thier skin. [0.498150110244751, 0.3957440257072449, 0.38127124309539795, 0.3972085416316986]
    How is the diameter of Mars 53% the size of Earth's but the surface area in only 38% of Earth's? The surface area is actually only 28% of Earth's. The forumula for the surface area of a sphere is 4 * pi * r². So if r = .53, r² = .28 Mars has a mean density of 3.933 g/cm compared to 5.514 g/cm3 for Earth. The radius of Mars at 2,106 mi is slightly over half of Earth's radius 3,959 mi. Thus the ratios of their volumes is 2106^3 / 3959^3 = 0.15. So since Mars is about 30% less dense than Earth and has 15% of the volume, it has about 11% of the mass. Mars The large canyon, Valles Marineris (Latin for "Mariner Valleys", also known as Agathadaemon in the old canal maps), has a length of 4,000 km (2,500 mi) and a depth of up to 7 km (4.3 mi). The length of Valles Marineris is equivalent to the length of Europe and extends across one-fifth the circumference of Mars. By comparison, the Grand Canyon on Earth is only 446 km (277 mi) long and nearly 2 km (1.2 mi) deep. Valles Marineris was formed due to the swelling of the Tharsis area, which caused the crust in the area of Valles Marineris to collapse. In 2012, it was proposed that Valles Marineris is not just a graben, but a plate boundary where 150 km (93 mi) of transverse motion has occurred, making Mars a planet with possibly a two-tectonic plate arrangement.[132][133] Timekeeping on Mars The average length of a Martian sidereal day is 24 h 37 m 22.663 s (88,642.66300 seconds based on SI units), and the length of its solar day (often called a sol) is 24 h 39 m 35.244147 s (88,775.244147 seconds). The corresponding values for Earth are 23 h 56 m 4.0916 s and 24h 00 m 00.002 s, respectively. This yields a conversion factor of 1.02749125170 days/sol. Thus Mars' solar day is only about 2.7% longer than Earth's. [0.4725932478904724, 0.47929924726486206, 0.42970356345176697, 0.37027764320373535]
  • Loss: MarginMSELoss
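MarginMSELoss is a distillation objective: rather than matching absolute scores, the student is trained so that the *margin* between its positive and negative similarity scores matches a teacher's margin, via mean squared error. A minimal sketch with hypothetical scores (the four-element labels above presumably hold teacher scores for sentence2 through sentence5; a real batch averages this quantity over its pairs):

```python
def margin_mse(student_pos, student_neg, teacher_pos, teacher_neg):
    """MarginMSE: squared difference between the student's (pos - neg)
    score margin and the teacher's margin."""
    student_margin = student_pos - student_neg
    teacher_margin = teacher_pos - teacher_neg
    return (student_margin - teacher_margin) ** 2

# Hypothetical scores: teacher margin 0.6, student margin 0.4
loss = margin_mse(student_pos=0.8, student_neg=0.4,
                  teacher_pos=0.9, teacher_neg=0.3)
print(round(loss, 4))  # 0.04
```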

Evaluation Dataset

sts_retrieval_dataset_hn_scored

  • Dataset: sts_retrieval_dataset_hn_scored at 38e5944
  • Size: 7,447 evaluation samples
  • Columns: sentence1, sentence2, sentence3, sentence4, sentence5, and label
  • Approximate statistics based on the first 1000 samples:
    • sentence1 (string): min 6, mean 20.66, max 66 tokens
    • sentence2 (string): min 6, mean 112.32, max 414 tokens
    • sentence3 (string): min 6, mean 108.45, max 406 tokens
    • sentence4 (string): min 6, mean 74.89, max 455 tokens
    • sentence5 (string): min 9, mean 66.3, max 455 tokens
    • label (list): 4 elements
  • Samples:
    sentence1 sentence2 sentence3 sentence4 sentence5 label
    If you're not clinically dead until your heart stops, does that mean all heart donors have come from people who were technically still alive? When determining death, the critical organ is the brain, not the heart, because the brain defines the person (character, memories, ability to do anything at all) and most damage to the brain is irreparable. So when there's massive brain damage, all that defines the person is effectively gone forever. However, when somebody is "brain dead", that person's body can still be kept biologically alive for long periods with mechanical ventilation etc. and is a perfect organ donor. Now, the difficulty of course lies in defining "brain death", but there definitely can be unambiguous cases, e.g. when someone sustains massive physical damage to the brain from a traffic accident or similar. To my understanding of developmental biology, the organ will always be composed of cells from the donor. The cells that would divide in your heart to replace old/dead/damaged cells would come through the mitosis of other cells also within the heart as opposed to the liver or the immune system. > The real question to me is, will re recipient still have genetic material of me inside him when he passes away? Yes, for as long as they live with that organ. The balance will be taken from the estate of the deceased. If the estate does not have enough to cover the balance, the debt is written off as a loss. I have heard anecdotes of debt collectors chasing after descendants for the debt, but unless they have signed as legally obligated to paying it, there is no legal standing for them to repay it. Blood type In transfusions of packed red blood cells, individuals with type O Rh D negative blood are often called universal donors. Those with type AB Rh D positive blood are called universal recipients. 
However, these terms are only generally true with respect to possible reactions of the recipient's anti-A and anti-B antibodies to transfused red blood cells, and also possible sensitization to Rh D antigens. One exception is individuals with hh antigen system (also known as the Bombay phenotype) who can only receive blood safely from other hh donors, because they form antibodies against the H antigen present on all red blood cells.[30][31] [0.41563937067985535, 0.4245889186859131, 0.2968587279319763, 0.3197227418422699]
    How Do We Know That Ammonites Had External Shells? A couple of things; first, one must not forget that the closest phylogenic lineage to the ammonites is the Nautilidae, and they have external shells to this day. Furthermore, with the exception of the complexity of the septae, the use of the shell appears homologous between the 2 lineages. Second, there have been a few instances where there was preservation of soft parts in some ammonite fossils, those specimens showed no significant soft tissue on the outside of the shell. In gastropods, the shell is secreted by a part of the molluscan body known as the mantle. URL_0 They also plot and combine measurements of geological structures in order to better understand the orientations of faults and folds in order to reconstruct the history of rock deformation in the area. Even older rocks, such as the Acasta gneiss of the Slave craton in northwestern Canada, the oldest known rock in the world have been metamorphosed to the point where their origin is undiscernable without laboratory analysis. [0.45681333541870117, 0.26675254106521606, 0.23898926377296448, 0.2671601176261902]
    Does a dead animal taste different than an alive animal? It depends on where the piece you want to eat is located. An hour is enough for livor mortis to take place (movement of blood based on gravity). Based on this, you might get a very bloody piece or a 'dry' one. Other factors come into play so it's a definite yes in terms of taste difference. Note, that the difference might be very subtle in some animals. Dead things can make you sick or kill you. Dead things may be full of harmful bacteria or they may have died from something that could kill you also. Carrion eaters have evolved to be resistant to these sorts of things. It is a specialization. They are overlaid with colourless music played live , and consist of vague , almost deliberately sparse orchestral sounds . Yes. It is natural and Yes it is influenced. Imho. Slap stick Chevy Chase / Chris Farley stuff will always hit you in the basic physical comedy bone. Other things that are context dependant of course depend on you understanding the content. How is a spice girl joke funny if you don't know who they are? [0.4751628339290619, 0.39276906847953796, 0.2803378701210022, 0.2605991065502167]
  • Loss: MarginMSELoss

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • gradient_accumulation_steps: 4
  • learning_rate: 0.0001
  • weight_decay: 0.04
  • num_train_epochs: 0.15
  • warmup_steps: 0.15
  • fp16: True

All Hyperparameters

  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 128
  • per_device_eval_batch_size: 128
  • gradient_accumulation_steps: 4
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 0.0001
  • weight_decay: 0.04
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 0.15
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: None
  • warmup_ratio: None
  • warmup_steps: 0.15
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • enable_jit_checkpoint: False
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • use_cpu: False
  • seed: 42
  • data_seed: None
  • bf16: False
  • fp16: True
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: -1
  • ddp_backend: None
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • group_by_length: False
  • length_column_name: length
  • project: huggingface
  • trackio_space_id: trackio
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • auto_find_batch_size: False
  • full_determinism: False
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_num_input_tokens_seen: no
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: True
  • use_cache: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss Validation Loss
0.0731 50 1226.4617 -
0.1463 100 83.9058 -
0.1506 103 - 24.1453

Framework Versions

  • Python: 3.12.13
  • Sentence Transformers: 5.3.0
  • Transformers: 5.0.0
  • PyTorch: 2.10.0+cu128
  • Accelerate: 1.13.0
  • Datasets: 4.0.0
  • Tokenizers: 0.22.2

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MarginMSELoss

@misc{hofstätter2021improving,
    title={Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation},
    author={Sebastian Hofstätter and Sophia Althammer and Michael Schröder and Mete Sertkan and Allan Hanbury},
    year={2021},
    eprint={2010.02666},
    archivePrefix={arXiv},
    primaryClass={cs.IR}
}