---
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - dense
  - generated_from_trainer
  - dataset_size:2680
  - loss:MultipleNegativesRankingLoss
base_model: google/embeddinggemma-300m
widget:
  - source_sentence: >-
      Let $A, M,$ and $C$ be nonnegative integers such that $A + M + C = 12$.
      What is the maximum value of $A \cdot M \cdot C + A \cdot M + M \cdot C +
      A \cdot C$?
    sentences:
      - >-
        Given that $2^{2004}$ is a $604$-digit number whose first digit is $1$,
        how many elements of the set $S = \{2^0,2^1,2^2,\ldots ,2^{2003}\}$ have
        a first digit of $4$?
      - >-
        To complete the grid below, each of the digits 1 through 4 must occur
        once in each row and once in each column. What number will occupy the
        lower right-hand square? \[\begin{tabular}{|c|c|c|c|}\hline 1 & & 2 &\\
        \hline 2 & 3 & &\\ \hline & &&4\\ \hline & &&\\ \hline\end{tabular}\]
      - >-
        Two non-zero real numbers, $a$ and $b,$ satisfy $ab = a - b$. Which of
        the following is a possible value of $\frac {a}{b} + \frac {b}{a} - ab$?
  - source_sentence: What is the sum of the prime factors of $2010$?
    sentences:
      - >-
        The lengths of the sides of a triangle in inches are three consecutive
        integers. The length of the shortest side is $30\%$ of the perimeter.
        What is the length of the longest side?
      - >-
        On a map, a $12$-centimeter length represents $72$ kilometers. How many
        kilometers does a $17$-centimeter length represent?
      - >-
        The five pieces shown below can be arranged to form four of the five
        figures shown in the choices. Which figure cannot be formed? [asy]
        defaultpen(linewidth(0.6)); size(80); real r=0.5, s=1.5; path
        p=origin--(1,0)--(1,1)--(0,1)--cycle; draw(p); draw(shift(s,r)*p);
        draw(shift(s,-r)*p); draw(shift(2s,2r)*p); draw(shift(2s,0)*p);
        draw(shift(2s,-2r)*p); draw(shift(3s,3r)*p); draw(shift(3s,-3r)*p);
        draw(shift(3s,r)*p); draw(shift(3s,-r)*p); draw(shift(4s,-4r)*p);
        draw(shift(4s,-2r)*p); draw(shift(4s,0)*p); draw(shift(4s,2r)*p);
        draw(shift(4s,4r)*p); [/asy] [asy] size(350);
        defaultpen(linewidth(0.6)); path p=origin--(1,0)--(1,1)--(0,1)--cycle;
        pair[] a={(0,0), (0,1), (0,2), (0,3), (0,4), (1,0), (1,1), (1,2), (2,0),
        (2,1), (3,0), (3,1), (3,2), (3,3), (3,4)}; pair[] b={(5,3), (5,4),
        (6,2), (6,3), (6,4), (7,1), (7,2), (7,3), (7,4), (8,0), (8,1), (8,2),
        (9,0), (9,1), (9,2)}; pair[] c={(11,0), (11,1), (11,2), (11,3), (11,4),
        (12,1), (12,2), (12,3), (12,4), (13,2), (13,3), (13,4), (14,3), (14,4),
        (15,4)}; pair[] d={(17,0), (17,1), (17,2), (17,3), (17,4), (18,0),
        (18,1), (18,2), (18,3), (18,4), (19,0), (19,1), (19,2), (19,3), (19,4)};
        pair[] e={(21,4), (22,1), (22,2), (22,3), (22,4), (23,0), (23,1),
        (23,2), (23,3), (23,4), (24,1), (24,2), (24,3), (24,4), (25,4)}; int i;
        for(int i=0; i<15; i=i+1) { draw(shift(a[i])*p); draw(shift(b[i])*p);
        draw(shift(c[i])*p); draw(shift(d[i])*p); draw(shift(e[i])*p); } [/asy]
        \[
  - source_sentence: >-
      A circle and two distinct lines are drawn on a sheet of paper. What is the
      largest possible number of points of intersection of these figures?
    sentences:
      - >-
        Three fair six-sided dice are rolled. What is the probability that the
        values shown on two of the dice sum to the value shown on the remaining
        die?
      - >-
        In the small country of Mathland, all automobile license plates have
        four symbols. The first must be a vowel (A, E, I, O, or U), the second
        and third must be two different letters among the 21 non-vowels, and the
        fourth must be a digit (0 through 9). If the symbols are chosen at
        random subject to these conditions, what is the probability that the
        plate will read "AMC8"?
      - >-
        How many different combinations of \$5 bills and \$2 bills can be used
        to make a total of \$17? Order does not matter in this problem.
  - source_sentence: >-
      Points $K, L, M,$ and $N$ lie in the plane of the square $ABCD$ such that
      $AKB$, $BLC$, $CMD$, and $DNA$ are equilateral triangles. If $ABCD$ has an
      area of 16, find the area of $KLMN$. [asy] unitsize(2cm);
      defaultpen(fontsize(8)+linewidth(0.8)); pair A=(-0.5,0.5), B=(0.5,0.5),
      C=(0.5,-0.5), D=(-0.5,-0.5); pair K=(0,1.366), L=(1.366,0), M=(0,-1.366),
      N=(-1.366,0); draw(A--N--K--A--B--K--L--B--C--L--M--C--D--M--N--D--A);
      label("$A$",A,SE); label("$B$",B,SW); label("$C$",C,NW);
      label("$D$",D,NE); label("$K$",K,NNW); label("$L$",L,E); label("$M$",M,S);
      label("$N$",N,W); [/asy]
    sentences:
      - >-
        A semicircle of diameter $1$ sits at the top of a semicircle of diameter
        $2$, as shown. The shaded area inside the smaller semicircle and outside
        the larger semicircle is called a lune. Determine the area of this lune.
        [asy] import graph; size(150); defaultpen(fontsize(8)); pair A=(-2,0),
        B=(2,0); filldraw(Arc((0,sqrt(3)),1,0,180)--cycle,mediumgray);
        filldraw(Arc((0,0),2,0,180)--cycle,white);
        draw(2*expi(2*pi/6)--2*expi(4*pi/6)); label("1",(0,sqrt(3)),(0,-1));
        label("2",(0,0),(0,-1)); [/asy]
      - >-
        The average age of $5$ people in a room is $30$ years. An $18$-year-old
        person leaves the room. What is the average age of the four remaining
        people?
      - Which of the following numbers is a perfect square?
  - source_sentence: >-
      The harmonic mean of a set of non-zero numbers is the reciprocal of the
      average of the reciprocals of the numbers. What is the harmonic mean of 1,
      2, and 4?
    sentences:
      - >-
        Spinners $A$ and $B$ are spun. On each spinner, the arrow is equally
        likely to land on each number. What is the probability that the product
        of the two spinners' numbers is even?
      - >-
        Abby, Bridget, and four of their classmates will be seated in two rows
        of three for a group picture, as shown. \begin{eqnarray*}
        \text{X}&\quad\text{X}\quad&\text{X} \\
        \text{X}&\quad\text{X}\quad&\text{X} \end{eqnarray*} If the seating
        positions are assigned randomly, what is the probability that Abby and
        Bridget are adjacent to each other in the same row or the same column?
      - >-
        Semicircle $\Gamma$ has diameter $\overline{AB}$ of length $14$. Circle
        $\Omega$ lies tangent to $\overline{AB}$ at a point $P$ and intersects
        $\Gamma$ at points $Q$ and $R$. If $QR=3\sqrt3$ and $\angle
        QPR=60^\circ$, then the area of $\triangle PQR$ equals
        $\tfrac{a\sqrt{b}}{c}$, where $a$ and $c$ are relatively prime positive
        integers, and $b$ is a positive integer not divisible by the square of
        any prime. What is $a+b+c$?
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---

# SentenceTransformer based on google/embeddinggemma-300m

This is a sentence-transformers model finetuned from google/embeddinggemma-300m. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description

  • Model Type: Sentence Transformer
  • Base model: google/embeddinggemma-300m
  • Maximum Sequence Length: 2048 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity

### Model Sources

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 2048, 'do_lower_case': False, 'architecture': 'Gemma3TextModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Dense({'in_features': 768, 'out_features': 3072, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
  (3): Dense({'in_features': 3072, 'out_features': 768, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
  (4): Normalize()
)
```
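The stack above applies mean pooling over the transformer's token embeddings, a 768 → 3072 → 768 linear bottleneck (no bias, identity activation), and L2 normalization. The following is a minimal NumPy sketch of those post-transformer stages; the shapes come from the architecture listing, but the random matrices standing in for the trained Dense weights and the toy inputs are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(token_embeddings, attention_mask, w_up, w_down):
    """Mimic the Pooling -> Dense -> Dense -> Normalize stack."""
    # (1) Mean pooling: average token embeddings, ignoring padded positions.
    mask = attention_mask[:, :, None]                        # (batch, seq, 1)
    pooled = (token_embeddings * mask).sum(1) / mask.sum(1)  # (batch, 768)
    # (2)-(3) Linear bottleneck 768 -> 3072 -> 768, no bias, identity activation.
    x = pooled @ w_up @ w_down                               # (batch, 768)
    # (4) L2 normalization, so cosine similarity becomes a dot product.
    return x / np.linalg.norm(x, axis=1, keepdims=True)

# Toy inputs: batch of 2, sequence length 4, hidden size 768.
tokens = rng.normal(size=(2, 4, 768))
mask = np.array([[1, 1, 1, 0], [1, 1, 1, 1]], dtype=float)
w_up = rng.normal(size=(768, 3072)) * 0.02   # stand-in for Dense (2)
w_down = rng.normal(size=(3072, 768)) * 0.02 # stand-in for Dense (3)

emb = encode(tokens, mask, w_up, w_down)
print(emb.shape)                     # (2, 768)
print(np.linalg.norm(emb, axis=1))  # each row has unit norm after Normalize
```

Because of the final normalization step, every embedding the model emits lies on the unit sphere.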

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("blachang28/my-embedding-gemma")
# Run inference
queries = [
    "The harmonic mean of a set of non-zero numbers is the reciprocal of the average of the reciprocals of the numbers. What is the harmonic mean of 1, 2, and 4?",
]
documents = [
    'Abby, Bridget, and four of their classmates will be seated in two rows of three for a group picture, as shown. \\begin{eqnarray*} \\text{X}&\\quad\\text{X}\\quad&\\text{X} \\\\ \\text{X}&\\quad\\text{X}\\quad&\\text{X} \\end{eqnarray*} If the seating positions are assigned randomly, what is the probability that Abby and Bridget are adjacent to each other in the same row or the same column?',
    'Semicircle $\\Gamma$ has diameter $\\overline{AB}$ of length $14$. Circle $\\Omega$ lies tangent to $\\overline{AB}$ at a point $P$ and intersects $\\Gamma$ at points $Q$ and $R$. If $QR=3\\sqrt3$ and $\\angle QPR=60^\\circ$, then the area of $\\triangle PQR$ equals $\\tfrac{a\\sqrt{b}}{c}$, where $a$ and $c$ are relatively prime positive integers, and $b$ is a positive integer not divisible by the square of any prime. What is $a+b+c$?',
    "Spinners $A$ and $B$ are spun. On each spinner, the arrow is equally likely to land on each number. What is the probability that the product of the two spinners' numbers is even?",
]
query_embeddings = model.encode_query(queries)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# [1, 768] [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# tensor([[ 0.9314, -0.3410,  0.9672]])
```
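Since the model's final Normalize() module outputs unit-length vectors, the cosine similarity that model.similarity computes (the default cos_sim function) reduces to a plain dot product. A toy NumPy sketch with random stand-ins for the query and document embeddings makes this concrete; shapes match the example above, everything else is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the outputs of model.encode_query / model.encode_document,
# which are already L2-normalized by the Normalize() module.
q = rng.normal(size=(1, 768))
d = rng.normal(size=(3, 768))
q /= np.linalg.norm(q, axis=1, keepdims=True)
d /= np.linalg.norm(d, axis=1, keepdims=True)

# With unit-norm vectors, cosine similarity is just a matrix product.
sims = q @ d.T
print(sims.shape)  # (1, 3), one score per (query, document) pair
```

Each entry lies in [-1, 1], with higher values indicating closer embeddings.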

## Training Details

### Training Dataset

#### Unnamed Dataset

  • Size: 2,680 training samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:

    |         | anchor | positive | negative |
    |:--------|:-------|:---------|:---------|
    | type    | string | string | string |
    | details | min: 10 tokens<br>mean: 82.06 tokens<br>max: 1260 tokens | min: 11 tokens<br>mean: 80.7 tokens<br>max: 1260 tokens | min: 12 tokens<br>mean: 92.86 tokens<br>max: 2048 tokens |
  • Samples:

    | anchor | positive | negative |
    |:-------|:---------|:---------|
    | $(6?3) + 4 - (2 - 1) = 5.$ To make this statement true, the question mark between the 6 and the 3 should be replaced by | What is the degree measure of the smaller angle formed by the hands of a clock at 10 o'clock? | An insect lives on the surface of a regular tetrahedron with edges of length 1. It wishes to travel on the surface of the tetrahedron from the midpoint of one edge to the midpoint of the opposite edge. What is the length of the shortest such trip? (Note: Two edges of a tetrahedron are opposite if they have no common endpoint.) |
    | What is the degree measure of the smaller angle formed by the hands of a clock at 10 o'clock? | Which triplet of numbers has a sum NOT equal to 1? | Corners are sliced off a unit cube so that the six faces each become regular octagons. What is the total volume of the removed tetrahedra? |
    | Which triplet of numbers has a sum NOT equal to 1? | What is the degree measure of the smaller angle formed by the hands of a clock at 10 o'clock? | How many pairs of positive integers $(a,b)$ are there such that $\text{gcd}(a,b)=1$ and $\frac{a}{b} + \frac{14b}{9a}$ is an integer? $\mathrm {(A)}\ 4\quad\mathrm {(B)}\ 6\quad\mathrm {(C)}\ 9\quad\mathrm {(D)}\ 12\quad\mathrm {(E)}\ \text{infinitely many}$ |
  • Loss: MultipleNegativesRankingLoss with these parameters:

    ```json
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim",
        "gather_across_devices": false
    }
    ```
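MultipleNegativesRankingLoss treats, for each anchor, its own positive as the correct "class" among all in-batch positives plus any provided hard negatives, and applies cross-entropy over scaled cosine similarities. The following rough NumPy sketch follows the scale=20.0 and cos_sim settings shown here; the batch size and random vectors are made up for illustration:

```python
import numpy as np

def mnrl_loss(anchors, positives, negatives, scale=20.0):
    """Sketch of in-batch MultipleNegativesRankingLoss: anchor i should score
    its own positive above every other positive and every hard negative."""
    def unit(x):
        return x / np.linalg.norm(x, axis=1, keepdims=True)
    a = unit(anchors)
    cands = unit(np.concatenate([positives, negatives]))  # (2B, dim) candidates
    logits = scale * (a @ cands.T)                        # scaled cosine sims
    # Cross-entropy: the target for anchor i is candidate i (its positive).
    log_probs = logits - np.log(np.exp(logits).sum(1, keepdims=True))
    diag = np.arange(len(a))
    return -log_probs[diag, diag].mean()

rng = np.random.default_rng(0)
B, dim = 4, 768
loss = mnrl_loss(rng.normal(size=(B, dim)),
                 rng.normal(size=(B, dim)),
                 rng.normal(size=(B, dim)))
print(float(loss))  # a positive scalar; lower means anchors already match their positives
```

Note that with the per_device_train_batch_size of 1 used here, each anchor is contrasted only against its own positive and hard negative, so the explicit negative column carries most of the ranking signal.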

### Training Hyperparameters

#### Non-Default Hyperparameters

  • per_device_train_batch_size: 1
  • learning_rate: 2e-05
  • num_train_epochs: 5
  • warmup_ratio: 0.1
  • prompts: task: classification | query:

#### All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 1
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 5
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • project: huggingface
  • trackio_space_id: trackio
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: no
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: True
  • prompts: task: classification | query:
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

### Training Logs

| Epoch | Step  | Training Loss |
|:-----:|:-----:|:-------------:|
| 1.0   | 2680  | 1.5631        |
| 2.0   | 5360  | 1.2027        |
| 3.0   | 8040  | 0.8526        |
| 4.0   | 10720 | 0.6227        |
| 5.0   | 13400 | 0.3352        |

### Framework Versions

  • Python: 3.12.12
  • Sentence Transformers: 5.1.2
  • Transformers: 4.57.2
  • PyTorch: 2.9.0+cu126
  • Accelerate: 1.12.0
  • Datasets: 4.0.0
  • Tokenizers: 0.22.1

## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss

```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```