Distilled Backbone: embeddinggemma-300m-distilled-width10pct-128dim

This is a distilled, width-compressed version of google/embeddinggemma-300m with a reduced hidden size and a 128-dimensional output embedding.

Compression Details

  • Base model: google/embeddinggemma-300m
  • Width reduction factor: 0.1
  • Target hidden size: 72
  • Final embedding dimension: 128
  • Projection layer in base model: True
  • Projection incorporated into exported weights: True
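"Projection incorporated" means the projection was folded into the exported weights rather than kept as a separate module, so a single matrix maps the hidden states straight to the 128-dimensional output. A minimal numpy sketch of folding two consecutive linear maps into one (shapes taken from the details above; the random weights are purely illustrative, not the actual model weights):

```python
import numpy as np

rng = np.random.default_rng(0)
hidden, out_dim = 72, 128  # target hidden size and final embedding dimension

# Separate final dense layer (hidden -> hidden) followed by a projection (hidden -> out_dim)
W_dense = rng.normal(size=(hidden, hidden))
W_proj = rng.normal(size=(hidden, out_dim))

x = rng.normal(size=(hidden,))

# Two-step application: dense layer, then projection
y_two_step = (x @ W_dense) @ W_proj

# Folded: one (hidden -> out_dim) matrix replaces both layers
W_folded = W_dense @ W_proj
y_folded = x @ W_folded

# Both paths produce the same 128-dimensional embedding
assert np.allclose(y_two_step, y_folded)
assert y_folded.shape == (out_dim,)
```

Folding removes a matrix multiply at inference time without changing the model's output.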

Usage

To use this as a pretrained backbone in validation:

python -m playground.validate_from_checkpoint \
    --pretrained_backbone_model "Pieces/embeddinggemma-300m-distilled-width10pct-128dim-best" \
    --backbone_embedding_dim 128 \
    --backbone_pooling_mode "mean" \
    --num_samples 100 \
    --dataset_split "val" \
    --num_general_tags_high 100 \
    --num_general_tags_low 100

Important: The path must point to the sentence_transformer subdirectory (not the parent directory).

The exported model has been configured to output embeddings of dimension 128 directly, incorporating any projection layers from the distillation process.
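The --backbone_pooling_mode "mean" flag above averages token embeddings into a single sentence vector, counting only non-padding tokens. A self-contained numpy sketch of masked mean pooling (the array shapes and values are illustrative assumptions, not model output):

```python
import numpy as np

# token_embeddings: (batch, seq_len, dim); attention_mask: (batch, seq_len), 1 = real token
token_embeddings = np.arange(2 * 4 * 3, dtype=np.float64).reshape(2, 4, 3)
attention_mask = np.array([[1, 1, 1, 0],
                           [1, 1, 0, 0]], dtype=np.float64)

def mean_pool(emb, mask):
    # Zero out padding positions, then divide by the number of real tokens per sequence
    masked = emb * mask[:, :, None]
    return masked.sum(axis=1) / mask.sum(axis=1, keepdims=True)

sentence_embeddings = mean_pool(token_embeddings, attention_mask)
assert sentence_embeddings.shape == (2, 3)  # one vector per input sequence
```

With the exported model, the pooled vector would already be 128-dimensional, since the projection is baked into the backbone's output.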

Model size: 19.8M parameters (F32, Safetensors)