# Gemma 3 27B PT – Orbax/TensorStore (OCDBT)
This repository contains the Orbax/TensorStore (OCDBT) checkpoint converted from
the google/gemma-3-27b-pt Hugging Face safetensors. The conversion stacks transformer
layer weights along the depth axis for efficient JAX/Orbax loading.

The conversion script can be found on GitHub: https://github.com/dominik3141/hf-to-orbax
## What’s included
- Orbax/TensorStore checkpoint files (OCDBT)
- `LICENSE` and `NOTICE` per the Gemma Terms of Use
## Conversion details
- Source: google/gemma-3-27b-pt
- Format change only: no weight changes beyond layout/format
- Layer weights use the same HF key with the numeric layer index removed and are stacked on axis 0.
  Example: `model.layers.12.self_attn.q_proj.weight` → `model.layers.self_attn.q_proj.weight`
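The renaming-and-stacking step above can be sketched as follows. This is a minimal illustration, not the repository's actual `convert.py`: the dictionary of per-layer weights and the tensor shapes are made up for the example.

```python
import re
import numpy as np

# Hypothetical per-layer HF weights; shapes and values are illustrative,
# not the real Gemma 3 27B dimensions.
hf_weights = {
    f"model.layers.{i}.self_attn.q_proj.weight": np.full((4, 4), i, dtype=np.float32)
    for i in range(3)
}

# Strip the numeric layer index from each key and group arrays by the
# resulting stacked key.
LAYER_RE = re.compile(r"^(model\.layers)\.(\d+)\.(.+)$")
grouped = {}
for key, value in hf_weights.items():
    m = LAYER_RE.match(key)
    stacked_key = f"{m.group(1)}.{m.group(3)}"
    grouped.setdefault(stacked_key, []).append((int(m.group(2)), value))

# Stack on axis 0 in layer order, matching this checkpoint's layout.
stacked = {
    key: np.stack([v for _, v in sorted(pairs)], axis=0)
    for key, pairs in grouped.items()
}

print(stacked["model.layers.self_attn.q_proj.weight"].shape)  # (3, 4, 4)
```

The leading axis of each stacked tensor indexes the original layer number, which is why the per-layer HF keys can be dropped without losing information.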
## Loading example (JAX/Orbax)

```python
from huggingface_hub import snapshot_download
import orbax.checkpoint as ocp

path = snapshot_download("dffarr/gemma-3-27b-pt-orbax")
ckpt = ocp.StandardCheckpointer().restore(path)

# Example access
embed_tokens = ckpt["model.embed_tokens.weight"]
q_proj = ckpt["model.layers.self_attn.q_proj.weight"]  # stacked: [num_layers, ...]
```
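Because layer weights are stacked on axis 0, a single layer's tensor is recovered by indexing that axis. A minimal sketch, using a NumPy stand-in for the restored checkpoint (the shapes are illustrative, not the real model's):

```python
import numpy as np

# Stand-in for the restored checkpoint dict: one stacked tensor with a
# leading layer axis. Shapes are made up for illustration.
num_layers = 4
ckpt = {"model.layers.self_attn.q_proj.weight": np.zeros((num_layers, 8, 8))}

# Indexing axis 0 recovers one layer's weight; e.g. what HF stores as
# model.layers.2.self_attn.q_proj.weight is stacked[2] here.
stacked = ckpt["model.layers.self_attn.q_proj.weight"]
layer_2_q_proj = stacked[2]
print(layer_2_q_proj.shape)  # (8, 8)
```

The same indexing works on the actual restored arrays, or under `jax.vmap`/`jax.lax.scan` to apply all layers from the stacked tensor at once.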
## License & Use

This repository is provided under the Gemma Terms of Use. Please read `LICENSE`
and comply with the Gemma Prohibited Use Policy:
https://ai.google.dev/gemma/terms
## Conversion script

The conversion script used to generate this checkpoint is included as `convert.py`.