LEK-aligned Gemma 4 E2B, bf16 reference weights, in HuggingFace safetensors layout.
Converted from lthn/lemer-mlx-bf16 for use on non-MLX platforms (NVIDIA/AMD GPU, Kaggle TPU, vanilla transformers).
| Source | This repo |
|---|---|
| lthn/lemer-mlx-bf16 (MLX format) | lthn/lemer-hf-bf16 (HF safetensors) |
| `language_model.model.*` keys | `model.language_model.*` keys |
| MLX conv layout `(C, K, I)` / `(O, H, W, I)` | PyTorch layout `(C, I, K)` / `(O, I, H, W)` |
Weights are byte-equivalent to lemer-mlx-bf16 after the key rename + conv permutation — identical LEK alignment, identical behaviour.
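For reference, the conversion described above can be sketched as a key rename plus an axis permutation on conv weights. The function names below are hypothetical (they are not part of this repo), and the mapping is taken directly from the table; actual MLX-to-HF converters may handle additional keys.

```python
import numpy as np

def mlx_key_to_hf(key: str) -> str:
    # MLX checkpoints store text weights under "language_model.model.*";
    # the HF layout expects "model.language_model.*" (per the table above).
    prefix = "language_model.model."
    if key.startswith(prefix):
        return "model.language_model." + key[len(prefix):]
    return key

def mlx_conv_to_hf(w: np.ndarray) -> np.ndarray:
    # MLX keeps conv weights channels-last; PyTorch expects channels-first.
    if w.ndim == 3:
        # (C, K, I) -> (C, I, K)
        return w.transpose(0, 2, 1)
    if w.ndim == 4:
        # (O, H, W, I) -> (O, I, H, W)
        return w.transpose(0, 3, 1, 2)
    return w
```

Because the permutation only reorders axes, the underlying bytes per weight are unchanged, which is why the converted checkpoint stays byte-equivalent per tensor.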
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("lthn/lemer-hf-bf16")
model = AutoModelForCausalLM.from_pretrained(
    "lthn/lemer-hf-bf16",
    dtype="bfloat16",
    device_map="auto",
)
```
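Once loaded, the model is used like any other causal LM in `transformers`. A minimal generation check, assuming the snippet above has already run (the prompt is illustrative):

```python
# Tokenize a prompt, move it to the model's device, and generate a short reply.
inputs = tok("Explain what a lemma is in one sentence.", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64)
print(tok.decode(out[0], skip_special_tokens=True))
```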
License: EUPL-1.2
Base model: google/gemma-4-E2B