Wire-2M (H16 L8, Chinchilla)

WireNative 2M: a Chinchilla-scaled training run, best BPB 1.97

Part of the Harmonic GPT research into oscillator-based neural computation.

Architecture: WireNative

| Property | Value |
|---|---|
| Parameters | 2,326,392 |
| BPB | 1.9697 |
| Training step | 0 |
| n_harmonics | 32 |
| n_layers | 8 |
| n_groups | 7 |
| d_model | 448 |
| Vocab | 256 (raw bytes) |
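Because the model is byte-level (vocab = 256, one token per byte), the reported BPB is simply the mean cross-entropy loss converted from nats to bits. A minimal sketch of that conversion (the helper name is ours, not part of the repo):

```python
import math

def bpb_from_loss(loss_nats: float) -> float:
    """Convert mean cross-entropy loss (nats/byte) to bits-per-byte."""
    return loss_nats / math.log(2)

# The reported BPB of 1.9697 corresponds to a loss of about 1.365 nats/byte.
print(bpb_from_loss(1.3653))
```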

Usage

```python
import json

from huggingface_hub import hf_hub_download
from safetensors.torch import load_file

weights = load_file(hf_hub_download("MonumentalSystems/wire-2m-chinchilla", "model.safetensors"))
config = json.load(open(hf_hub_download("MonumentalSystems/wire-2m-chinchilla", "config.json")))
```

All operations are native Clifford algebra / harmonic oscillator dynamics: no softmax attention, no MLP, no ReLU.
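To give a flavor of oscillator-based state updates (this is an illustrative toy, not the actual WireNative layer, whose update rule is not reproduced in this card), here is a damped harmonic oscillator step integrated with semi-implicit Euler:

```python
def oscillator_step(x, v, omega, dt=0.001, damping=0.0):
    """One semi-implicit Euler update of a (damped) harmonic oscillator.

    Illustrative only: shows the kind of continuous dynamics an
    oscillator-based layer uses in place of an MLP nonlinearity.
    """
    a = -omega**2 * x - damping * v  # restoring force plus damping
    v = v + dt * a                   # update velocity first...
    x = x + dt * v                   # ...then position (symplectic, stable)
    return x, v
```

With `damping=0.0` the update approximately conserves energy, so a unit-frequency oscillator started at `x=1, v=0` returns near `x=1` after one period of `2*pi`.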
