---
license: mit
language:
  - en
  - zh
tags:
  - zen4
  - zenlm
  - hanzo
  - frontier-ai
base_model: MiniMaxAI/MiniMax-M1-80k
pipeline_tag: text-generation
library_name: transformers
---

# Zen4 Storm

Zen4 Storm is a 456B-parameter Mixture-of-Experts (MoE) language model, with 45B parameters active per token, from the Zen4 family by Zen LM and Hanzo AI.

It pairs a hybrid MoE architecture with Lightning Attention for ultra-long-context reasoning.

## Model Details

| Property | Value |
|---|---|
| Parameters | 456B total (MoE), 45B active |
| Architecture | Zen4 Frontier |
| Context | 1M tokens |
| License | MIT |
| Family | Zen4 |
| Tier | Frontier |
| Creator | Zen LM / Hanzo AI |
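As a rough back-of-envelope check (our estimate, not an official figure), the parameter counts above imply the following weight-memory footprint. Note that all 456B parameters must be resident even though only ~45B are active per token, and this ignores KV cache and activation memory:

```python
# Back-of-envelope weight-memory estimate (assumption: dense storage of
# all experts; ignores KV cache, activations, and quantization overhead).
TOTAL_PARAMS = 456e9   # every expert must be resident in memory
ACTIVE_PARAMS = 45e9   # parameters actually used per token

def weight_memory_gib(params: float, bytes_per_param: int) -> float:
    """GiB needed to hold `params` weights at the given precision."""
    return params * bytes_per_param / 1024**3

print(f"fp16/bf16 weights: {weight_memory_gib(TOTAL_PARAMS, 2):.0f} GiB")  # ~849 GiB
print(f"int8 weights:      {weight_memory_gib(TOTAL_PARAMS, 1):.0f} GiB")  # ~425 GiB
```

The gap between total and active parameters is the usual MoE trade-off: compute per token scales with the 45B active parameters, while memory scales with the full 456B.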

## Weights

The weights are hosted at [MiniMaxAI/MiniMax-M1-80k](https://huggingface.co/MiniMaxAI/MiniMax-M1-80k) due to storage constraints. Use that repository for inference and fine-tuning:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# device_map="auto" shards the model across available GPUs via accelerate
model = AutoModelForCausalLM.from_pretrained("MiniMaxAI/MiniMax-M1-80k", torch_dtype="auto", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("MiniMaxAI/MiniMax-M1-80k")

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

## Links


*Zen AI: Clarity Through Intelligence*