Zen4 1M
Parameters: 80B (MoE) | Architecture: Zen 4 | Context: 1M tokens | License: Apache 2.0 | Released: 2025-05-01

A 1M-token context window: entire codebases, books, and research docs in a single pass.
Base weights: zenlm/zen4-max
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load this model (zenlm/zen4-1m); "zenlm/zen4-max" is its base, listed above.
model = AutoModelForCausalLM.from_pretrained(
    "zenlm/zen4-1m",
    torch_dtype="auto",   # pick the checkpoint's native precision
    device_map="auto",    # shard across available GPUs (requires accelerate)
)
tokenizer = AutoTokenizer.from_pretrained("zenlm/zen4-1m")
```
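Before packing a whole codebase into a prompt, it helps to sanity-check that it fits the 1M-token window. A minimal sketch, assuming roughly 4 characters per token (a heuristic for English text and code, not this model's actual tokenizer; use the `AutoTokenizer` above for exact counts). The names `CONTEXT_WINDOW`, `estimated_tokens`, and `fits_in_context` are illustrative, not part of any API:

```python
# Rough check that a document set fits the 1M-token context window.
CONTEXT_WINDOW = 1_000_000
CHARS_PER_TOKEN = 4  # heuristic average; tokenize for exact counts

def estimated_tokens(text: str) -> int:
    """Approximate token count from character length."""
    return len(text) // CHARS_PER_TOKEN

def fits_in_context(docs: list[str], reserve_for_output: int = 4096) -> bool:
    """True if all docs plus a generation budget fit in the window."""
    total = sum(estimated_tokens(d) for d in docs)
    return total + reserve_for_output <= CONTEXT_WINDOW

# Example: a ~30 KB source file fits with room to spare.
docs = ["def add(a, b):\n    return a + b\n" * 1000]
print(fits_in_context(docs))
```

For precise budgeting, replace `estimated_tokens` with `len(tokenizer(text)["input_ids"])`.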
The Zen LM Family
A joint research collaboration between:
- Hanzo AI (Techstars '17) — AI infrastructure, API gateway, inference optimization
- Zoo Labs Foundation (501c3) — Open AI research, ZIPs governance, decentralized training
- Lux Partners Limited — Compute coordination and settlement layer
All weights Apache 2.0. Download, run locally, fine-tune, deploy commercially.