Zen4 1M

Parameters: 80B MoE | Architecture: Zen 4 | Context: 1M tokens | License: Apache 2.0 | Released: 2025-05-01

1M token context — entire codebases, books, research docs in a single pass.

Base weights: zenlm/zen4-max

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# device_map="auto" shards the MoE weights across available GPUs.
model = AutoModelForCausalLM.from_pretrained(
    "zenlm/zen4-max", torch_dtype="auto", device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("zenlm/zen4-max")
```
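
The single-pass claim above can be made concrete: instead of chunking a repository, you concatenate its files into one prompt and let the 1M-token window hold all of it. A minimal sketch in plain Python (no model required; `pack_repo` and its file filter are illustrative, not part of any Zen API):

```python
from pathlib import Path

def pack_repo(root: str, exts=(".py", ".md")) -> str:
    """Concatenate every matching file under `root` into a single prompt,
    tagging each chunk with its relative path so the model can cite files."""
    parts = []
    for path in sorted(Path(root).rglob("*")):
        if path.is_file() and path.suffix in exts:
            parts.append(f"### FILE: {path.relative_to(root)}\n{path.read_text()}")
    return "\n\n".join(parts)
```

The resulting string is then tokenized and sent as one prompt; as long as it stays under the 1M-token window, no retrieval or chunking step is needed.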

The Zen LM Family

A joint research collaboration between:

  • Hanzo AI (Techstars '17) — AI infrastructure, API gateway, inference optimization
  • Zoo Labs Foundation (501c3) — Open AI research, ZIPs governance, decentralized training
  • Lux Partners Limited — Compute coordination and settlement layer

All weights Apache 2.0. Download, run locally, fine-tune, deploy commercially.

HuggingFace · Chat free · API · Docs


Model tree for zenlm/zen4-1m: fine-tuned from base model zenlm/zen4-max.