# Zen4 Ultra Max
Zen4 Ultra Max is a 1-trillion-parameter Mixture-of-Experts (MoE) language model, with 50B parameters active per token, from the Zen4 family by Zen LM and Hanzo AI. Trained in FP8 at trillion-parameter scale, it is the largest open frontier model in the family.
## Model Details
| Property | Value |
|---|---|
| Parameters | 1T MoE total, 50B active |
| Architecture | Frontier MoE |
| Context | 128K tokens |
| License | MIT |
| Family | Zen4 |
| Tier | Frontier |
| Creator | Zen LM / Hanzo AI |
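The "50B active" figure in the table reflects sparse MoE routing: a learned router scores the experts for each token and only the top-k run, so the active parameter count is a small fraction of the total. A minimal, illustrative top-k routing sketch (this is not Zen4 Ultra Max's actual router; the expert count and `k` here are arbitrary):

```python
import numpy as np

def route(scores: np.ndarray, k: int = 2):
    """Pick the top-k experts for one token and softmax-normalize their scores."""
    top = np.argsort(scores)[-k:]                 # indices of the k highest-scoring experts
    w = np.exp(scores[top] - scores[top].max())   # numerically stable softmax
    return top, w / w.sum()

# One token's router scores over 4 hypothetical experts
scores = np.array([0.1, 2.0, -0.5, 1.2])
experts, weights = route(scores, k=2)
# Only 2 of the 4 experts are "active" for this token; their outputs
# would be combined using the normalized weights.
```

At 1T total / 50B active, roughly 5% of the parameters participate in any single forward pass, which is what keeps inference cost far below that of a dense 1T model.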
## Weights
The weights are hosted at inclusionAI/Ling-1T due to storage constraints; use that source repository for inference and fine-tuning.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load from the source repository; torch_dtype="auto" keeps the checkpoint's native dtype
model = AutoModelForCausalLM.from_pretrained(
    "inclusionAI/Ling-1T",
    torch_dtype="auto",
    device_map="auto",  # shard across available GPUs (a 1T model will not fit on one device)
)
tokenizer = AutoTokenizer.from_pretrained("inclusionAI/Ling-1T")
```
## Links
*Zen AI: Clarity Through Intelligence*
### Model tree for zenlm/zen4-ultra-max

- Base model: [inclusionAI/Ling-1T](https://huggingface.co/inclusionAI/Ling-1T)