Zen 3 Max Instruct

Parameters: 235B | Architecture: Zen 3 | Context: 131K | License: Apache 2.0 | Released: 2024-10-15

Part of the Zen 3 generation. Weights are available at zenlm/zen-designer-235b-a22b-instruct.

The Zen 3 family (Q3–Q4 2024) introduced sparse MoE routing and expanded to vision, audio, and multimodal reasoning.

# Load with Transformers (Python)
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "zenlm/zen-designer-235b-a22b-instruct",
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # shard across available GPUs
)
tokenizer = AutoTokenizer.from_pretrained("zenlm/zen-designer-235b-a22b-instruct")

# Or run with Ollama (shell)
ollama run hf.co/zenlm/zen-designer-235b-a22b-instruct
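For multi-turn use, the tokenizer's apply_chat_template handles prompt formatting. As a rough illustration of the ChatML-style layout used by many open instruct models (an assumption here; the authoritative template ships with the tokenizer above), a minimal sketch:

```python
def build_chat_prompt(messages):
    # Hypothetical ChatML-style formatting for illustration only;
    # in practice, call tokenizer.apply_chat_template(messages, ...)
    # so the model's real template is used.
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Leave an open assistant turn for the model to complete.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chat_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the Zen 3 family."},
])
```

The resulting string can then be tokenized and passed to model.generate; the special tokens shown are placeholders until verified against the released tokenizer config.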

The Zen LM Family

Joint research collaboration:

  • Hanzo AI (Techstars '17) — AI infrastructure, API gateway, inference optimization
  • Zoo Labs Foundation (501(c)(3)) — Open AI research, ZIPs governance, decentralized training
  • Lux Partners Limited — Compute coordination and settlement layer

All weights Apache 2.0. Download, run locally, fine-tune, deploy commercially.

HuggingFace · Chat free · API · Docs
