---
license: apache-2.0
language:
  - en
  - zh
  - ja
  - ko
  - fr
  - de
  - es
  - pt
  - ru
  - ar
tags:
  - zen4
  - zenlm
  - hanzo
  - frontier-ai
  - abliterated
base_model: huihui-ai/Huihui-Qwen3.5-35B-A3B-abliterated
pipeline_tag: text-generation
library_name: transformers
---

# Zen4 Max

Zen4 Max is a 35B-parameter Mixture-of-Experts (3B active per token) language model from the Zen4 family by Zen LM and Hanzo AI.

It is built on abliterated (uncensored) base weights with the Zen4 Frontier architecture for unrestricted, open-ended AI assistance.

## Model Details

| Property | Value |
|---|---|
| Parameters | 35B total (MoE), 3B active |
| Architecture | Zen4 Frontier |
| Context | 262K tokens |
| License | Apache-2.0 |
| Family | Zen4 |
| Tier | Medium |
| Creator | Zen LM / Hanzo AI |
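The 262K-token window is large, but very long prompts can still overflow it. A rough pre-flight check, assuming ~4 characters per token for English text (a common heuristic, not an exact tokenizer count):

```python
CONTEXT_TOKENS = 262_144   # 262K context window
CHARS_PER_TOKEN = 4        # rough heuristic for English text

def fits_in_context(prompt: str, reserve_for_output: int = 4096) -> bool:
    """Estimate whether a prompt leaves room for generation."""
    est_tokens = len(prompt) // CHARS_PER_TOKEN
    return est_tokens + reserve_for_output <= CONTEXT_TOKENS

print(fits_in_context("Hello, who are you?"))  # True
```

For an exact count, tokenize the prompt with the model's tokenizer instead of estimating.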

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "zenlm/zen4-max", torch_dtype="auto", device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("zenlm/zen4-max")

messages = [{"role": "user", "content": "Hello, who are you?"}]
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(text, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.input_ids.shape[-1]:], skip_special_tokens=True))
```
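Note that all 35B parameters must be loaded even though only 3B are active per token: MoE routing reduces compute, not weight storage. A rough back-of-envelope estimate of weight memory (ignoring activations and the KV cache), to help pick a precision:

```python
PARAMS = 35e9  # total parameters, including all experts

for name, bytes_per_param in [("fp16/bf16", 2), ("int8", 1), ("4-bit", 0.5)]:
    gib = PARAMS * bytes_per_param / 2**30
    print(f"{name:10s} ~{gib:.0f} GiB of weights")
```

At bf16 the weights alone are roughly 65 GiB, so multi-GPU sharding or quantization is likely needed on consumer hardware.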

## Zen4 Family

| Model | Parameters | Context | HuggingFace |
|---|---|---|---|
| Zen4 Nano | 0.8B | 262K | zenlm/zen4-nano |
| Zen4 Micro | 2B | 262K | zenlm/zen4-micro |
| Zen4 Mini | 4B | 262K | zenlm/zen4-mini |
| Zen4 | 9B | 262K | zenlm/zen4 |
| Zen4 Pro | 27B | 262K | zenlm/zen4-pro |
| Zen4 Max | 35B MoE (3B active) | 262K | zenlm/zen4-max |
| Zen4 Coder Flash | 31B MoE (3B active) | 131K | zenlm/zen4-coder-flash |
| Zen4 Pro Max | 80B MoE (3B active) | 256K | zenlm/zen4-pro-max |
| Zen4 Coder | 80B MoE (3B active) | 256K | zenlm/zen4-coder |
| Zen4 Mega | 122B MoE (10B active) | 262K | zenlm/zen4-mega |
| Zen4 Thunder | 230B MoE (10B active) | 1M | zenlm/zen4-thunder |
| Zen4 Storm | 456B MoE (45B active) | 1M | zenlm/zen4-storm |
| Zen4 Titan | 744B MoE (40B active) | 128K | zenlm/zen4-titan |
| Zen4 Ultra | 1.04T MoE (32B active) | 256K | zenlm/zen4-ultra |
| Zen4 Ultra Max | 1T MoE (50B active) | 128K | zenlm/zen4-ultra-max |

## Links

*Zen AI: Clarity Through Intelligence*