---
license: apache-2.0
language:
- en
- zh
tags:
- zen
- zenlm
- hanzo-ai
- zen-3
pipeline_tag: text-generation
library_name: transformers
base_model: zenlm/zen-coder
---

# Zen 3 Coder

> **Parameters**: 32B | **Architecture**: Zen 3 Architecture | **Context**: 131K | **License**: Apache 2.0 | **Released**: 2024-11-01

This repository carries the Zen 3 generation label; the underlying weights are hosted at [zenlm/zen-coder](https://huggingface.co/zenlm/zen-coder).

The Zen 3 family (Q3–Q4 2024) introduced sparse MoE routing and expanded to vision, audio, and multimodal reasoning.
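The sparse-routing idea behind MoE layers can be sketched briefly: a router scores every expert for each token, only the top-k experts are executed, and their outputs are mixed with the renormalized router weights. A minimal illustrative sketch (not the actual Zen 3 router; expert count, k, and scoring are placeholder assumptions):

```python
import math

def top_k_route(router_logits, k=2):
    """Select the k highest-scoring experts and renormalize their weights.

    router_logits: one score per expert for a single token.
    Returns {expert_index: mixing_weight}, with weights summing to 1.
    """
    # Softmax over all expert logits (numerically stable form)
    m = max(router_logits)
    exps = [math.exp(x - m) for x in router_logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Keep only the top-k experts and renormalize their probabilities
    top = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    z = sum(probs[i] for i in top)
    return {i: probs[i] / z for i in top}

# Example: four hypothetical experts; only experts 0 and 2 run for this token
weights = top_k_route([2.0, 0.5, 1.5, -1.0], k=2)
```

Because only k experts execute per token, compute per token stays roughly constant as the total expert count (and thus parameter count) grows.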

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# torch_dtype="auto" uses the dtype stored in the checkpoint;
# device_map="auto" places weights across available GPUs/CPU.
model = AutoModelForCausalLM.from_pretrained("zenlm/zen-coder", torch_dtype="auto", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("zenlm/zen-coder")
```

```bash
ollama run hf.co/zenlm/zen-coder
```

---
## The Zen LM Family

A joint research collaboration between:
- **Hanzo AI** (Techstars '17) — AI infrastructure, API gateway, inference optimization
- **Zoo Labs Foundation** (501(c)(3) nonprofit) — open AI research, ZIPs governance, decentralized training
- **Lux Partners Limited** — compute coordination and settlement layer

All weights are released under Apache 2.0: download them, run locally, fine-tune, and deploy commercially.

[HuggingFace](https://huggingface.co/zenlm) · [Chat free](https://hanzo.chat) · [API](https://api.hanzo.ai) · [Docs](https://zenlm.org)