fix: Zen MoDE = Mixture of Diverse Experts
README.md (changed):

```diff
@@ -14,22 +14,19 @@ library_name: transformers
 
 # Zen 5
 
-> **Parameters**: TBA | **Architecture**: Zen MoDE | **Context**: 1M | **Status**: In training
+> **Parameters**: TBA | **Architecture**: Zen MoDE (Mixture of Diverse Experts) | **Context**: 1M | **Status**: In training
 
-Zen 5 — next-generation flagship.
+Zen 5 — next-generation flagship. Currently in training.
 
-**Zen MoDE**
+**Zen MoDE** is our next-generation architecture: Mixture of Diverse Experts with sparse activation, extended context, and enhanced multi-step reasoning. First introduced with Zen 5.
 
-[
+[Learn more → zenlm.org](https://zenlm.org)
 
 ---
 ## The Zen LM Family
 
-Joint research
-- **Hanzo AI** (Techstars '17) — AI infrastructure, API gateway, inference optimization
-- **Zoo Labs Foundation** (501c3) — Open AI research, ZIPs governance, decentralized training
-- **Lux Partners Limited** — Compute coordination and settlement layer
+Joint research between **Hanzo AI** (Techstars '17), **Zoo Labs Foundation** (501c3), and **Lux Partners Limited**.
 
 All weights Apache 2.0. Download, run locally, fine-tune, deploy commercially.
 
-[HuggingFace](https://huggingface.co/zenlm) · [Chat
+[HuggingFace](https://huggingface.co/zenlm) · [Chat](https://hanzo.chat) · [API](https://api.hanzo.ai) · [Docs](https://zenlm.org)
```
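As background: "sparse activation" in a Mixture-of-Experts layer means each token is routed to only a few of the available expert networks, so compute per token stays small even as total parameters grow. A minimal NumPy sketch of top-k expert routing follows; it is illustrative only — the actual Zen MoDE implementation and its routing details are unpublished, and all names and shapes here are assumptions.

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, top_k=2):
    """Sparse MoE layer sketch: route each token to its top_k experts.

    Illustrative only -- NOT the actual Zen MoDE implementation.
    x:         (tokens, d_model) input activations
    gate_w:    (d_model, n_experts) router weights
    expert_ws: list of (d_model, d_model) per-expert weight matrices
    """
    logits = x @ gate_w  # (tokens, n_experts) routing logits
    # Softmax over experts to get routing probabilities
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)

    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        top = np.argsort(probs[t])[-top_k:]          # indices of top_k experts
        gates = probs[t, top] / probs[t, top].sum()  # renormalise selected gates
        # Only the selected experts run for this token (sparse activation)
        for g, e in zip(gates, top):
            out[t] += g * (x[t] @ expert_ws[e])
    return out

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                          # 4 tokens, d_model=8
gate_w = rng.normal(size=(8, 4))                     # 4 experts
experts = [rng.normal(size=(8, 8)) for _ in range(4)]
y = moe_forward(x, gate_w, experts)
print(y.shape)  # (4, 8)
```

With `top_k=2` of 4 experts, each token touches only half the expert parameters per forward pass, which is the efficiency argument behind sparse MoE designs generally.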