---
license: apache-2.0
language:
- en
- zh
tags:
- zen
- zenlm
- hanzo-ai
- zen-3
pipeline_tag: text-generation
library_name: transformers
base_model: zenlm/zen-pro
---

# Zen 3

> **Parameters**: 8B | **Architecture**: Zen 3 Architecture | **Context**: 32K | **License**: Apache 2.0 | **Released**: 2024-09-15

This card marks the Zen 3 generation. Weights are hosted at [zenlm/zen-pro](https://huggingface.co/zenlm/zen-pro).

The Zen 3 family (Q3–Q4 2024) introduced sparse MoE routing and expanded to vision, audio, and multimodal reasoning.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("zenlm/zen-pro", torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained("zenlm/zen-pro")
```

```bash
ollama run hf.co/zenlm/zen-pro
```

---

## The Zen LM Family

A joint research collaboration between:

- **Hanzo AI** (Techstars '17) — AI infrastructure, API gateway, inference optimization
- **Zoo Labs Foundation** (501(c)(3)) — open AI research, ZIPs governance, decentralized training
- **Lux Partners Limited** — compute coordination and settlement layer

All weights are Apache 2.0: download them, run locally, fine-tune, and deploy commercially.

[HuggingFace](https://huggingface.co/zenlm) · [Chat free](https://hanzo.chat) · [API](https://api.hanzo.ai) · [Docs](https://zenlm.org)
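
### Customizing the Ollama setup

The `ollama run` one-liner above pulls the model straight from the Hub with default settings. To pin sampling parameters or a system prompt, Ollama supports a local Modelfile; the sketch below uses standard Modelfile directives, but the parameter values and system prompt are illustrative, not defaults shipped with this model.

```
# Modelfile — illustrative values, not official defaults
FROM hf.co/zenlm/zen-pro
# Match the card's 32K context window
PARAMETER num_ctx 32768
PARAMETER temperature 0.7
SYSTEM "You are a helpful assistant."
```

Build and run the customized variant with `ollama create zen-pro-custom -f Modelfile` followed by `ollama run zen-pro-custom`.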