---
license: mit
language:
- en
- zh
tags:
- zen4
- zenlm
- hanzo
- frontier-ai
base_model: MiniMaxAI/MiniMax-M1-80k
pipeline_tag: text-generation
library_name: transformers
---
# Zen4 Storm
**Zen4 Storm** is a 456B MoE (45B active) parameter language model from the [Zen4 family](https://zenlm.org) by [Zen LM](https://huggingface.co/zenlm) and [Hanzo AI](https://hanzo.ai).
It uses a hybrid MoE architecture with Lightning Attention, designed for ultra-long-context reasoning.
## Model Details
| Property | Value |
|----------|-------|
| **Parameters** | 456B MoE total, 45B active |
| **Architecture** | Zen4 Frontier |
| **Context** | 1M tokens |
| **License** | MIT |
| **Family** | Zen4 |
| **Tier** | Frontier |
| **Creator** | Zen LM / Hanzo AI |
## Weights
> Weights hosted at [MiniMaxAI/MiniMax-M1-80k](https://huggingface.co/MiniMaxAI/MiniMax-M1-80k) due to storage constraints.
> Use the source repository for inference and fine-tuning.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# trust_remote_code is likely required for the custom MiniMax architecture;
# device_map="auto" shards the 456B weights across available accelerators
model = AutoModelForCausalLM.from_pretrained("MiniMaxAI/MiniMax-M1-80k", torch_dtype="auto", device_map="auto", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("MiniMaxAI/MiniMax-M1-80k", trust_remote_code=True)
```
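Continuing from the snippet above, a minimal chat-style generation sketch (the prompt is illustrative, and this assumes the checkpoint ships a tokenizer chat template; running it requires the full weights):

```python
# Sketch only: requires `model` and `tokenizer` loaded as above,
# plus enough GPU memory for the full MiniMax-M1-80k checkpoint.
messages = [{"role": "user", "content": "Summarize Lightning Attention in one sentence."}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```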
## Links
- [Zen LM](https://zenlm.org) | [Hanzo AI](https://hanzo.ai) | [Hanzo Chat](https://hanzo.chat)
- [All Zen Models](https://huggingface.co/zenlm)
---
*Zen AI: Clarity Through Intelligence*