---
license: mit
language:
- en
- zh
tags:
- zen4
- zenlm
- hanzo
- frontier-ai
base_model: MiniMaxAI/MiniMax-M1-80k
pipeline_tag: text-generation
library_name: transformers
---
# Zen4 Storm

**Zen4 Storm** is a 456B-parameter MoE (45B active) language model from the [Zen4 family](https://zenlm.org) by [Zen LM](https://huggingface.co/zenlm) and [Hanzo AI](https://hanzo.ai). It is a hybrid MoE architecture with Lightning Attention for ultra-long-context reasoning.
## Model Details

| Property | Value |
|----------|-------|
| **Parameters** | 456B MoE total, 45B active |
| **Architecture** | Zen4 Frontier |
| **Context** | 1M tokens |
| **License** | MIT |
| **Family** | Zen4 |
| **Tier** | Frontier |
| **Creator** | Zen LM / Hanzo AI |
## Weights

> Weights are hosted at [MiniMaxAI/MiniMax-M1-80k](https://huggingface.co/MiniMaxAI/MiniMax-M1-80k) due to storage constraints.
> Use the source repository for inference and fine-tuning.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# MiniMax-M1 uses a custom architecture, so pass trust_remote_code=True.
model = AutoModelForCausalLM.from_pretrained(
    "MiniMaxAI/MiniMax-M1-80k",
    torch_dtype="auto",
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(
    "MiniMaxAI/MiniMax-M1-80k",
    trust_remote_code=True,
)
```
## Links

- [Zen LM](https://zenlm.org) | [Hanzo AI](https://hanzo.ai) | [Hanzo Chat](https://hanzo.chat)
- [All Zen Models](https://huggingface.co/zenlm)

---

*Zen AI: Clarity Through Intelligence*