---
tags:
- text-generation
- conversational
- coding
- agent
- moe
- large-language-model
license: other
license_name: modified-mit
license_link: https://github.com/MiniMax-AI/MiniMax-M2/blob/main/LICENSE
library_name: transformers
pipeline_tag: text-generation
---
# MiniMax-M2
MiniMax-M2 is a **Mini** model built for **Max** coding & agentic workflows. It's a compact, fast, and cost-effective MoE model (230 billion total parameters, 10 billion active per token) that delivers elite performance on coding and agentic tasks.
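A quick back-of-the-envelope sketch of why the MoE design keeps inference cost low: with 230B total parameters but only 10B active per token, each forward pass touches only a small fraction of the weights. (This snippet is illustrative arithmetic, not code from the model repository.)

```python
# Illustrative only: compute the fraction of MiniMax-M2's weights
# that are active on each forward pass, per the figures above.
total_params = 230e9   # total parameters across all experts
active_params = 10e9   # parameters activated per token

active_fraction = active_params / total_params
print(f"Active fraction per token: {active_fraction:.1%}")  # ~4.3%
```

In other words, per-token compute scales with the 10B active parameters, while total capacity scales with the full 230B.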