---

tags:
- text-generation
- conversational
- coding
- agent
- moe
- large-language-model
license: other
license_name: modified-mit
license_link: https://github.com/MiniMax-AI/MiniMax-M2/blob/main/LICENSE
library_name: transformers
pipeline_tag: text-generation
---


# MiniMax-M2

MiniMax-M2 is a **Mini** model built for **Max** coding & agentic workflows: a compact, fast, and cost-effective MoE model with 230 billion total parameters and 10 billion active parameters, designed for elite performance in coding and agentic tasks.
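
Since the card lists `transformers` as the library and `text-generation` as the pipeline, a minimal loading sketch follows. The repo id `MiniMax-AI/MiniMax-M2` is assumed from the license link above, and flags such as `trust_remote_code` may or may not be required; check the model hub page for the exact id and recommended loading options.

```python
# Minimal sketch: load MiniMax-M2 with transformers and run a single chat turn.
# The repo id and loading flags below are assumptions, not confirmed settings.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MiniMax-AI/MiniMax-M2"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # reduce memory for the large MoE checkpoint
    device_map="auto",            # shard across available GPUs
    trust_remote_code=True,       # custom architectures often require this
)

messages = [
    {"role": "user", "content": "Write a Python function that reverses a string."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```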