Update README.md #7
by m1ngcheng · opened

README.md CHANGED
@@ -11,7 +11,10 @@ library_name: transformers
 <img src="https://mdn.alipayobjects.com/huamei_qa8qxu/afts/img/A*4QxcQrBlTiAAAAAAQXAAAAgAemJ7AQ/original" width="100"/>
 <p>

-<p align="center">🤗 <a href="https://huggingface.co/inclusionAI">Hugging Face</a>&nbsp;&nbsp;|&nbsp;&nbsp;🤖 <a href="https://modelscope.cn/organization/inclusionAI">ModelScope</a></p>
+<p align="center">🤗 <a href="https://huggingface.co/inclusionAI">Hugging Face</a>&nbsp;&nbsp;|&nbsp;&nbsp;🤖 <a href="https://modelscope.cn/organization/inclusionAI">ModelScope</a>
+🚀 <a href="https://zenmux.ai/inclusionai/ring-mini-2.0">Experience Now</a></p>
+
+

 Today, we officially release Ring-mini-2.0, a high-performance inference-oriented MoE model deeply optimized on the Ling 2.0 architecture. With only 16B total parameters and 1.4B activated parameters, it achieves comprehensive reasoning capabilities comparable to dense models below the 10B scale. It excels particularly in logical reasoning, code generation, and mathematical tasks, while supporting 128K long-context processing and 300+ tokens/s high-speed generation.