Improve model card: Add metadata, paper, and GitHub links
This PR enhances the model card for Kimi Linear by:
- Adding `pipeline_tag: text-generation` for better discoverability on the Hub.
- Adding `library_name: transformers` to enable the automated "How to use" widget, as the model is compatible with the `transformers` library using `trust_remote_code=True` (a minimal usage sketch is included below).
- Updating the main paper link in the banner and "Tech Report" badge to point directly to the Hugging Face paper page: [Kimi Linear: An Expressive, Efficient Attention Architecture](https://huggingface.co/papers/2510.26692).
- Adding a prominent GitHub badge linking to the official repository: [https://github.com/MoonshotAI/Kimi-Linear](https://github.com/MoonshotAI/Kimi-Linear).
- Updating the BibTeX citation to include a full author list and arXiv identifier, as found in the GitHub repository.
Please review and merge if these improvements align with your expectations.
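
For reference, a minimal usage sketch for the `trust_remote_code=True` path mentioned above. This is only an illustration, not taken from the model card: it assumes the tokenizer ships a chat template, and the prompt, dtype/device choices, and generation settings are placeholder assumptions.

```python
# Minimal sketch: loading Kimi-Linear-48B-A3B-Instruct with transformers.
# Assumes a recent transformers release and enough GPU memory for the weights;
# the prompt and generation parameters below are illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "moonshotai/Kimi-Linear-48B-A3B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,  # required: the repo ships custom modeling code
    torch_dtype="auto",      # use the dtype stored in the checkpoint
    device_map="auto",       # spread the weights across available devices
)

messages = [{"role": "user", "content": "Give a one-sentence summary of linear attention."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```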
@@ -1,13 +1,17 @@
 ---
 license: mit
+pipeline_tag: text-generation
+library_name: transformers
 ---
+
 <div align="center">
-<a href="https://…
+<a href="https://huggingface.co/papers/2510.26692"><img width="80%" src="figures/banner.png"></a>
 </div>
 
 <div align="center">
-<a href="https://…
-<a href="https://huggingface.co/moonshotai/Kimi-Linear-48B-A3B-Instruct"><img src="https://huggingface.co/front/assets/huggingface_logo-noborder.svg" height="16" width="16" style="display: inline-block; vertical-align: middle; margin: 2px;"><b style="display: inline-block;"> HuggingFace</b></a>
+<a href="https://huggingface.co/papers/2510.26692" ><img src="figures/logo.png" height="16" width="16" style="display: inline-block; vertical-align: middle; margin: 2px;"><b style="display: inline-block;"> Tech Report</b></a> |
+<a href="https://huggingface.co/moonshotai/Kimi-Linear-48B-A3B-Instruct"><img src="https://huggingface.co/front/assets/huggingface_logo-noborder.svg" height="16" width="16" style="display: inline-block; vertical-align: middle; margin: 2px;"><b style="display: inline-block;"> HuggingFace</b></a> |
+<a href="https://github.com/MoonshotAI/Kimi-Linear"><img src="https://img.shields.io/badge/Github-Code-blue.svg?logo=github&style=flat-square" height="16" width="16" style="display: inline-block; vertical-align: middle; margin: 2px;"><b style="display: inline-block;"> GitHub</b></a>
 </div>
 
 <div align="center">
@@ -98,10 +102,12 @@ vllm serve moonshotai/Kimi-Linear-48B-A3B-Instruct \
 
 If you found our work useful, please cite
 ```bibtex
-@…
-…
-…
-…
-…
+@misc{team2025kimi,
+  title = {Kimi Linear: An Expressive, Efficient Attention Architecture},
+  author = {Zhang, Yu and Lin, Zongyu and Yao, Xingcheng and Hu, Jiaxi and Meng, Fanqing and Liu, Chengyin and Men, Xin and Yang, Songlin and Li, Zhiyuan and Li, Wentao and Lu, Enzhe and Liu, Weizhou and Chen, Yanru and Xu, Weixin and Yu, Longhui and Wang, Yejie and Fan, Yu and Zhong, Longguang and Yuan, Enming and Zhang, Dehao and Zhang, Yizhi and T. Liu, Y. and Wang, Haiming and Fang, Shengjun and He, Weiran and Liu, Shaowei and Li, Yiwei and Su, Jianlin and Qiu, Jiezhong and Pang, Bo and Yan, Junjie and Jiang, Zhejun and Huang, Weixiao and Yin, Bohong and You, Jiacheng and Wei, Chu and Wang, Zhengtao and Hong, Chao and Chen, Yutian and Chen, Guanduo and Wang, Yucheng and Zheng, Huabin and Wang, Feng and Liu, Yibo and Dong, Mengnan and Zhang, Zheng and Pan, Siyuan and Wu, Wenhao and Wu, Yuhao and Guan, Longyu and Tao, Jiawen and Fu, Guohong and Xu, Xinran and Wang, Yuzhi and Lai, Guokun and Wu, Yuxin and Zhou, Xinyu and Yang, Zhilin and Du, Yulun},
+  year = {2025},
+  eprint = {2510.26692},
+  archivePrefix = {arXiv},
+  primaryClass = {cs.CL}
 }
 ```