Improve model card: Update paper link, add project page and relevant tags

#1 opened by nielsr (HF Staff)

Files changed (1): README.md (+12 -6)
README.md CHANGED

@@ -1,25 +1,31 @@
 ---
-license: apache-2.0
 language:
 - zh
 - en
-pipeline_tag: text-generation
 library_name: transformers
+license: apache-2.0
+pipeline_tag: text-generation
+tags:
+- tool-use
+- long-context
 ---
+
 <div align="center">
 <img src="https://github.com/OpenBMB/MiniCPM/blob/main/assets/minicpm_logo.png?raw=true" width="500em" ></img>
 </div>
 
 <p align="center">
 <a href="https://github.com/OpenBMB/MiniCPM/" target="_blank">GitHub Repo</a> |
-<a href="https://github.com/OpenBMB/MiniCPM/tree/main/report/MiniCPM_4_Technical_Report.pdf" target="_blank">Technical Report</a>
+<a href="https://huggingface.co/papers/2506.07900" target="_blank">Paper</a> |
+<a href="https://github.com/OpenBMB/MiniCPM/tree/main/report/MiniCPM_4_Technical_Report.pdf" target="_blank">Technical Report</a> |
+<a href="https://huggingface.co/collections/openbmb/minicpm4-6841ab29d180257e940baa9b" target="_blank">Project Page</a>
 </p>
 <p align="center">
 πŸ‘‹ Join us on <a href="https://discord.gg/3cGQn9b3YM" target="_blank">Discord</a> and <a href="https://github.com/OpenBMB/MiniCPM/blob/main/assets/wechat.jpg" target="_blank">WeChat</a>
 </p>
 
 ## What's New
-- [2025.06.06] **MiniCPM4** series are released! This model achieves ultimate efficiency improvements while maintaining optimal performance at the same scale! It can achieve over 5x generation acceleration on typical end-side chips! You can find the technical report [here](https://github.com/OpenBMB/MiniCPM/tree/main/report/MiniCPM_4_Technical_Report.pdf).πŸ”₯πŸ”₯πŸ”₯
+- [2025.06.06] **MiniCPM4** series are released! This model achieves ultimate efficiency improvements while maintaining optimal performance at the same scale! It can achieve over 5x generation acceleration on typical end-side chips! You can find the technical report [here](https://huggingface.co/papers/2506.07900).πŸ”₯πŸ”₯πŸ”₯
 - [2025.06.09] **MiniCPM4-8B-mlx** and **MiniCPM4-0.5B-mlx** are available and you can run MiniCPM4 on your Apple devices! Thanks to [pzc163](https://huggingface.co/pzc163) for providing this converted model version and related usage instructions.
 
 ## MiniCPM4 Series
@@ -110,7 +116,7 @@ MiniCPM4 is pre-trained on 32K long texts and achieves length extension through
 - This repository and MiniCPM models are released under the [Apache-2.0](https://github.com/OpenBMB/MiniCPM/blob/main/LICENSE) License.
 
 ## Citation
-- Please cite our [paper](https://github.com/OpenBMB/MiniCPM/tree/main/report/MiniCPM_4_Technical_Report.pdf) if you find our work valuable.
+- Please cite our [paper](https://huggingface.co/papers/2506.07900) if you find our work valuable.
 
 ```bibtex
 @article{minicpm4,
@@ -118,4 +124,4 @@
 author={MiniCPM Team},
 year={2025}
 }
-```
+```
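The metadata this PR reorders and extends lives in the README's YAML front matter, which the Hub reads to populate the model page (license badge, pipeline widget, tag filters). As a minimal sketch of that structure — not a general YAML parser, and assuming only the flat key/list shape used in this card — the block can be extracted with the standard library alone:

```python
import re

def parse_front_matter(card: str) -> dict:
    """Tiny parser for the flat `key: value` / `- item` front matter
    used in this model card. A sketch only, not a general YAML parser."""
    m = re.match(r"---\n(.*?)\n---", card, re.DOTALL)
    meta, key = {}, None
    if not m:
        return meta
    for line in m.group(1).splitlines():
        if line.startswith("- ") and key is not None:
            # List item belonging to the most recent bare key (e.g. `tags:`).
            meta[key].append(line[2:].strip())
        elif ":" in line:
            key, _, value = line.partition(":")
            key, value = key.strip(), value.strip()
            # A bare key introduces a list; otherwise store the scalar value.
            meta[key] = value if value else []
    return meta

# Front matter as it stands after this PR.
card = """---
language:
- zh
- en
library_name: transformers
license: apache-2.0
pipeline_tag: text-generation
tags:
- tool-use
- long-context
---
# MiniCPM4
"""

print(parse_front_matter(card)["tags"])  # ['tool-use', 'long-context']
```

Because YAML mappings are unordered, moving `license` and `pipeline_tag` below `library_name` changes nothing semantically; the substantive additions are the two new `tags` entries.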