Add link to paper
#8, opened by nielsr (HF Staff)

README.md CHANGED
@@ -216,6 +216,7 @@ MiniCPM4 is pre-trained on 32K long texts and achieves length extension through

 ## Citation
 - Please cite our [paper](https://github.com/OpenBMB/MiniCPM/tree/main/report/MiniCPM_4_Technical_Report.pdf) if you find our work valuable.
+- See also the [paper page](https://huggingface.co/papers/2506.07900) on Hugging Face.

 ```bibtex
 @article{minicpm4,