Update README.md
README.md CHANGED

@@ -47,6 +47,16 @@ AdaMLLM represents our latest advancement in building domain-specific foundation
 ## Citation
 If you find our work helpful, please cite us.
 
+AdaMLLM
+```bibtex
+@article{adamllm,
+  title={On Domain-Specific Post-Training for Multimodal Large Language Models},
+  author={Cheng, Daixuan and Huang, Shaohan and Zhu, Ziyu and Zhang, Xintong and Zhao, Wayne Xin and Luan, Zhongzhi and Dai, Bo and Zhang, Zhenliang},
+  journal={arXiv preprint arXiv:2411.19930},
+  year={2024}
+}
+```
+
 [AdaptLLM](https://huggingface.co/papers/2309.09530) (ICLR 2024)
 ```bibtex
 @inproceedings{