---
license: apache-2.0
library_name: transformers
pipeline_tag: image-text-to-text
---

This repository contains the model presented in [Awaker2.5-VL: Stably Scaling MLLMs with Parameter-Efficient Mixture of Experts](https://huggingface.co/papers/2411.10669).

## Citation

```bibtex
@article{awaker2.5-vl,
  title   = {{Awaker2.5-VL}: Stably Scaling MLLMs with Parameter-Efficient Mixture of Experts},
  author  = {Jinqiang Long and Yanqi Dai and Guoxing Yang and Hongpeng Lin and Nanyi Fei and Yizhao Gao and Zhiwu Lu},
  journal = {arXiv preprint arXiv:2411.10669},
  year    = {2024}
}
```