---
license: apache-2.0
library_name: transformers
pipeline_tag: image-text-to-text
---

This repository contains Awaker2.5-VL, the model presented in [Awaker2.5-VL: Stably Scaling MLLMs with Parameter-Efficient Mixture of Experts](https://huggingface.co/papers/2411.10669).

## Citation

```bibtex
@article{awaker2.5-vl,
    title   = {{Awaker2.5-VL}: Stably Scaling MLLMs with Parameter-Efficient Mixture of Experts},
    author  = {Jinqiang Long and Yanqi Dai and Guoxing Yang and Hongpeng Lin and Nanyi Fei and Yizhao Gao and Zhiwu Lu},
    journal = {arXiv preprint arXiv:2411.10669},
    year    = {2024}
}
```