---
license: other
pipeline_tag: visual-question-answering
---
<p align="center">
<b><font size="6">ChartMoE</font></b>
</p>

<div align="center">

[Project Page](https://chartmoe.github.io/)
[Github Repo](https://github.com/IDEA-FinAI/ChartMoE)

</div>

**ChartMoE** is a multimodal large language model with a Mixture-of-Experts (MoE) connector, built on [InternLM-XComposer2](https://github.com/InternLM/InternLM-XComposer/tree/main/InternLM-XComposer-2.0), for advanced chart understanding, translation, and editing.
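
As background, the sketch below illustrates what a Mixture-of-Experts connector looks like in general: a router assigns each visual token to a few expert MLP projectors and mixes their outputs before they are fed to the LLM. This is a conceptual illustration only; the number of experts, the top-k value, and the dimensions are placeholders, not the actual ChartMoE implementation (see the paper and GitHub repo for details).

```python
# Conceptual sketch of an MoE connector between a vision encoder and an LLM.
# All hyperparameters below are illustrative placeholders.
import torch
import torch.nn as nn


class MoEConnector(nn.Module):
    def __init__(self, vision_dim=1024, llm_dim=4096, num_experts=4, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router: assigns each visual token a distribution over experts.
        self.router = nn.Linear(vision_dim, num_experts)
        # Experts: independent MLP projectors into the LLM embedding space.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(vision_dim, llm_dim), nn.GELU(), nn.Linear(llm_dim, llm_dim))
            for _ in range(num_experts)
        )

    def forward(self, visual_tokens):                                 # (B, S, vision_dim)
        gates = self.router(visual_tokens).softmax(dim=-1)            # (B, S, E)
        weights, indices = gates.topk(self.top_k, dim=-1)             # keep top-k experts per token
        weights = weights / weights.sum(dim=-1, keepdim=True)         # renormalize kept weights
        expert_out = torch.stack([e(visual_tokens) for e in self.experts], dim=-2)  # (B, S, E, llm_dim)
        selected = torch.gather(
            expert_out, -2,
            indices.unsqueeze(-1).expand(*indices.shape, expert_out.size(-1)),
        )                                                              # (B, S, k, llm_dim)
        return (weights.unsqueeze(-1) * selected).sum(dim=-2)          # (B, S, llm_dim)


# Example: project 256 visual tokens of dim 1024 into the LLM space.
connector = MoEConnector()
out = connector(torch.randn(1, 256, 1024))  # -> (1, 256, 4096)
```
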
## Import from Transformers
To load the ChartMoE model with Transformers, use the following code:
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

ckpt_path = "IDEA-FinAI/chartmoe"
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
# Load in half precision on GPU for inference.
model = AutoModelForCausalLM.from_pretrained(ckpt_path, trust_remote_code=True).half().cuda().eval()
```
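
Once loaded, you can query the model about a chart image. The snippet below is a minimal sketch assuming the checkpoint's remote code exposes InternLM-XComposer2's `chat` interface (a query containing an `<ImageHere>` placeholder plus an image path); the image path and prompt are placeholders, and the GitHub repo shows the recommended usage.

```python
# Minimal chart-QA sketch; assumes the trust_remote_code model exposes
# InternLM-XComposer2's `chat` interface. Replace the image path and prompt.
image_path = "./example_chart.png"
query = "<ImageHere>Convert this chart to a markdown table."
with torch.cuda.amp.autocast():
    response, _ = model.chat(tokenizer, query=query, image=image_path, history=[], do_sample=False)
print(response)
```
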
## Quickstart & Gradio Demo
We provide a simple example and a Gradio web UI demo showing how to use ChartMoE. Please refer to [https://github.com/IDEA-FinAI/ChartMoE](https://github.com/IDEA-FinAI/ChartMoE).
## Open Source License
The code is licensed under Apache-2.0.