ChartMoE: Mixture of Expert Connector for Advanced Chart Understanding
Paper: arXiv:2409.03277
# ChartMoE
ICLR 2025 Oral

ChartMoE is a multimodal large language model with a Mixture-of-Experts connector, built on InternLM-XComposer2, for advanced chart 1) understanding, 2) replotting, 3) editing, 4) highlighting, and 5) transformation.
To load the ChartMoE model using Transformers, use the following code:
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

ckpt_path = "IDEA-FinAI/chartmoe"
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
# trust_remote_code is required for the custom ChartMoE architecture;
# load the weights in fp16 on GPU and switch to inference mode.
model = AutoModelForCausalLM.from_pretrained(ckpt_path, trust_remote_code=True).half().cuda().eval()
```
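Once loaded, you can query the model about a chart image. The sketch below assumes the InternLM-XComposer2-style `model.chat` interface (with an `<ImageHere>` placeholder in the query) that ChartMoE inherits; the image path and question are illustrative placeholders, so check the official repository for the exact API.

```python
# Minimal inference sketch (assumes the InternLM-XComposer2-style `chat` API;
# the image path and question below are hypothetical examples).
image_path = "./examples/bar_chart.png"
query = "<ImageHere>Redraw this chart in Python using matplotlib."

with torch.no_grad(), torch.cuda.amp.autocast():
    response, history = model.chat(
        tokenizer,
        query=query,
        image=image_path,
        history=[],
        do_sample=False,
    )
print(response)
```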
We provide a simple example and a Gradio WebUI demo showing how to use ChartMoE; please refer to https://github.com/IDEA-FinAI/ChartMoE.
The code is licensed under Apache-2.0.