ECG-R1: Protocol-Guided and Modality-Agnostic MLLM for Reliable ECG Interpretation
If you find this project useful, please give us a star ⭐.
Jiarui Jin, Haoyu Wang, Xingliang Wu, Xiaocheng Fang, Xiang Lan, Zihan Wang
Deyun Zhang, Bo Liu, Yingying Zhang, Xian Wu, Hongyan Li, Shenda Hong
Introduction
Electrocardiography (ECG) is an indispensable diagnostic tool in clinical practice, yet existing multimodal large language models (MLLMs) remain unreliable for ECG interpretation, often producing plausible but clinically incorrect analyses. To address this, we propose ECG-R1, the first reasoning MLLM designed for reliable ECG interpretation, built on three innovations. First, we construct the interpretation corpus with Protocol-Guided Instruction Data Generation, grounding interpretation in measurable ECG features and in monograph-defined quantitative thresholds and diagnostic logic. Second, we present a modality-decoupled architecture with Interleaved Modality Dropout, which improves robustness and cross-modal consistency when either the ECG signal or the ECG image is missing. Third, we introduce Reinforcement Learning with ECG Diagnostic Evidence Rewards to strengthen evidence-grounded ECG interpretation. Additionally, we systematically evaluate the ECG interpretation capabilities of proprietary, open-source, and medical MLLMs, and provide the first quantitative evidence that severe hallucinations are widespread, suggesting that their outputs should not be trusted without independent verification. Code and data are publicly available on GitHub and HuggingFace, and an online platform can be accessed at ECG-R1-Online-Platform.
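To illustrate the idea behind Interleaved Modality Dropout, here is a minimal sketch in plain Python: during training, each sample randomly keeps only one of its two modalities (ECG signal or ECG image), so the model learns to interpret ECGs when either input is missing at inference time. The function name, the drop probability, and the `None`-masking scheme are illustrative assumptions, not the paper's exact implementation.

```python
import random

def interleaved_modality_dropout(signal, image, p_drop=0.3, rng=None):
    """Randomly mask one modality per training sample, never both.

    With probability p_drop, keep only one modality (chosen at random);
    otherwise keep both. Returning None stands in for masking that
    modality's input before it reaches the encoder.
    """
    rng = rng or random.Random()
    if rng.random() < p_drop:
        if rng.random() < 0.5:
            return signal, None  # drop the ECG image, keep the signal
        return None, image       # drop the ECG signal, keep the image
    return signal, image         # keep both modalities
```

Because at least one modality always survives, every training sample remains usable while the model is regularly exposed to single-modality inputs.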
Resource
Paper: 📄 arXiv
Github: ✨ Github
Model: 🤗 ECG-R1-8B
Data: 🤗 ECG-Protocol-Guided-Grounding-CoT
Citation
If you find ECG-R1 helpful for your research and applications, please cite our paper:
@misc{jin2026ecgr1,
title={ECG-R1: Protocol-Guided and Modality-Agnostic MLLM for Reliable ECG Interpretation},
author={Jiarui Jin and Haoyu Wang and Xingliang Wu and Xiaocheng Fang and Xiang Lan and Zihan Wang and Deyun Zhang and Bo Liu and Yingying Zhang and Xian Wu and Hongyan Li and Shenda Hong},
year={2026},
eprint={2602.04279},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2602.04279},
}
Acknowledgement
We thank the authors of PULSE, ECG-Chat, GEM, and Swift for their publicly released models, datasets, and training code.
Model tree for PKUDigitalHealth/ECG-R1-8B-SFT
Base model: Qwen/Qwen3-VL-8B-Instruct