ECG-Mamba: Improved Cardiac Abnormality Classification
A comprehensive comparison of baseline and improved ECG-Mamba architectures for cardiac abnormality classification using 12-lead ECG signals.
Overview
This project implements and compares two ECG-Mamba architectures for cardiac abnormality detection:
- Baseline ECG-Mamba: Simple Vision Mamba architecture with unidirectional processing
- Improved ECG-Mamba: Enhanced architecture with three major improvements
Key Enhancements
The Improved ECG-Mamba model introduces three critical enhancements:
1. Lead-Specific Multi-Branch Architecture
- 4 specialized branches for different lead groups:
- Limb leads: I, II, III
- Augmented limb leads: aVR, aVL, aVF
- Chest anterior leads: V1, V2, V3
- Chest lateral leads: V4, V5, V6
- Lead Fusion Module: Combines the four branch outputs via attention-weighted fusion
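The lead grouping above can be sketched as a simple channel split. This is a minimal illustration (the function and group names are ours, not the repo's); it assumes the standard PTB-XL channel ordering I, II, III, aVR, aVL, aVF, V1–V6.

```python
import torch

# Hypothetical sketch: split a (batch, 1000, 12) ECG tensor into the four
# anatomical lead groups listed above. Channel indices assume the standard
# PTB-XL lead ordering: I, II, III, aVR, aVL, aVF, V1..V6.
LEAD_GROUPS = {
    "limb_standard": [0, 1, 2],    # I, II, III
    "limb_augmented": [3, 4, 5],   # aVR, aVL, aVF
    "chest_anterior": [6, 7, 8],   # V1, V2, V3
    "chest_lateral": [9, 10, 11],  # V4, V5, V6
}

def split_leads(x: torch.Tensor) -> dict:
    """x: (batch, seq_len, 12) -> dict of (batch, seq_len, 3) tensors."""
    return {name: x[:, :, idx] for name, idx in LEAD_GROUPS.items()}

x = torch.randn(2, 1000, 12)
groups = split_leads(x)
print({k: tuple(v.shape) for k, v in groups.items()})
```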
2. Mamba-Transformer Hybrid
- Combines Mamba's long-range dependency modeling with Transformer's attention mechanism
- Captures critical short-term anomalies (e.g., R-peaks) that pure Mamba processing might smooth over
- Multi-head attention with feed-forward network
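The attention component described above can be sketched as a standard Transformer block. This is an illustrative stand-in, not the paper's exact configuration; `d_model`, `n_heads`, and `d_ff` are assumed values.

```python
import torch
import torch.nn as nn

class AttentionBlock(nn.Module):
    """Illustrative Transformer block (multi-head attention + feed-forward).
    Dimensions are assumptions, not the paper's exact configuration."""
    def __init__(self, d_model: int = 128, n_heads: int = 4, d_ff: int = 256):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model)
        )
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):  # x: (batch, seq_len, d_model)
        a, _ = self.attn(x, x, x)
        x = self.norm1(x + a)           # residual + norm around attention
        x = self.norm2(x + self.ff(x))  # residual + norm around feed-forward
        return x

block = AttentionBlock()
out = block(torch.randn(2, 1000, 128))
print(tuple(out.shape))  # (2, 1000, 128)
```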
3. Bi-Directional Scanning
- Processes ECG signals in both forward and backward directions
- Forward Mamba + Backward Mamba fusion
- Mimics how cardiologists analyze context before AND after each beat
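A minimal sketch of the forward + backward fusion idea. The inner sequence module here is a unidirectional LSTM stand-in (mirroring the repo's no-GPU fallback); with mamba-ssm installed, a Mamba block would take its place. Class and layer names are ours.

```python
import torch
import torch.nn as nn

class BiDirectionalScan(nn.Module):
    """Sketch: run the sequence forward and (flipped) backward, then fuse.
    LSTMs stand in for Mamba blocks for CPU-only illustration."""
    def __init__(self, d_model: int = 128):
        super().__init__()
        self.fwd = nn.LSTM(d_model, d_model, batch_first=True)
        self.bwd = nn.LSTM(d_model, d_model, batch_first=True)
        self.fuse = nn.Linear(2 * d_model, d_model)

    def forward(self, x):  # x: (batch, seq_len, d_model)
        f, _ = self.fwd(x)
        b, _ = self.bwd(torch.flip(x, dims=[1]))
        b = torch.flip(b, dims=[1])  # re-align backward output with the time axis
        return self.fuse(torch.cat([f, b], dim=-1))

out = BiDirectionalScan()(torch.randn(2, 1000, 128))
print(tuple(out.shape))  # (2, 1000, 128)
```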
Architecture Comparison
| Feature | Baseline ECG-Mamba | Improved ECG-Mamba |
|---|---|---|
| Processing | Unidirectional | Bi-directional |
| Architecture | Single branch | 4-branch multi-lead |
| Attention | None | Transformer layer |
| Lead Grouping | No | Yes (anatomical) |
| Context | Forward only | Forward + Backward |
Dataset
- Source: PTB-XL Database (PhysioNet)
- Records: 500 12-lead ECG recordings
- Sampling Rate: 100 Hz
- Signal Length: 1000 samples (10 seconds)
- Classes: 5 diagnostic superclasses
- NORM (Normal)
- MI (Myocardial Infarction)
- STTC (ST/T Change)
- CD (Conduction Disturbance)
- HYP (Hypertrophy)
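Once the signals are preprocessed to the shape above, they can be wrapped in a standard PyTorch dataset. This is a minimal sketch assuming arrays already extracted from PTB-XL (the class and variable names are ours, not the notebook's).

```python
import numpy as np
import torch
from torch.utils.data import Dataset

CLASSES = ["NORM", "MI", "STTC", "CD", "HYP"]

class ECGDataset(Dataset):
    """Minimal sketch wrapping preprocessed PTB-XL arrays: signals of shape
    (n_records, 1000, 12) at 100 Hz and integer labels indexing the five
    diagnostic superclasses above."""
    def __init__(self, signals: np.ndarray, labels: np.ndarray):
        self.signals = torch.as_tensor(signals, dtype=torch.float32)
        self.labels = torch.as_tensor(labels, dtype=torch.long)

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, i):
        return self.signals[i], self.labels[i]

ds = ECGDataset(np.zeros((4, 1000, 12)), np.array([0, 1, 2, 3]))
x, y = ds[0]
print(tuple(x.shape), CLASSES[int(y)])  # (1000, 12) NORM
```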
Requirements
Core Dependencies
python >= 3.8
torch >= 2.0.0
numpy >= 1.21.0
pandas >= 1.3.0
scikit-learn >= 1.0.0
matplotlib >= 3.4.0
seaborn >= 0.11.0
wfdb >= 4.0.0
tqdm >= 4.62.0
Mamba-SSM (Optional, requires CUDA)
causal-conv1d >= 1.2.0
mamba-ssm >= 1.0.0
Note: If Mamba-SSM is not available (e.g., no CUDA GPU), the code automatically falls back to an LSTM-based implementation.
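The fallback pattern looks roughly like this: try the mamba-ssm import, and substitute an LSTM layer if it fails. A sketch of the idea, not the repo's exact code (`make_sequence_layer` is a hypothetical helper):

```python
import torch.nn as nn

# Prefer mamba-ssm when it imports cleanly (CUDA build available);
# otherwise use an LSTM stand-in. Note the two return different output
# types (Mamba returns a tensor, LSTM a (output, state) tuple), which
# calling code would need to handle.
try:
    from mamba_ssm import Mamba  # requires a CUDA build of mamba-ssm
    HAS_MAMBA = True
except ImportError:
    HAS_MAMBA = False

def make_sequence_layer(d_model: int) -> nn.Module:
    if HAS_MAMBA:
        return Mamba(d_model=d_model)
    return nn.LSTM(d_model, d_model, batch_first=True)

layer = make_sequence_layer(64)
print(type(layer).__name__)
```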
Installation
- Clone the repository:
git clone https://github.com/YOUR_USERNAME/ecg-mamba-improved.git
cd ecg-mamba-improved
- Install dependencies:
pip install -r requirements.txt
- (Optional) Install Mamba-SSM if you have a CUDA-capable GPU:
pip install "causal-conv1d>=1.2.0"
pip install mamba-ssm
Usage
Running in Google Colab
- Open the notebook in Google Colab
- Set Runtime to GPU (T4) for best performance:
- Runtime → Change runtime type → GPU → T4
- Run all cells sequentially
Running Locally
jupyter notebook ECG_Mamba_Improved_Comparison.ipynb
Notebook Structure
- Cell 1: Install Dependencies
- Cell 2: Download PTB-XL Dataset (500 records)
- Cell 3: Prepare Data Loaders
- Cell 4: Define Baseline ECG-Mamba Model
- Cell 5: Define Improved ECG-Mamba Model
- Cell 6: Training Function
- Cell 7: Train Both Models
- Cell 8: Compare Results
- Cell 9: Visualize Example Predictions
- Cell 10: Summary Table
- Cell 11: Per-Class Accuracy Analysis
- Cell 12: Summary
Model Architecture
Baseline ECG-Mamba
Input (batch, 1000, 12)
↓
Embedding (12 → d_model)
↓
4x Mamba Layers (unidirectional)
↓
Global Average Pooling
↓
LayerNorm + Classifier
↓
Output (batch, n_classes)
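The baseline pipeline diagrammed above can be sketched in a few lines of PyTorch. LSTMs stand in for the unidirectional Mamba layers (the repo's CPU fallback), and the hyperparameters are illustrative.

```python
import torch
import torch.nn as nn

class BaselineECGMamba(nn.Module):
    """Sketch of the baseline pipeline: embed -> 4 sequence layers ->
    global average pooling -> LayerNorm -> classifier. LSTMs stand in
    for Mamba layers; dimensions are assumptions."""
    def __init__(self, d_model: int = 128, n_layers: int = 4, n_classes: int = 5):
        super().__init__()
        self.embed = nn.Linear(12, d_model)  # 12 leads -> d_model
        self.layers = nn.ModuleList(
            [nn.LSTM(d_model, d_model, batch_first=True) for _ in range(n_layers)]
        )
        self.norm = nn.LayerNorm(d_model)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):  # x: (batch, 1000, 12)
        x = self.embed(x)
        for layer in self.layers:
            x, _ = layer(x)
        x = x.mean(dim=1)  # global average pooling over time
        return self.head(self.norm(x))

model = BaselineECGMamba()
print(tuple(model(torch.randn(2, 1000, 12)).shape))  # (2, 5)
```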
Improved ECG-Mamba
Input (batch, 1000, 12)
↓
Split into 4 lead groups (3 leads each)
↓
4x Lead-Specific Branches (each with BiMamba)
├─ Limb Standard (I, II, III)
├─ Limb Augmented (aVR, aVL, aVF)
├─ Chest Anterior (V1, V2, V3)
└─ Chest Lateral (V4, V5, V6)
↓
Lead Fusion Module (attention-weighted)
↓
Shared Bi-Directional Mamba
↓
Transformer Attention Block
↓
Global Average Pooling
↓
LayerNorm + Classifier
↓
Output (batch, n_classes)
Training
Default Hyperparameters
- Batch Size: 32
- Epochs: 15
- Optimizer: AdamW
- Learning Rate: 1e-3
- Loss Function: CrossEntropyLoss
- Train/Test Split: 80/20
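A single training step with the defaults above (AdamW, lr=1e-3, CrossEntropyLoss, batch size 32) looks like this. The model here is a trivial placeholder, not either ECG-Mamba variant.

```python
import torch
import torch.nn as nn

# Minimal training-step sketch using the default hyperparameters above.
# `model` is a placeholder standing in for either architecture.
model = nn.Sequential(nn.Flatten(), nn.Linear(1000 * 12, 5))
opt = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(32, 1000, 12)        # one batch of 12-lead signals
y = torch.randint(0, 5, (32,))       # one label per record (5 superclasses)

opt.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
opt.step()
print(bool(torch.isfinite(loss)))  # True
```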
Training Process
The notebook trains both models sequentially and provides:
- Epoch-by-epoch metrics (loss, train accuracy, test accuracy)
- Comparison plots (accuracy and loss curves)
- Per-class performance metrics
- Confusion matrices
- Visual ECG predictions
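The per-class metrics and confusion matrices can be produced with scikit-learn (already in the requirements). The labels below are made up for illustration; in the notebook they come from the trained models' test-set predictions.

```python
import numpy as np
from sklearn.metrics import classification_report, confusion_matrix

CLASSES = ["NORM", "MI", "STTC", "CD", "HYP"]

# Illustrative per-class report on made-up predictions.
y_true = np.array([0, 1, 2, 3, 4, 0, 1, 2])
y_pred = np.array([0, 1, 2, 3, 4, 0, 2, 2])
print(classification_report(y_true, y_pred, labels=[0, 1, 2, 3, 4],
                            target_names=CLASSES))
print(confusion_matrix(y_true, y_pred, labels=[0, 1, 2, 3, 4]))
```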
Results
The Improved ECG-Mamba model demonstrates significant improvements over the baseline:
- Better feature extraction: Multi-branch architecture captures lead-specific patterns
- Enhanced context modeling: Bi-directional scanning provides complete temporal context
- Critical anomaly detection: Transformer attention highlights important R-peaks and ST segments
- Improved accuracy: Consistent performance gains across all diagnostic classes
Visualizations
The notebook generates:
- Training curves: Loss and accuracy comparison
- Confusion matrices: Per-class prediction analysis
- ECG predictions: Visual comparison of predictions from both models
- Summary tables: Model architecture and performance metrics
Citation
This work is based on:
ECG-Mamba: Cardiac Abnormality Classification With Non-Uniform-Mix Augmentation on 12-Lead ECGs
IEEE Journal of Translational Engineering in Health and Medicine (JTEHM), 2025
References
- PTB-XL Database: Wagner, P., et al. "PTB-XL, a large publicly available electrocardiography dataset." Scientific Data 7.1 (2020): 1-15.
- Mamba: Gu, A., & Dao, T. "Mamba: Linear-Time Sequence Modeling with Selective State Spaces." arXiv preprint arXiv:2312.00752 (2023).
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
- PTB-XL dataset from PhysioNet
- Mamba-SSM implementation by Tri Dao and Albert Gu
- IEEE JTEHM for the original ECG-Mamba paper
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
Contact
For questions or issues, please open an issue on GitHub.
Disclaimer
This project is for research and educational purposes only. It is not intended for clinical diagnosis or medical decision-making. Always consult qualified healthcare professionals for medical advice.