
ECG-Mamba: Improved Cardiac Abnormality Classification

License: MIT · Python 3.8+ · PyTorch

A comprehensive comparison of baseline and improved ECG-Mamba architectures for cardiac abnormality classification using 12-lead ECG signals.

Overview

This project implements and compares two ECG-Mamba architectures for cardiac abnormality detection:

  • Baseline ECG-Mamba: Simple Vision Mamba architecture with unidirectional processing
  • Improved ECG-Mamba: Enhanced architecture with three major improvements

Key Enhancements

The Improved ECG-Mamba model introduces three critical enhancements:

1. Lead-Specific Multi-Branch Architecture

  • 4 specialized branches for different lead groups:
    • Limb leads: I, II, III
    • Augmented limb leads: aVR, aVL, aVF
    • Chest anterior leads: V1, V2, V3
    • Chest lateral leads: V4, V5, V6
  • Lead Fusion Module: Intelligently combines branch outputs with attention-weighted fusion
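The lead grouping and attention-weighted fusion described above can be sketched as follows. This is a minimal illustration, not the notebook's implementation: the channel ordering, `d_model`, and the scoring scheme inside `LeadFusion` are assumptions.

```python
import torch
import torch.nn as nn

# Anatomical lead groups as channel indices in the standard 12-lead order
# (the index assignment below is an assumption matching the list above).
LEAD_GROUPS = {
    "limb":      [0, 1, 2],    # I, II, III
    "augmented": [3, 4, 5],    # aVR, aVL, aVF
    "anterior":  [6, 7, 8],    # V1, V2, V3
    "lateral":   [9, 10, 11],  # V4, V5, V6
}

class LeadFusion(nn.Module):
    """Attention-weighted fusion of per-branch features (illustrative)."""
    def __init__(self, d_model: int):
        super().__init__()
        self.score = nn.Linear(d_model, 1)  # one scalar score per branch

    def forward(self, branch_feats):                  # list of (B, T, D)
        stacked = torch.stack(branch_feats, dim=1)    # (B, 4, T, D)
        # Score each branch from its time-averaged features, then softmax.
        weights = torch.softmax(self.score(stacked.mean(dim=2)), dim=1)  # (B, 4, 1)
        return (stacked * weights.unsqueeze(2)).sum(dim=1)               # (B, T, D)

x = torch.randn(2, 1000, 12)                             # (batch, time, leads)
groups = [x[..., idx] for idx in LEAD_GROUPS.values()]   # four (2, 1000, 3) tensors
proj = nn.Linear(3, 64)                                  # stand-in for a branch encoder
fused = LeadFusion(64)([proj(g) for g in groups])
print(fused.shape)  # torch.Size([2, 1000, 64])
```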

2. Mamba-Transformer Hybrid

  • Combines Mamba's long-range dependency modeling with Transformer's attention mechanism
  • Captures critical short-term anomalies (e.g., R-peaks) that pure Mamba might smooth over
  • Multi-head attention with feed-forward network
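The attention component above can be sketched as a standard pre-norm Transformer layer. This is an illustrative stand-in for the hybrid's attention block; the hyperparameters (`d_model=64`, 4 heads, 4x feed-forward expansion) are assumptions, not the notebook's values.

```python
import torch
import torch.nn as nn

class AttentionBlock(nn.Module):
    """Multi-head self-attention followed by a feed-forward network,
    in a standard pre-norm Transformer layout (a sketch)."""
    def __init__(self, d_model=64, n_heads=4, ff_mult=4):
        super().__init__()
        self.norm1 = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(
            nn.Linear(d_model, ff_mult * d_model),
            nn.GELU(),
            nn.Linear(ff_mult * d_model, d_model),
        )

    def forward(self, x):                    # x: (batch, seq, d_model)
        h = self.norm1(x)
        attn_out, _ = self.attn(h, h, h)     # global attention over the sequence
        x = x + attn_out                     # residual keeps the Mamba features
        return x + self.ff(self.norm2(x))

x = torch.randn(2, 1000, 64)   # e.g. output of the Mamba stack
y = AttentionBlock()(x)
print(y.shape)                 # torch.Size([2, 1000, 64])
```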

3. Bi-Directional Scanning

  • Processes ECG signals in both forward and backward directions
  • Forward Mamba + Backward Mamba fusion
  • Mimics how cardiologists analyze context before AND after each beat
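A minimal sketch of the forward + backward fusion: run a causal sequence model over the signal and over its time reversal, re-align, and fuse. The inner model here is an LSTM stand-in (per the CPU fallback noted under Requirements); with CUDA it would be a Mamba block. The linear fusion is an assumption.

```python
import torch
import torch.nn as nn

class BiDirectionalScan(nn.Module):
    """Forward scan + backward scan over the time axis, fused linearly."""
    def __init__(self, d_model=64):
        super().__init__()
        self.fwd = nn.LSTM(d_model, d_model, batch_first=True)
        self.bwd = nn.LSTM(d_model, d_model, batch_first=True)
        self.fuse = nn.Linear(2 * d_model, d_model)

    def forward(self, x):                            # (batch, seq, d_model)
        f, _ = self.fwd(x)                           # forward scan
        b, _ = self.bwd(torch.flip(x, dims=[1]))     # scan the reversed signal
        b = torch.flip(b, dims=[1])                  # re-align to forward time
        return self.fuse(torch.cat([f, b], dim=-1))  # fuse both directions

out = BiDirectionalScan()(torch.randn(2, 1000, 64))
print(out.shape)  # torch.Size([2, 1000, 64])
```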

Architecture Comparison

Feature         Baseline ECG-Mamba   Improved ECG-Mamba
Processing      Unidirectional       Bi-directional
Architecture    Single branch        4-branch multi-lead
Attention       None                 Transformer layer
Lead Grouping   No                   Yes (anatomical)
Context         Forward only         Forward + Backward

Dataset

  • Source: PTB-XL Database (PhysioNet)
  • Records: 500 12-lead ECG recordings
  • Sampling Rate: 100 Hz
  • Signal Length: 1000 samples (10 seconds)
  • Classes: 5 diagnostic superclasses
    • NORM (Normal)
    • MI (Myocardial Infarction)
    • STTC (ST/T Change)
    • CD (Conduction Disturbance)
    • HYP (Hypertrophy)
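After preprocessing, the data reduce to fixed-shape arrays and integer labels. The sketch below shows the expected shapes and a superclass-to-index mapping; the class ordering is an assumption (the notebook may order the labels differently).

```python
import numpy as np

# The five PTB-XL diagnostic superclasses mapped to integer labels
# (ordering is an illustrative assumption).
SUPERCLASSES = ["NORM", "MI", "STTC", "CD", "HYP"]
CLASS_TO_IDX = {name: i for i, name in enumerate(SUPERCLASSES)}

# Expected shapes after preprocessing: 500 records, 10 s at 100 Hz, 12 leads.
X = np.zeros((500, 1000, 12), dtype=np.float32)   # placeholder signals
y = np.zeros(500, dtype=np.int64)                 # one superclass index per record

print(CLASS_TO_IDX)  # {'NORM': 0, 'MI': 1, 'STTC': 2, 'CD': 3, 'HYP': 4}
```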

Requirements

Core Dependencies

python >= 3.8
torch >= 2.0.0
numpy >= 1.21.0
pandas >= 1.3.0
scikit-learn >= 1.0.0
matplotlib >= 3.4.0
seaborn >= 0.11.0
wfdb >= 4.0.0
tqdm >= 4.62.0

Mamba-SSM (Optional, requires CUDA)

causal-conv1d >= 1.2.0
mamba-ssm >= 1.0.0

Note: If Mamba-SSM is not available (e.g., no CUDA GPU), the code automatically falls back to an LSTM-based implementation.
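A fallback like this is typically implemented with a guarded import. The sketch below shows one way to do it; the helper name `make_sequence_layer` and the exact fallback wrapper are illustrative, not the notebook's code.

```python
import torch.nn as nn

try:
    from mamba_ssm import Mamba            # requires the CUDA build
    HAS_MAMBA = True
except ImportError:
    HAS_MAMBA = False

def make_sequence_layer(d_model: int) -> nn.Module:
    """Return a Mamba block when available, otherwise an LSTM stand-in."""
    if HAS_MAMBA:
        return Mamba(d_model=d_model)

    class LSTMFallback(nn.Module):
        """CPU fallback: an LSTM wrapped to return only the sequence output."""
        def __init__(self):
            super().__init__()
            self.lstm = nn.LSTM(d_model, d_model, batch_first=True)

        def forward(self, x):
            out, _ = self.lstm(x)
            return out

    return LSTMFallback()
```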

Installation

  1. Clone the repository:
git clone https://github.com/YOUR_USERNAME/ecg-mamba-improved.git
cd ecg-mamba-improved
  2. Install dependencies:
pip install -r requirements.txt
  3. (Optional) Install Mamba-SSM if you have a CUDA GPU:
pip install "causal-conv1d>=1.2.0"
pip install mamba-ssm

Usage

Running in Google Colab

  1. Open the notebook in Google Colab
  2. Set Runtime to GPU (T4) for best performance:
    • Runtime → Change runtime type → GPU → T4
  3. Run all cells sequentially

Running Locally

jupyter notebook ECG_Mamba_Improved_Comparison.ipynb

Notebook Structure

  1. Cell 1: Install Dependencies
  2. Cell 2: Download PTB-XL Dataset (500 records)
  3. Cell 3: Prepare Data Loaders
  4. Cell 4: Define Baseline ECG-Mamba Model
  5. Cell 5: Define Improved ECG-Mamba Model
  6. Cell 6: Training Function
  7. Cell 7: Train Both Models
  8. Cell 8: Compare Results
  9. Cell 9: Visualize Example Predictions
  10. Cell 10: Summary Table
  11. Cell 11: Per-Class Accuracy Analysis
  12. Cell 12: Summary

Model Architecture

Baseline ECG-Mamba

Input (batch, 1000, 12)
    ↓
Embedding (12 → d_model)
    ↓
4x Mamba Layers (unidirectional)
    ↓
Global Average Pooling
    ↓
LayerNorm + Classifier
    ↓
Output (batch, n_classes)
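The baseline diagram above maps to a short PyTorch module. This is a runnable sketch, not the notebook's exact code: the Mamba layers are replaced by LSTMs so it runs on CPU (see the fallback note under Requirements), and the residual connection around each layer is an assumption.

```python
import torch
import torch.nn as nn

class BaselineECGMamba(nn.Module):
    """Baseline pipeline: embed -> 4 sequence layers -> pool -> classify."""
    def __init__(self, d_model=64, n_layers=4, n_classes=5):
        super().__init__()
        self.embed = nn.Linear(12, d_model)          # 12 leads -> d_model
        self.layers = nn.ModuleList(
            nn.LSTM(d_model, d_model, batch_first=True) for _ in range(n_layers)
        )
        self.norm = nn.LayerNorm(d_model)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x):                            # (batch, 1000, 12)
        h = self.embed(x)
        for layer in self.layers:
            out, _ = layer(h)
            h = h + out                              # residual around each layer
        h = h.mean(dim=1)                            # global average pooling over time
        return self.classifier(self.norm(h))         # (batch, n_classes)

logits = BaselineECGMamba()(torch.randn(2, 1000, 12))
print(logits.shape)  # torch.Size([2, 5])
```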

Improved ECG-Mamba

Input (batch, 1000, 12)
    ↓
Split into 4 lead groups (3 leads each)
    ↓
4x Lead-Specific Branches (each with BiMamba)
    ├─ Limb Standard (I, II, III)
    ├─ Limb Augmented (aVR, aVL, aVF)
    ├─ Chest Anterior (V1, V2, V3)
    └─ Chest Lateral (V4, V5, V6)
    ↓
Lead Fusion Module (attention-weighted)
    ↓
Shared Bi-Directional Mamba
    ↓
Transformer Attention Block
    ↓
Global Average Pooling
    ↓
LayerNorm + Classifier
    ↓
Output (batch, n_classes)

Training

Default Hyperparameters

  • Batch Size: 32
  • Epochs: 15
  • Optimizer: AdamW
  • Learning Rate: 1e-3
  • Loss Function: CrossEntropyLoss
  • Train/Test Split: 80/20
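The hyperparameters above translate directly into a training setup. The sketch below uses synthetic data with the dataset's shapes and a placeholder linear model, so only the optimizer, loss, batch size, and split mirror the defaults; everything else is illustrative.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Synthetic stand-in data with the dataset's shapes (500 records, 80/20 split).
X = torch.randn(500, 1000, 12)
y = torch.randint(0, 5, (500,))
n_train = int(0.8 * len(X))                                   # 80/20 split
train_loader = DataLoader(TensorDataset(X[:n_train], y[:n_train]),
                          batch_size=32, shuffle=True)

model = nn.Sequential(nn.Flatten(), nn.Linear(1000 * 12, 5))  # placeholder model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)    # defaults listed above
criterion = nn.CrossEntropyLoss()

for xb, yb in train_loader:          # one epoch shown; the notebook runs 15
    optimizer.zero_grad()
    loss = criterion(model(xb), yb)
    loss.backward()
    optimizer.step()
print(f"final batch loss: {loss.item():.3f}")
```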

Training Process

The notebook trains both models sequentially and provides:

  • Epoch-by-epoch metrics (loss, train accuracy, test accuracy)
  • Comparison plots (accuracy and loss curves)
  • Per-class performance metrics
  • Confusion matrices
  • Visual ECG predictions

Results

The Improved ECG-Mamba model demonstrates significant improvements over the baseline:

  • Better feature extraction: Multi-branch architecture captures lead-specific patterns
  • Enhanced context modeling: Bi-directional scanning provides complete temporal context
  • Critical anomaly detection: Transformer attention highlights important R-peaks and ST segments
  • Improved accuracy: Consistent performance gains across all diagnostic classes

Visualizations

The notebook generates:

  • Training curves: Loss and accuracy comparison
  • Confusion matrices: Per-class prediction analysis
  • ECG predictions: Visual comparison of predictions from both models
  • Summary tables: Model architecture and performance metrics

Citation

This work is based on:

ECG-Mamba: Cardiac Abnormality Classification With Non-Uniform-Mix Augmentation on 12-Lead ECGs
IEEE Journal of Translational Engineering in Health and Medicine (JTEHM), 2025

References

  • PTB-XL Database: Wagner, P., et al. "PTB-XL, a large publicly available electrocardiography dataset." Scientific Data 7.1 (2020): 1-15.
  • Mamba: Gu, A., & Dao, T. "Mamba: Linear-Time Sequence Modeling with Selective State Spaces." arXiv preprint arXiv:2312.00752 (2023).

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • PTB-XL dataset from PhysioNet
  • Mamba-SSM implementation by Tri Dao and Albert Gu
  • IEEE JTEHM for the original ECG-Mamba paper

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Contact

For questions or issues, please open an issue on GitHub.

Disclaimer

This project is for research and educational purposes only. It is not intended for clinical diagnosis or medical decision-making. Always consult qualified healthcare professionals for medical advice.
