---
title: Echotracker
emoji: πŸƒ
colorFrom: gray
colorTo: green
sdk: gradio
sdk_version: 6.12.0
app_file: app.py
pinned: false
license: mit
short_description: Run EchoTracker instantly on custom or provided videos.
---

# 🫀 EchoTracker

**Advancing Myocardial Point Tracking in Echocardiography**

[![MICCAI 2024](https://img.shields.io/badge/MICCAI-2024-blue)](https://link.springer.com/chapter/10.1007/978-3-031-72083-3_60)
[![ICCV 2025 Workshop](https://img.shields.io/badge/ICCV%20Workshop-2025-blue)](https://openaccess.thecvf.com/content/ICCV2025W/CVAMD/papers/Azad_Taming_Modern_Point_Tracking_for_Speckle_Tracking_Echocardiography_via_Impartial_CVAMD_2025_paper.pdf)
[![arXiv](https://img.shields.io/badge/arXiv-2405.08587-red)](https://arxiv.org/abs/2405.08587)
[![arXiv](https://img.shields.io/badge/arXiv-2507.10127-red)](https://arxiv.org/abs/2507.10127)
[![GitHub](https://img.shields.io/badge/GitHub-riponazad%2Fechotracker-black)](https://github.com/riponazad/echotracker)
[![Project Page](https://img.shields.io/badge/Project-Page-purple)](https://riponazad.github.io/echotracker/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](LICENSE)

EchoTracker is an interactive demo for tracking user-selected points on cardiac tissue across echocardiography video sequences. It was presented at **MICCAI 2024** and demonstrates strong generalisation across cardiac views and scanner types, including out-of-distribution settings not seen during training.

The model weights used in this demo are from the follow-up work **"Taming Modern Point Tracking for Speckle Tracking Echocardiography via Impartial Motion"** (ICCV 2025 Workshop), which further advances robustness and accuracy for speckle tracking in echocardiography.

---

## Demo

Try the live demo on Hugging Face Spaces: [EchoTracker Space](https://huggingface.co/spaces/riponazad/echotracker)

---

## Features

- **Interactive point selection**: click directly on a video frame to place up to 100 tracking points on cardiac structures (e.g. LV/RV walls, myocardium)
- **Frame navigation**: scrub through frames with a slider to choose the optimal query frame (end-diastolic recommended)
- **Multi-view support**: handles A4C (apical 4-chamber), RV (right ventricle), and PSAX (parasternal short-axis) views
- **Out-of-distribution (OOD) generalisation**: tested on scanner types and views not seen during training
- **Faded trajectory visualisation**: the output video overlays colour-coded tracks with fade-trail rendering
- **Built-in examples**: four bundled clips (A4C, A4C OOD, RV OOD, PSAX OOD) for instant testing

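The fade-trail rendering amounts to giving each past position of a track an opacity that decays with its age. A minimal sketch of one such scheme (the helper name `trail_alphas` and the linear ramp are illustrative assumptions, not the app's actual code):

```python
import numpy as np

def trail_alphas(track_len, trail=8):
    """Opacity for each stored position of one track: the newest point is
    fully opaque; older points fade linearly to zero over `trail` frames.

    Illustrative only -- the demo's actual fade curve may differ.
    """
    ages = np.arange(track_len)[::-1]  # age 0 = most recent position
    return np.clip(1.0 - ages / trail, 0.0, 1.0)

# Oldest -> newest opacities for a 5-point trail fading over 4 frames:
alphas = trail_alphas(5, trail=4)  # [0.0, 0.25, 0.5, 0.75, 1.0]
```

Each segment of the overlay is then blended onto the frame with its corresponding alpha, which produces the characteristic comet-tail look.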
---

## How to Use

1. **Load a video**: upload your own echocardiography clip or click one of the provided example thumbnails.
2. **Navigate to the query frame**: use the frame slider to find the desired starting frame. The end-diastolic frame is recommended for best results.
3. **Place tracking points**: click anywhere on the frame image to add a point. Up to **100 points** are supported per run.
4. **Adjust selection**: use **Undo** to remove the last point or **Clear All** to start over.
5. **Run the tracker**: press **▶ Run EchoTracker** to generate trajectories for all selected points.
6. **View output**: the annotated video with colour-coded tracks appears in the output player.

> **Tip:** Points are stored as `(x, y)` pixel coordinates on the original frame and are automatically rescaled to the model's 256 × 256 input resolution.

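That rescaling is a simple per-axis ratio. A minimal sketch (the function name `scale_points` is illustrative, not the app's actual helper):

```python
import numpy as np

MODEL_RES = 256  # the model's square input resolution

def scale_points(points_xy, frame_w, frame_h, model_res=MODEL_RES):
    """Map (x, y) clicks from original-frame pixels to model coordinates."""
    pts = np.asarray(points_xy, dtype=np.float32).copy()
    pts[:, 0] *= model_res / frame_w  # x axis
    pts[:, 1] *= model_res / frame_h  # y axis
    return pts

# A click at the centre of a 640 x 480 frame lands at the centre of 256 x 256:
centre = scale_points([(320, 240)], frame_w=640, frame_h=480)  # [[128.0, 128.0]]
```

The inverse ratio is applied to the predicted trajectories so the overlay is drawn at the original video resolution.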
---

## Running Locally

### Prerequisites

- Python 3.10+
- A CUDA-capable GPU (optional but recommended; CPU inference is supported)

### Installation

```bash
git clone https://github.com/riponazad/echotracker.git
cd echotracker
pip install gradio torch opencv-python-headless numpy Pillow mediapy scikit-image
```

### Launch

```bash
python app.py
```

The Gradio interface will be available at `http://localhost:7860`.

### Model Weights

The pre-trained TorchScript model (`echotracker_cvamd_ts.pt`) must be present in the project root. It is included in this repository/Space and loaded automatically at startup.

---

## Repository Structure

```
echotracker/
├── app.py                    # Gradio application and UI
├── utils.py                  # Point-to-tensor conversion and tracking visualisation
├── echotracker_cvamd_ts.pt   # Pre-trained TorchScript model weights
├── example_samples/          # Bundled example echocardiography clips
│   ├── input1.mp4            # A4C view
│   ├── input2.mp4            # A4C view (OOD)
│   ├── input3_RV.mp4         # RV view (OOD)
│   └── psax_video_crop.mp4   # PSAX view (OOD)
└── outputs/                  # Saved tracking output videos
```

---

## Technical Details

| Property | Value |
|---|---|
| Model format | TorchScript (`.pt`) |
| Input resolution | 256 × 256 (grayscale) |
| Max tracking points | 100 |
| Output video FPS | 25 |
| Supported views | A4C, RV, PSAX |
| Device | CUDA (auto) or CPU |

The tracker receives a batch of grayscale frames of shape `[B, T, 1, H, W]` and a set of query points `[B, N, 3]` (frame index, x, y). It returns per-point trajectories that are denormalised and overlaid on the original-resolution frames.
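As a concrete illustration of those shapes, here is a numpy sketch; the demo itself works with PyTorch tensors, and the commented `torch.jit.load` call mirrors the description above rather than being copied from the app:

```python
import numpy as np

B, T, H, W = 1, 32, 256, 256  # batch, frames, model input resolution
N = 3                         # number of query points

# Grayscale video batch: [B, T, 1, H, W]
video = np.zeros((B, T, 1, H, W), dtype=np.float32)

# Query points: [B, N, 3] = (frame index, x, y), already in 256 x 256 space
queries = np.array([[[0, 128.0, 64.0],
                     [0, 90.5, 200.0],
                     [5, 30.0, 45.0]]], dtype=np.float32)

assert video.shape == (B, T, 1, H, W)
assert queries.shape == (B, N, 3)

# With PyTorch available, the TorchScript weights would be invoked roughly as
# (hypothetical call signature, based on the shapes described above):
#   model = torch.jit.load("echotracker_cvamd_ts.pt")
#   tracks = model(torch.from_numpy(video), torch.from_numpy(queries))
# yielding one (x, y) position per query point per frame.
```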

---

## Citation

If you use EchoTracker or the model weights in this demo, please cite both papers:

```bibtex
@InProceedings{azad2024echotracker,
    author    = {Azad, Md Abulkalam and Chernyshov, Artem and Nyberg, John
                 and Tveten, Ingrid and Lovstakken, Lasse and Dalen, H{\aa}vard
                 and Grenne, Bj{\o}rnar and {\O}stvik, Andreas},
    title     = {EchoTracker: Advancing Myocardial Point Tracking in Echocardiography},
    booktitle = {Medical Image Computing and Computer Assisted Intervention -- MICCAI 2024},
    year      = {2024},
    publisher = {Springer Nature Switzerland},
    doi       = {10.1007/978-3-031-72083-3_60}
}

@InProceedings{Azad_2025_ICCV,
    author    = {Azad, Md Abulkalam and Nyberg, John and Dalen, H{\aa}vard
                 and Grenne, Bj{\o}rnar and Lovstakken, Lasse and {\O}stvik, Andreas},
    title     = {Taming Modern Point Tracking for Speckle Tracking Echocardiography via Impartial Motion},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
    month     = {October},
    year      = {2025},
    pages     = {1115--1124}
}
```

---

## Authors

Md Abulkalam Azad, Artem Chernyshov, John Nyberg, Ingrid Tveten, Lasse Lovstakken, Håvard Dalen, Bjørnar Grenne, Andreas Østvik

---

## License

This project is licensed under the [MIT License](LICENSE).

> **Note:** The bundled example echocardiography clips are provided for demonstration purposes only and should not be downloaded, reproduced, or used outside this demo.

---

## Support This Work

If you find EchoTracker useful, please consider clicking the **Like** button at the top of this Space. It helps others discover this work and supports continued open research in cardiac image analysis.