Improve model card: Add pipeline tag, correct license, and refine paper link
#1
opened by nielsr (HF Staff)

README.md CHANGED
```diff
@@ -1,14 +1,16 @@
 ---
-license: apache-2.0
 datasets:
 - Vranlee/MFT25
 language:
 - en
+license: mit
+pipeline_tag: object-detection
 ---
+
 # When Trackers Date Fish: A Benchmark and Framework for Underwater Multiple Fish Tracking
 
 The official implementation of the paper:
-> [**When Trackers Date Fish: A Benchmark and Framework for Underwater Multiple Fish Tracking**](https://
+> [**When Trackers Date Fish: A Benchmark and Framework for Underwater Multiple Fish Tracking**](https://huggingface.co/papers/2507.06400)
 > Weiran Li, Yeqiang Liu, Qiannan Guo, Yijie Wei, Hwa Liang Leo, Zhenbo Li*
 > [**\[Project\]**](https://vranlee.github.io/SU-T/) [**\[Paper\]**](https://arxiv.org/abs/2507.06400) [**\[Code\]**](https://github.com/vranlee/SU-T)
 
@@ -28,9 +30,6 @@ The official implementation of the paper:
 + [2025.04] We have released the MFT25 dataset and codes of SU-T!
 -----
 
-## 💡Abstract
-Multiple object tracking (MOT) technology has made significant progress in terrestrial applications, but underwater tracking scenarios remain underexplored despite their importance to marine ecology and aquaculture. We present Multiple Fish Tracking Dataset 2025 (MFT25), the first comprehensive dataset specifically designed for underwater multiple fish tracking, featuring 15 diverse video sequences with 408,578 meticulously annotated bounding boxes across 48,066 frames. Our dataset captures various underwater environments, fish species, and challenging conditions including occlusions, similar appearances, and erratic motion patterns. Additionally, we introduce Scale-aware and Unscented Tracker (SU-T), a specialized tracking framework featuring an Unscented Kalman Filter (UKF) optimized for non-linear fish swimming patterns and a novel Fish-Intersection-over-Union (FishIoU) matching that accounts for the unique morphological characteristics of aquatic species. Extensive experiments demonstrate that our SU-T baseline achieves state-of-the-art performance on MFT25, with 34.1 HOTA and 44.6 IDF1, while revealing fundamental differences between fish tracking and terrestrial object tracking scenarios. MFT25 establishes a robust foundation for advancing research in underwater tracking systems with important applications in marine biology, aquaculture monitoring, and ecological conservation.
-
 ## 🏆Contributions
 
 + We introduce MFT25, the first comprehensive multiple fish tracking dataset featuring 15 diverse video sequences with 408,578 meticulously annotated bounding boxes across 48,066 frames, capturing various underwater environments, fish species, and challenging conditions including occlusions, rapid direction changes, and visually similar appearances.
```
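For background on the abstract's Fish-Intersection-over-Union (FishIoU) matching: the paper's exact formulation is not given on this page, but it extends standard axis-aligned IoU between bounding boxes. A minimal sketch of plain IoU, purely illustrative and not the repository's implementation, might look like:

```python
def iou(box_a, box_b):
    """Axis-aligned IoU between two boxes given as (x1, y1, x2, y2)."""
    # Intersection rectangle corners.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

print(round(iou((0, 0, 2, 2), (1, 1, 3, 3)), 4))  # → 0.1429 (1x1 overlap over union of 7)
```

FishIoU, per the abstract, additionally accounts for the morphological characteristics of aquatic species; how it modifies this cost is described in the paper and code linked above.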