---
tags:
- pixel-tracking
- computer-vision
license: mit
library: pytorch
inference: false
---

# PIPs: Particle Video Revisited: Tracking Through Occlusions Using Point Trajectories

* Model Authors: Adam W. Harley, Zhaoyuan Fang, and Katerina Fragkiadaki
* Paper: Particle Video Revisited: Tracking Through Occlusions Using Point Trajectories (ECCV 2022, https://arxiv.org/abs/2204.04153)
* Code Repo: https://github.com/aharley/pips
* Project Homepage: https://particle-video-revisited.github.io

From the paper abstract:

> [...] we revisit Sand and Teller's "particle video" approach, and study pixel tracking as a long-range motion estimation problem, where every pixel is described with a trajectory that locates it in multiple future frames. We re-build this classic approach using components that drive the current state-of-the-art in flow and object tracking, such as dense cost maps, iterative optimization, and learned appearance updates. We train our models using long-range amodal point trajectories mined from existing optical flow data that we synthetically augment with multi-frame occlusions.
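
To make the "dense cost map" ingredient from the abstract concrete, below is a small illustrative sketch, not the authors' code: the appearance feature of a tracked point is correlated against the dense feature map of a later frame, producing a map that scores every location as a candidate for where the point moved. All tensor shapes and names here are assumptions chosen for illustration; see the code repo above for the actual implementation.

```python
import torch
import torch.nn.functional as F

# Illustrative sketch only -- shapes and names are assumptions, not the
# repo's API. PIPs matches each tracked point against dense per-frame
# features and refines the resulting evidence iteratively.
B, C, H, W = 1, 64, 90, 160           # hypothetical feature map at stride 4 for a 360x640 frame
fmap = torch.randn(B, C, H, W)        # dense features of a future frame
point_feat = torch.randn(B, C)        # appearance feature of the tracked point

# Dense cost map: scaled inner product between the point's feature vector
# and the feature at every spatial location of the future frame.
cost = torch.einsum('bc,bchw->bhw', point_feat, fmap) / C ** 0.5  # (B, H, W)

# Normalize into a distribution over locations; the peak marks the most
# likely new position of the point in this frame.
prob = F.softmax(cost.flatten(1), dim=1).view(B, H, W)
peak = torch.nonzero(prob == prob.max())[0, 1:]  # (row, col) of the best match
print(peak)
```

In the full model, such matching evidence is not reduced to a hard argmax per frame; per the abstract, it is combined with iterative optimization and learned appearance updates over multi-frame trajectories, which is what lets the tracker carry points through occlusions.
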
# Citation

```bibtex
@inproceedings{harley2022particle,
  title={Particle Video Revisited: Tracking Through Occlusions Using Point Trajectories},
  author={Adam W Harley and Zhaoyuan Fang and Katerina Fragkiadaki},
  booktitle={ECCV},
  year={2022}
}
```