---
license: cc-by-nc-sa-4.0
task_categories:
  - image-to-3d
tags:
  - 3d-reconstruction
  - 3d-tracking
  - point-cloud
  - dynamic-scenes
  - depth-estimation
  - pose-estimation
---

# POMATO: PointOdyssey Tracking Data

This repository contains the processed PointOdyssey tracking data used in the paper [POMATO: Marrying Pointmap Matching with Temporal Motion for Dynamic 3D Reconstruction](https://arxiv.org/abs/2504.05692).

The POMATO framework unifies pointmap matching with temporal motion for dynamic 3D reconstruction, enabling robust performance across downstream tasks including video depth estimation, 3D point tracking, and pose estimation. This dataset specifically supports the 3D point tracking evaluation described in the paper.

## Code

The official code repository for POMATO can be found at: https://github.com/wyddmw/POMA_eval

## Dataset Usage

This repository provides the processed PointOdyssey data used for tracking evaluation, as described in the Tracking section of the official POMATO GitHub repository.

To download the data:

```bash
huggingface-cli download xiaochui/POMATO_Tracking po_seq.zip --repo-type dataset --local-dir ./data/tracking_eval_data
```

After downloading, unzip the archive inside `data/tracking_eval_data`:

```bash
cd data/tracking_eval_data
unzip po_seq.zip
```
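If you prefer to download from Python, the same file can be fetched with the `huggingface_hub` library. This is a minimal sketch, not part of the official POMATO tooling; the helper name `download_tracking_data` is ours, while the `repo_id`, `filename`, and target directory come from the commands above:

```python
from pathlib import Path
from zipfile import ZipFile


def download_tracking_data(local_dir: str = "./data/tracking_eval_data") -> Path:
    """Download po_seq.zip from the dataset repo and extract it into local_dir."""
    # Imported lazily so the module loads even without huggingface_hub installed
    # (install it with `pip install huggingface_hub`).
    from huggingface_hub import hf_hub_download

    zip_path = hf_hub_download(
        repo_id="xiaochui/POMATO_Tracking",
        repo_type="dataset",  # this repo is a dataset, not a model
        filename="po_seq.zip",
        local_dir=local_dir,
    )
    # Equivalent of `unzip po_seq.zip` inside data/tracking_eval_data.
    with ZipFile(zip_path) as archive:
        archive.extractall(local_dir)
    return Path(local_dir)
```

Calling `download_tracking_data()` leaves the extracted sequences in `data/tracking_eval_data`, the layout the CLI commands above produce.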

For more details on how to use this data for tracking experiments and evaluation, refer to the official POMATO GitHub repository's Tracking section.

## Citation

If you find our POMATO work, including this dataset, useful in your research or applications, please cite it with the following BibTeX:

```bibtex
@article{zhang2025pomato,
  title={POMATO: Marrying Pointmap Matching with Temporal Motion for Dynamic 3D Reconstruction},
  author={Zhang, Songyan and Ge, Yongtao and Tian, Jinyuan and Xu, Guangkai and Chen, Hao and Lv, Chen and Shen, Chunhua},
  journal={arXiv preprint arXiv:2504.05692},
  year={2025}
}
```